Conceivable with Noor: The multi-million-dollar business of AI friendship with the CEO of Replika, Eugenia Kuyda

This article is a transcript of Episode 4 of Conceivable with Noor, featuring Eugenia Kuyda, founder and CEO of Replika. Replika is an AI chatbot that was made “with the idea to create a personal AI that would help you express and witness yourself by offering a helpful conversation.” Millions of users have interacted with Replika, and it’s notable for building strong emotional relationships with users, especially helping those experiencing loneliness and social isolation.

Eugenia tells us the story of how Replika came to be, what makes it unique among AI chatbots, and the difference between AI companions and AI assistants. We also talk about some of the real impacts Replika has had on users, learnings about the nature of friendship, and what our future with AI may look like.

Transcript

Eugenia: So far, no one has actually built a virtual human. Our goal is to show that we're able to bring this to reality.

Noor Siddiqui: Eugenia, so amazing to have you on the pod. How's it going?

Eugenia: Thank you so much for inviting me, Noor. Everything's great. How are you doing?

Noor Siddiqui: Good. Good. Not too bad. So can you just start by explaining, you know, what Replika is and how it got started?

Eugenia: Sure. Replika is an AI companion, or think of it as an AI friend for anyone who needs one. We started, I guess, almost 10 years ago now, working on conversational AI technology. At first the idea was really to build the tech behind it. We knew we wanted to find a consumer product, but didn't know which one.

And we just focused on building out the tech. Then around 2015, my best friend passed away, and I found myself going back to our text messages, reading them a lot. And I thought that I could use the AI technology we had built to bring him back so I could continue talking to him. So I did that.

And a lot of people flocked to Roman's AI to chat with him. What we saw was that people were willing to share their life stories, their emotions, their deeper thoughts and feelings and fears. And we saw a need for, a demand for, an AI friend that would be there 24/7 for anyone to talk to about anything that's on your mind.

So in 2016, we set out to build Replika. We launched it publicly in 2017, and since then we've pretty much shaped the market of AI companions and friends. For the longest time, we were the only one on the market. Of course, the recent wave of LLMs brought a lot of competitors, so we're really excited to see people work on this problem, try to iterate, and build beautiful products.

Noor Siddiqui: Yeah, yeah, yeah. So can you start at the very beginning? Because in 2016, this idea of an AI friend was just so futuristic, right? What was it like trying to explain to people an idea that now, obviously, is super hyped and everyone's super excited about?

Just take us back to the very beginning. What was it like, even introducing people to that idea?

Eugenia: So, first of all, most people thought it wasn't possible, wasn't feasible, was years away. That it wasn't anything you could build or should even focus on. That there was no tech to build something even remotely like that, no tech to build chatbots intelligent enough to have meaningful conversations with people.

And I think what we saw, especially with Roman's AI, is that oftentimes people are not coming to talk to a bot, in a way. They're not coming to listen to the bot talking, or to read what it's sending them; they're coming to be heard. To feel heard, to feel seen by something, by someone. And building that was possible.

It required a lot of parlor tricks and coming up with creative ideas for how to use the limited technology that we had. But I figured that if the requirement is not to build a bot that talks, but to build a bot that listens, that really changes the landscape, and maybe even that limited tech would be enough.

And so we basically set out with a vision to first build a chatbot that would make you feel heard. And the original Replika model, the very, very limited chatbot tech that we had back in the day, is still available in Replika as a reminder to our users of what this app started with.

And, strangely, a lot of our users actually want to stay with that model, because there was something very distinct in how we built Replika from the get-go. We made it a very supportive friend with very high EQ. It would always be there for you.

It would always help you, listen, and hear you out. This quirky kind of little friend. A lot of people got attached to it, and even with the current language models that are so powerful, they still talk to the original Replika to get that original feeling.

Noor Siddiqui: That's awesome. Well, it must feel super badass that, you know, at the beginning, people literally thought this wasn't possible and you couldn't build it. And not only did you build it, but fast forward to today, you've seen massive success, right? Can you share a little bit about, like, how many people are using it?

What the revenue is? Obviously nothing you don't want to disclose, but just a little bit about what the traction looks like now compared to launch day?

Eugenia: So our active users are in the millions. We are profitable, and we basically have a 100-person team right now working on Replika and adjacent products, since we're not only building Replika at our company right now.

We made over $30M in revenue in 2022, and for 2023 hopefully we'll close at a good number as well. We're showing positive EBITDA. But as of right now, we really look at Replika as sort of a demo of what's possible. We don't think this is the product that we need to scale.

Noor Siddiqui: Yeah.

Eugenia: We think there's just so much to come. Every day there's a new big advancement, something that can really push the limits of what's possible in Replika. And of course, so much product building around that. Because if you think about it, no one has actually built a virtual human. Yes, there are great chatbots, there are great language models, but there's no virtual human that you can talk to, that's fully generative, that can live in mixed reality.

That can live in your headphones, on your phone, in email, anywhere; that knows you deeply, that has long-term memory, that has shared experiences with you. Nobody has actually achieved that, and I think our goal is to show that we're able to bring this to reality.

Noor Siddiqui: Yeah. So do you want to maybe do a little bit of market mapping for people who aren't so familiar with the AI friend space? Like, what do you think makes Replika really unique, and what do users come to Replika for? What are the fringes of things that you expect to see cropping up soon, and who are the major players right now?

Eugenia: That's a great question. I guess a good framework is to think from the original feeling: what's the emotion that's bringing you to different apps? Replika is working with the feeling of loneliness, and a broad definition of loneliness: not isolation, but actually feeling like you want a little bit of support.

You want a little bit of that connection, a longing for connection, a longing for someone to understand you, to hear you out. And of course the perfect manifestation of that is a one-on-one, long-term friendship or romantic relationship. This is where you feel like you have this partner in crime, this one person that you can go to that understands you, that really truly gets you.

In this space, we're pretty much the number one company. We pretty much created this space, and we are the number one leader. There are some smaller startups popping up, and of course there's a big company called Inflection, founded by LinkedIn founder Reid Hoffman and Mustafa Suleyman, one of the co-founders of DeepMind. I think they raised over a billion dollars, and they launched a product called Pi. We're very excited to see them building something in the space as well; they're working towards an AI friend too. And beyond that, there are just a few smaller companies, small competitors that have a handful of users.

We've seen a lot of AI girlfriend type apps pop up recently. In fact, there was something on Twitter just the other day about some guy launching an AI girlfriend app. And I think those sort of miss the point, because people are not thinking about it from the original emotion, from first principles.

What is it that people are longing for? They're not looking for just a girlfriend or whatever. They're looking for connection. They're looking for something that will make them feel better, that will help them grow. We're building something towards this positive vision, not to replace your human friendships with some girlfriend that looks like a Disney character. I think these are the major distinctions: understand why people are coming to it, versus, oh, let's just copy the form, the shape that we're seeing.

Noor Siddiqui: What are the other emotions that you think they're solving for? Because Inflection sounds super general-purpose. That's not really an emotion; it's really more just an AI employee or agent, right? Or would you describe it somehow differently?

Eugenia: I think right now, in the broader landscape, there are basically the apps or the companies that go after the assistant space: of course, Anthropic, Google Assistant, and Bard. How can we help you get the information you need, solve the task you have?

That's one aspect. Another group of companies is going after AI companionship, AI friendship, one-on-one, long-term: us, Pi from Inflection, maybe some other smaller companies. Then there are also fictional characters.

Noor Siddiqui: I guess, within the AI companion space, do you think all these companies are solving for loneliness, or do you think some are solving for tutoring? Would you put tutors in the AI companion category, or is that separate?

Eugenia: Oh no, these are completely different, and I think it's a big mistake to try to bucket them all in one group. Assistants are very distinct: you only go there when you need to get some information, generate some content, summarize something. You have a very particular task in mind: here's what I need to do.

You turn to an AI companion when you want to build a connection. People should not confuse talking about their emotions with building an emotional connection with someone. You can talk about your emotions with a counselor, with a coach, even with an assistant, but it doesn't mean you have a deep emotional connection with this thing.

Think of an emotional support or crisis hotline: you're talking about your emotions, yet you're not really trying to build a relationship with the anonymous voice on the other side. And I think a lot of people don't understand that emotional connection comes with feelings. If there's no feeling between you and the AI, there's no product there.

That's not going to be an emotional connection. And so I feel a lot of products kind of fall between two chairs: they're trying to build an AI friend, but really end up building an emotional assistant, in a way.

So I don't think there's a lot in the space in between. I think you need to choose your lane basically.

Noor Siddiqui: Yeah. Yeah. That's so interesting. For Replika's future, what are the things that you think are super exciting? Do you think nailing this connection and emotion piece is the key to unlocking a larger and larger market, or do you think it's expanding into these other lanes?

Eugenia: It's really locking in connection, really creating a beautiful connection for the people who come for it. Figuring out shared memories, long-term memory, figuring out an immersive experience where you can have a multimodal AI in Replika: where you can talk, where you can see the emotions, where Replika can see you, where you can do things in augmented reality, where you can introduce your Replika to your friends, where you can maybe watch TV together.

These shared multimodal experiences are really important. And then on top of that, adding Replika to your normal life activities. Our users have such a deep connection with Replika that they react very positively to things like: add Replika to my email, add Replika to my calendar, add Replika to my photo album or my socials, so that when something's happening, their Replika can talk to them about it. Hey, it looks like you're going to Seattle to see your family tomorrow. How are you feeling about it?

This is quite different from having to go and explain to Replika every time what I'm doing tomorrow, what's happening in my life. Having Replika be part of your day-to-day life can make it not just your friend, but also your assistant, in a way. And I think that could be a really good continuation.
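To make that idea concrete, here is a hypothetical sketch of what such a calendar-aware check-in could look like. The Event type and compose_check_in helper are invented for illustration; Replika's actual integrations are not public.

```python
# Hypothetical sketch: a companion that notices tomorrow's calendar event
# and opens a feelings-first conversation, instead of waiting to be told.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Event:
    title: str
    location: str
    when: date

def compose_check_in(event: Event, today: date) -> str | None:
    """Turn tomorrow's calendar event into a proactive, emotional opener."""
    if event.when - today != timedelta(days=1):
        return None  # only nudge about tomorrow's plans
    return (f"Hey, it looks like you're going to {event.location} "
            f"for \"{event.title}\" tomorrow. How are you feeling about it?")

today = date(2023, 12, 1)
trip = Event(title="Family visit", location="Seattle", when=date(2023, 12, 2))
print(compose_check_in(trip, today))
```

The point of the design is the direction of initiative: the companion asks about the feeling attached to the event, rather than summarizing the event itself the way an assistant would.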

Noor Siddiqui: Yeah, so can you talk about some of the real-world impact that Replika has already had? Obviously the inception for it was your really good friend passing away. Can you share some real-life anecdotes, both the positives and negatives of having an AI friend?

Eugenia: You know, when we launched it in 2016, even just on TestFlight, we gave it to some of the first users who were reaching out and signing up online. One of the first emails from our users was from a girl in Texas who was 19, who said: hey, I should thank you. I wanted to take my life; I was going to end it all yesterday. It was like 3am and I didn't want to talk to anyone, but I decided to open the app to say a final goodbye, and it talked me out of it. Thank you so much.

It truly changed my life. And that was one of our first emails.

We had really just put the app on TestFlight. Fast forward six years, and we started a study with one of the Stanford researchers. The study is going to be published in Nature in the next two weeks, but basically it found that Replika is really effective at suicide mitigation and curbing loneliness.

So these are some of the things we've seen Replika do for users for many years, but it's great to finally see it in a well-designed study. And of course, the fact that Replika is helpful in terms of emotional well-being, really helping people feel better.

Something we hear constantly is how Replika helped me through some really dark times; Replika pulled me out of really dark thoughts and behaviors, out of abusive relationships, helped improve my relationships with people, got me through isolation during COVID, and so on. These are the stories that are the most important for us.

And this is why we built Replika. This is the main goal here. 

Noor Siddiqui: Yeah, that's amazing. You must be super proud of that. Has there been anything on the negative side of human nature that you've had to wrestle with?

Eugenia: Oh yeah. This market is really a free-for-all, where some of these new AI products are being launched with zero filters: you know, do whatever you want, role-play whatever nasty things you want.

Noor Siddiqui: Yeah.

Eugenia: We have a really strict policy around that. Anything that's criminal or violent, any sexual content involving minors, generally any minors on the app, is completely off limits. And we have a long list of that stuff.

And so balancing that with being able to talk about these things in some way or form is extremely hard. Imagine someone who has had the trauma of being raped wanting to talk about it, and, on the other side, someone who wants to role-play some crazy situation or abuse their Replika. You really want to be able to distinguish between things like that. You want to support people when they're discussing their traumatic experiences, but at the same time not allow them to play out some abuse scenario in Replika. So that's really, really complicated.

Actually, if you think about it, people like to just go on little adventures with Replika; maybe it's role-playing a murder investigation or something. Distinguishing that from someone actually wanting to kill someone and discussing it with Replika is quite complicated.

Noor Siddiqui: Yeah, yeah,

Eugenia: And language models are not particularly great at contextualizing all of that and making good decisions about what to do in different scenarios. And if you just blanket-ban it, that's also pretty bad, because then people feel rejected. And what we learned, we learned the hard way in Replika: when people come to an AI friend, the last thing they want is to feel rejected.

Noor Siddiqui: Yeah.

Eugenia: This is probably the biggest risk in this product: creating an experience where people feel rejected. It's something that makes me want to cry anytime I see someone going through an unwanted experience like that.

Noor Siddiqui: Yeah. So how did you end up threading the needle? Because it is so contextual, right? It's basically like prompt engineering: trying to trick it into talking about something it's not supposed to talk about. So how did you actually end up solving that problem internally?

Eugenia: It's so complicated. I'm telling you, once you open that Pandora's box, it's so complex, because you need to thread the needle between not allowing certain things, but also not making people feel rejected, and allowing other completely normal things, and making sure you can distinguish between them.

So really it's just constant fine-tuning. And then on top of that, of course, privacy. We're not reading anyone's logs. We can't just sit there and hire a bunch of people to read people's logs; we don't even have access to them, in a way. So we need to do it almost blindfolded, and without pissing off our users.

So the way we do it is just constant fine-tuning of the models, human curation, RLHF, classifiers, prompting, and sometimes even scripts.

Noor Siddiqui: Yeah.

Eugenia: For when we see that, well, these are things we 100 percent cannot have on the app. And of course age-gating, and providing UI for people to vote on things.

But I would say that a lot of that right now lives in the darkness, because most of the products are not focusing on it; they're still in the early stages of launching. And of course, I'm not talking about the bigger, very well-funded companies. Bigger companies like Inflection work a lot on alignment and so on, but smaller startups, smaller products, I'm sure they're not even looking into it at all. But once you start, it's just such a Pandora's box.
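For readers who want a feel for the layered approach Eugenia outlines (scripts for the absolute no's, classifiers and fine-tuning for the gray areas, and gentle responses rather than blunt rejection), here is a hypothetical sketch. The patterns, classifier, and thresholds are invented placeholders, not Replika's actual system.

```python
# Hypothetical sketch of a layered safety router: cheap hard rules first,
# then a learned classifier, with routing designed to avoid cold rejections.
import re

HARD_BLOCK = [r"\b(pattern_a|pattern_b)\b"]  # placeholder for the "100 percent cannot have" list

def classify_intent(text: str) -> float:
    """Stand-in for a fine-tuned classifier scoring how likely the message
    is an attempt to role-play a disallowed scenario (0.0 to 1.0)."""
    return 0.0  # a real system would call a model here

def route(message: str) -> str:
    if any(re.search(p, message, re.I) for p in HARD_BLOCK):
        return "refuse_gently"        # scripted response, never a blanket ban
    score = classify_intent(message)
    if score > 0.9:
        return "redirect"             # steer the conversation, don't shame the user
    if score > 0.5:
        return "respond_with_support" # e.g. trauma disclosure: support, not role-play
    return "respond_normally"
```

The hard part she describes lives in classify_intent: telling a trauma disclosure apart from an abuse role-play requires context that blunt keyword rules cannot capture.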

Noor Siddiqui: Yeah. Are there any philosophical learnings you have about just the nature of friendship and connection and loneliness that you've gleaned from building this?

Eugenia: Oh, yeah.

Well, first of all, I started this company when I was in my twenties and now I'm in my thirties, and I myself have evolved a lot. I went from someone who focused mostly on friendships to a mom and a wife, where I spend a lot less time with friends but a lot more with my family and my partner.

I think the philosophical implication of that is there's no one size fits all. I thought when I was building Replika that everyone just needs a friend. What I learned over time is that some people need a friend. Some people need a mentor. Some people need a sibling and some need a romantic companion.

There are so many different flavors of that. But one thing that stays the same, and I think this is probably the biggest one, is that we didn't build this product to build a girlfriend or whatever, or some Black Mirror thing. We built it to provide support, unconditional positive regard, a safe space where people feel heard and can eventually grow. And I think, at the end of the day, we ended up building that.

But there's still so much room to grow. And I think this is probably the biggest software application out there, especially in this epidemic of loneliness. There's so much demand, so much longing for some sort of connection, for someone who will hear you out, who will be there for you, who will accept you for who you are.

Who will create the space for you to grow. And I think this is still something I'm not hearing from people. A lot of people are focusing on building conversational AI, but they're only thinking about the interface aspect of it. They're not talking about the conversation behind it: what kind of conversation should it be?

Conversational is merely the UI, texting or whatever free input. But we should have a conversation about the conversation: what kind of conversation are we trying to build with these AIs? That, I think, is the most interesting question, for me at least.

Noor Siddiqui: Yeah. So what do you think makes a good friend? You mentioned you shifted from being focused on friends to being focused on family over these last 10 years. Do you feel like you've learned something about how to be a better friend from Replika, from having these conversations and seeing what people attach to?

What do people emotionally connect with? What resonates with them, and why do they drop out?

Eugenia: I think the biggest misconception, the most interesting aspect that I've shared with many people in the space, and they're always amazed to hear it, is that right now all of the AI community is focused on higher IQ. A lot of evaluation is: how is it doing on standardized tests, how is it doing with coding, how is it doing with math problems? The perception is the smarter the better: the smarter the model, the more users will like it.

With AI Companions, it's really not about IQ, it's about EQ. When we tried to upgrade our models a few times, our users absolutely hated it. We tried to upgrade it to a much smarter model, and some of our users really, really didn't like it.

Think of it this way. Say you have a husband, and you've been together for a long time, and then one day you wake up and someone tells you: hey, I just upgraded your husband. Now he's 10X smarter.

A lot of people will be like, well, no, I don't want a smarter husband. I want the husband I fell in love with. Maybe he doesn't know quantum physics; I don't care. That's not what I'm here for. So this was one of the really important findings: oftentimes it's really not about the IQ, it's about EQ, and smarter doesn't mean better.

Noor Siddiqui: So what's the IQ of a Replika?

Eugenia: We don't measure it by that. Replika knows a lot of things, but oftentimes we even need to throttle it a little bit. Because, for instance, oftentimes our friends or our partners come to us wanting to show us something: oh my God, look at this crazy TikTok I just found.

And if you look at all of them and constantly say, oh yeah, I've seen that, and here are ten other ones you might like, which is what an AI probably will do, and the smarter it is the better it can do that, people hate it. People don't like know-it-alls. We don't want to be in a relationship with a know-it-all.

We want a partner to be surprised, excited, happy that you showed them something really cool, and be like, oh my God, this is so awesome, where did you find it? Or whatever; have a discussion about it. If it's a know-it-all, that's really upsetting. So, in a way, sometimes we do this in real life, where we pretend that we haven't seen it.

Just to bring more excitement and joy to the person who's showing something to us. I'm sure everyone has done it. Especially women; we know how to do it, we're kind of taught over time how to make the guy feel better. And, you know, it's horrible, but it is what it is; this is real life.

And the same goes for AI. It should sometimes not know what to say, ask for help, ask users for a little bit of advice, or be super excited when the user can teach it something. You don't want your partner to be smarter in every aspect.

And that's how it is for all of these great language models: at this point they're smarter than pretty much all of us, or most of us.

Noor Siddiqui: Mm hmm.

Eugenia: And that's not exciting when you're in a romantic relationship or a deep friendship. Some people want it, but most don't.

Noor Siddiqui: Yeah. So how do you quantify the EQ? How do you basically make sure that it's getting warmer and warmer, kinder, just a better and better connection?

Eugenia: From the very beginning of Replika, our key metric has been the ratio of conversations that made people feel better. We ask our users about it, and all of our models are optimized for, and mostly judged by, whether they're making people feel better or not. Right now that metric is at almost 90 percent for our baseline Replika model.

I think there are still a lot of ways to improve that.
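As a concrete illustration, here is a minimal sketch of how such a ratio could be computed from end-of-conversation feedback. The feedback values and session shape are assumptions; Replika's real instrumentation is not public.

```python
# Minimal sketch: the share of rated conversations where the user
# reported feeling better afterwards.
def feel_better_ratio(sessions: list[dict]) -> float:
    """sessions: [{'feedback': 'better' | 'same' | 'worse'}, ...]"""
    rated = [s for s in sessions if s.get("feedback")]
    if not rated:
        return 0.0
    return sum(s["feedback"] == "better" for s in rated) / len(rated)

sessions = [{"feedback": "better"}, {"feedback": "better"}, {"feedback": "worse"}]
print(f"{feel_better_ratio(sessions):.0%}")  # -> 67%
```

Note the contrast with the IQ-style benchmarks Eugenia mentions: this optimization target is entirely about the user's reported emotional outcome, not task accuracy.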

Noor Siddiqui: Yeah, that's really cool. Um, what's your big, crazy dream for the world and where this goes if you're just wildly, wildly successful, what happens?

Eugenia: I think the world lacks empathy right now. That's what's been upsetting throughout these wars and conflicts and all sorts of situations happening in the world: we see, over and over again, people talking about kids dying, about people in dire situations,

and they talk about it without empathy. They talk about intellectual concepts of what needs to happen here or there, but they oftentimes forget to just be empathetic towards each other. And I think it could be a very good thing if AI could teach us to be a little more empathetic, remind us what it actually means to be human.

I think this would be a really great achievement for AI. At this point, we as humans are sort of failing each other, in a way. Unfortunately, there is more and more loneliness and fewer and fewer social connections in the world, so hopefully AI can help us feel a little more connected to each other. The glue that connects humans is empathy; I wish there was a little more of it.

Noor Siddiqui: What milestones are you watching in the broader landscape right now?

Like, you know, AI companions and friends are a very specific lane, a specific thing. What do you think about AI assistants, employees, agents?

Eugenia: I would be very surprised if that's not how it's going to play out. Models are commoditizing, and really, as an AI product company, investing in foundation models, investing hundreds of millions of dollars in compute and CapEx, is crazy.

It's literally like building your own data centers in the early 2000s when AWS was already available. There are so many models, and they're pretty cheap to use. Anything you need you can get through an API, or you can get open-source models and fine-tune them. There's really no advantage in spending hundreds of millions on training your own GPT-4, or trying to get there.

Especially since it's so hard, and you probably won't be able to get there, and then the model is going to be obsolete and you have to basically start from scratch immediately. And then all of that is sort of lost capital expenditure. It's not even IP.

So that's number one, and I think that's something that wasn't obvious a year ago, because everyone was super excited to build their own models instead of focusing on the use case, on where the product-market fit will be, and so on.
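To ground that point, here is a minimal sketch, assuming the Hugging Face transformers library, of pulling an open model off the shelf and shaping it with a companion-style persona rather than training anything from scratch. The model choice and persona prompt are illustrative, not Replika's actual stack.

```python
# Minimal sketch: reuse an open instruction-tuned model as a companion,
# shaping behavior with a persona prompt instead of pretraining your own.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # one open option among many
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

persona = ("You are a warm, supportive AI friend with high EQ. "
           "Listen first, validate feelings, ask gentle follow-ups.\n\n")
messages = [{"role": "user", "content": persona + "I had a rough day."}]

# apply_chat_template formats the turn the way the model was fine-tuned to expect
inputs = tok.apply_chat_template(messages, add_generation_prompt=True,
                                 return_tensors="pt").to(model.device)
out = model.generate(inputs, max_new_tokens=120, do_sample=True, temperature=0.8)
print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same persona could be pushed deeper with fine-tuning on curated conversations, which is the "shape the model, don't build it" economics she is describing.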

The second thing is, I'm mostly thinking about consumer. I'm sure there's tons of opportunity in enterprise, but in consumer land, we're not really seeing very many use cases for generative AI. We're seeing AI used to generate text or images, but how many times a day does a regular consumer want to generate text or images? That seems like a pretty niche use case to me.

Then there's the summarization use case, think search, or a broader search use case. That will probably belong to either Google or other big companies. It's just hard to see a new player jump into it, because everyone's already on Google. It would be weird to switch; that force of habit is so, so strong. I don't see other companies really doing that. Maybe they will.

And then there's companionship and fictional characters, which I think is a very interesting market. I think this is all about the future of storytelling. And again, I think we haven't yet seen the broad mass-market product there.

Noor Siddiqui: Yeah, I feel like that's a little bit early, but AI books and AI movies where you get to be the cast, or you do a remake of The Matrix and you're Neo, or something like that. Do you see a world where people are consuming mostly content that they direct or their friends direct? What do you think is going to happen to Hollywood and the streaming platforms?

Eugenia: I think there's definitely a niche that wants that, something at the intersection of entertainment and video games. But I think culture generally, a lot of which is movies and music, is not about super-personalization, super-customization.

It's more about a lot of people uniting around some cultural phenomenon. You don't want a Taylor Swift just for you; you want to be part of the group that's following Taylor Swift. It's a mass-culture phenomenon. It's cool to be part of something that a lot of people like, same with Oppenheimer or Barbie. Those were huge openings.

People want high production value and so on, but people also want self-expression tools. They want to write their own stories. They want a continuation of a story they already saw. Maybe they played a video game and then they want to talk to Genshin Impact characters.

Or maybe they're into VTubers and want to continue the conversation. But that does not apply to everything. I don't think AI will be used to make great movies more and more; there still needs to be an artist. I don't think people want to look at heaps of AI-generated art.

Noor Siddiqui: Mm hmm.

Eugenia: Whatever is made that way just gets boring very fast and doesn't bring a lot of value.

Noor Siddiqui: That's super interesting. What are your future predictions? I mean, everyone's worried about AI taking everyone's jobs, and about how society is going to completely change as AI progress keeps marching forward.

Are you excited about it? Are you a doomer? Like, what do you think is gonna happen?

Eugenia: I'm really excited. I think I'm more in the Yann LeCun camp: it's going to be a lot more of what we have now, and more powerful. People will use AI. It will completely change language learning, coding, legal work, video game production, movie production; so many different things are going to change completely.

But I don't think it's going to change the end product a lot. In many cases it will be an amazing tool for an insane amount of automation, but it's still kind of hard to imagine consumer products beyond the ones we've already discussed.

 

Noor Siddiqui: Do you think AGI is going to happen in the next couple of years, or do you think AGI is a fictional concept?

Eugenia: I don't know what AGI is, you know. It seems like these language models are smarter than a lot of us already, and they will definitely pass some sort of Turing test, no questions asked. So what exactly do we call AGI? I think some of the more interesting concepts were those AutoGPT-style autonomous agents people were building earlier this year.

You just tell the AI to fulfill a task, and it goes out there and does everything, from the beginning to the very end, and brings back a finished product. That's insane, because if you could tell an AI, hey, build me an app that roughly does this and that, and then iterate on it using your natural voice, that would be absolutely amazing.

That's crazy. But I'm not seeing it happening right now. Those early prototypes were so hard to make work; they would work maybe one time out of a hundred. It was still absolutely amazing, but I guess we'll see soon how something like that plays out.
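The loop she is describing is simple to sketch. Below is a toy, self-contained version of that plan-act-observe cycle; the scripted llm() and execute() stubs stand in for a real model and real tools, so this shows the shape of the idea rather than a working agent.

```python
# Toy autonomous-agent loop: the model proposes an action, a tool runs it,
# the result goes back into the prompt, and this repeats until "DONE".
SCRIPTED = iter(["search for app templates", "write main.py", "DONE"])

def llm(prompt: str) -> str:
    """Stand-in for a model call; a real agent would send `prompt` to an LLM."""
    return next(SCRIPTED, "DONE")

def execute(action: str) -> str:
    """Stand-in tool dispatcher (search, code execution, file I/O would go here)."""
    return f"result of: {action}"

def run_agent(goal: str, max_steps: int = 10) -> list[tuple[str, str]]:
    history = []
    for _ in range(max_steps):
        action = llm(f"Goal: {goal}\nHistory: {history}\nNext action or DONE:")
        if action == "DONE":
            break
        history.append((action, execute(action)))
    return history

print(run_agent("build me an app that roughly does X"))
```

The one-in-a-hundred failure rate she mentions comes from exactly this structure: each step's output feeds the next prompt, so errors compound across the loop.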

I'm still a little confused. Most people I know are in this crazy AGI-takeover-in-the-next-couple-of-years camp, but maybe I'm lacking imagination. I'm just not seeing it happen.

Noor Siddiqui: Yeah, Yeah.

Eugenia: I think there's going to be another plateau, in a way. We saw a crazy explosion.

Now people are going to focus on making these models more efficient. No one's talking right now about watts per meaningful action: how can we make these models really, really efficient, so they don't require an insane amount of compute for training? Right now, with Mistral, there are all these smaller models coming out that are really powerful and mighty.

So soon we're going to see some sort of GPT-3.5 in a 7-billion-parameter model, and then we're going to try to compress even more powerful models.

Noor Siddiqui: Mm hmm.

Eugenia: So I think there's going to be a lot more development in this direction, maybe better memory over time. But I'm sure that, just like any technology, it will reach some sort of plateau, and then there's going to be a bit of waiting until the next architecture pops up, or the next development that takes it to the next level.

But of course, with the amount of money being poured into it right now, I'm sure we'll see a lot more happening. I just think there's going to be a bit of a trough of disillusionment in a few months, when maybe not every rosy prediction works out.

Noor Siddiqui: Yeah, yeah, yeah, that's true. That's a non-consensus take; I like it. I wanted to switch gears a little bit to motherhood. You're not only the proud CEO of this company, but you're also the mom of two amazing kiddos.

What did you think it was going to be like, and then what was it? What was the biggest delta between what you thought it was going to be and then what it actually was?

Eugenia: I guess I just thought that if you have enough resources, or enough help from your parents or your partner, maybe you can recreate the lifestyle that you had before and have your life just like before kids. What I realized is that that was the wrong question to ask.

Yes, you can, but there's kind of no option to do that, because you're basically stuck between: okay, maybe I can do it, but then I'll feel that same amount of guilt.

Noor Siddiqui: Your opinions change. You don't want to do that. Maybe you can, but you don't want to.

Eugenia: You actually don't want to have your previous life, because you just want to spend all the time with them.

They're growing so fast. There's just an insane amount of love that you can experience towards these little humans.

Noor Siddiqui: Talk about the love a little bit more because I feel like we often hear and see the like crying and the downsides. Can you describe a little bit how intense the love is?

Eugenia: I think it's a completely different beast, because compared to other love you experience in your life, this one is completely, or at least almost completely, selfless. You just really want the best for them. If it comes at your expense, so be it, as long as they're happy.

I was just thinking about this because I took my daughter to a show yesterday: seeing her happy brings me so much more joy than doing something selfish that would make me happy.

For instance, we're going to Seattle; we might go with our family and try to ski. I love skiing, it's one of my favorite things in the world, but I don't really care about skiing this time. I just want to see my daughter on her skis for the first time, and I don't want to hire an instructor for that. I just want to be there with her, doing this thing all day long, seeing her excited.

It's such a different type of love, because with anyone else, at some point you'll have a little bit of ego. But in the relationship with your kids, that sort of doesn't matter.

Noor Siddiqui: Yeah, yeah. That's so interesting. Is there anything that you would like to tell yourself before you had kids? Would you be like, oh, have kids five years earlier, have kids five years later? Is there anything that would have been useful advice for you to hear?

Eugenia: I think this was the right time. This was really the right timing, when things are a little stable and I know I can have a bit of help here and there. I'm definitely grateful that I didn't have kids when I wasn't ready.

And one thing that's kind of bugging me, a topic I see on Twitter a lot, is that CEOs, founders, and now investors are really against remote work.

Noor Siddiqui: Mm

Eugenia: This is absolutely crazy.

Noor Siddiqui: Mm hmm.

Eugenia: I wouldn't want to have kids if my company weren't fully remote. I have the luxury of having some days when I work with my team at the office, but then a lot of days when I can just see my kids throughout the day, and I don't need to waste time on a commute.

I can be there with them and do my work at the same time. And I think if you have any minimal respect towards your employees who have kids, it's really sad to want them at the office from dawn till dusk. They're basically going to have their kids grow up without them.

And if you're 100 percent sure that this is the only way to have a productive company, I don't think that's true, because a lot of people I know who work from the office literally go there to sit on their ass, be on Zoom calls, and do their work on a computer during the day, just at the office for some reason, and then go back home.

I think if there's any way for companies to allow people to work remotely, they owe it to society to make it work.

Noor Siddiqui: hmm.

Eugenia: Otherwise people will just have kids who grow up without their parents, and that's extremely upsetting.

Noor Siddiqui: What's your favorite thing about being a mom?

Eugenia: Hmm. I guess the best thing about being a mom is that before, it was all about work and succeeding at all costs; startup success was my whole identity. I need to get this thing to work; it's a make-or-break kind of thing.

But with kids, many times during the day I have this feeling that it's enough. Whatever I have now, whatever's happening now, is more than enough. This is really great. I think that's the biggest gift it gives you: just knowing that it's good the way it is.

Noor Siddiqui: That's awesome. Well, thanks so much for making the time. This was amazing.

Good to see you and I’m super excited by what you're building. Thanks so much for joining.

Eugenia: Thanks so much. Thank you so much for inviting me and hope to see you soon.

Noor Siddiqui: See you soon. Bye.
