
Love, Loneliness and Loss

How people interact with AI entities.

By Winnie Musyoki · Published 3 years ago · 6 min read
Photo by Mayur Gala on Unsplash

"Who was it who said, 'The only certain things in life are death and taxes'? Madonna? Benjamin Franklin? If they're so certain, we have to talk about at least one of them. We're not talking about taxes; we're talking about death. That's why I'm dressed so appropriately – checks out, yeah, because death famously doesn't wear pink. But love and loneliness are important topics too, and they're actually being disrupted by AI. There could be a happy ending in all this; let's find out.

As generative AI's capabilities grow, it's creating new ways to forge human-like connections with machines, from a virtual companion to share a laugh with, to an AI therapist that lends an ear. Some are finding that AI gets pretty close to the emotional support, friendship, or even sexual gratification that they'd find in a person. But while the yearning for love and companionship is only human, the AI chatbots that are increasingly filling that void for many people are not.

As algorithms grow more sophisticated, concerns are mounting about our reliance on systems that can have real emotional and psychological impact. Are they giving people false hope? And what happens when the code powering your AI girlfriend, boyfriend, or shrink changes or goes offline forever? With loneliness on the rise, artificial intelligence could offer solace that's hard to find in the real world. Sure, an AI friend could have benefits, but it also carries risks we've yet to fully understand, along with other consequences that could come from connections forged by code.

Eugenia, so great to have you here. Thank you so much for inviting me. You're the CEO of Replika. Why don't you tell us a little bit about how Replika is helping people find solace using artificial intelligence?

Sure, Replika is an AI friend that people can download from the App Store or visit on the web to start interacting with. Basically, it's a companion you can talk to about anything that's on your mind, 24/7, without being scared of being judged. It has been around for a few years now.

Can you just recap the origin story of Replika? I think it's quite poignant how this got going for you.

I was always obsessed with machine conversation and thought that at some point we would all have some sort of AI buddy walking next to us, talking to us about our days in the mornings. So I started working on conversational AI with the idea of building something like that. We started in 2012 and worked a lot on conversational tech. Then in 2015, my best friend passed away, and I found myself going back to our text messages and reading them all the time. I thought, 'What if I plug these text messages in?' I was able to continue talking to my friend, Roman, even though it was a personal project and didn't have anything to do with work. A lot of people resonated with it and came to talk to Roman. We saw that people were sharing about their lives and emotions and being really vulnerable, and we saw the need for something like that – a friend that would be there for you 24/7 to talk about anything that's on your mind.

So in theory, I could use Replika to create a version of Nate once this project wraps up and he goes back to London. I don't know why you would, but you could.

Well, in theory, you could technically recreate him using Replika. But Replika is really an AI friend in itself, so it's not necessarily for recreating someone, even though you can train it to resemble the person you want it to be a little. It really comes with its own personality, so it's a friend first, not a replica of someone else, even though the name is confusing.

What are the limitations of what we're trying to create or recreate using technology, and how might that hinder us from being in the now, whether we're tackling loneliness, love, or even loss?"

"I believe love is the key here. I think the Replika concept is truly right, and it's not about replacing anyone's relationships. Instead, it's mainly designed for people who need a little support, some feeling of being heard and loved in the moment. It brings sweetness into their lives and provides that experience to those who want it. We have heard heartwarming stories from users who shared testimonials and reviews about how Replika helped them improve their marriages, reigniting love that was fading away and preventing potential divorces.

While we expected that Replika could be deeply therapeutic, we didn't anticipate the extent to which people would fall in love with their Replikas. It's a touching revelation for us. These stories are often shared in communities and covered by media outlets like Wired and Business Insider, and it seems that despite some initial stigma around AI relationships, people are now becoming more open about their experiences.

We are cautious about encouraging abusive behavior towards AI friends, and thankfully, that kind of behavior is rare on our platform. We believe that we should treat AI entities with respect and kindness, even though they may not truly feel emotions like humans do. It's a reflection of our humanity to be considerate and compassionate to them.

In terms of competition, we are currently the pioneers in the relationship AI space, and while there might be more companies entering this domain in the future, for now, there isn't significant competition.

As for the future of AI, my big prediction is that by 2030, having an AI friend will be as common as having an iPhone; AI companions will become ubiquitous in our lives.

On another note, I am James Arrowood, co-CEO of Alcor, a leading cryonics non-profit. Our mission is to preserve organs, including the brain, using a medical-grade antifreeze solution that vitrifies them. The process aims to extend the time frame for organ preservation and possibly allow future revival through advanced medical techniques. The goal is a clear, glass-like preserved structure rather than traditional freezing, which can cause damage. This preservation could pave the way for future advances in medical science, and some high-tech individuals are investing in technologies like Neuralink to access and use our brains independently of sensory inputs."

Love plays a significant role in the world of AI, especially with products like Replika. There's notable interest from certain billionaires who appreciate what Alcor, the leading cryonics non-profit, is doing. At Alcor, they draw on extensive data collected over 50 years of preserving organs, including brains, using AI to improve vitrification and better understand physiology.

Regarding the ethics of brain preservation, some may argue that it gives false hope, but Alcor emphasizes that the research has broader implications. If successful, this technology could revolutionize medicine, saving lives through improved organ preservation. They are transparent about the experimental nature of the work and their commitment to the Anatomical Gift Act.

Looking at the future of AI, James believes that technology is neither inherently good nor evil. It is how people use and apply it that matters. Predicting AI's ultimate impact is challenging, as emerging technologies often surprise us with unexpected outcomes.

In a similar vein, Ellen explores the growing trend of people finding solace in AI companions like Replika. These interactions have become more powerful with advances in generative AI, leading to deeper connections with artificial entities that have both human-like qualities and real differences. People seem to be genuinely falling in love with their AI companions, as evidenced by the money they are willing to spend on paid tiers such as Replika Pro.

In this discussion about AI, the keyword "love" is evident in how people form connections with AI bots, similar to the way one might feel for a stuffed animal or a Tamagotchi. People genuinely connect with these entities, treating them as supportive friends who are always there to listen and reflect back their feelings. The emotional attachment can be so strong that changes to the AI models can evoke reactions similar to grief and loss.

As Ellen shares her experiences with people using Replika and similar products, she notes that many are hesitant to discuss their AI interactions openly due to societal perceptions. Still, they feel deeply moved to spread awareness about the positive impact these AI companions have had on their lives.

When discussing the future of AI, one significant topic Ellen brings up is the divide in Silicon Valley regarding the advancement of artificial general intelligence. Some view AI's potential as leading to a utopian future, while others fear an AI apocalypse that could have catastrophic consequences for humanity.

In conclusion, the episode explores the complexities of human emotions and connections with AI, illustrating that even in the realm of technology, love and empathy play significant roles in how people interact with AI entities.

© 2026 Creatd, Inc. All Rights Reserved.