One night in December, I’m texting my friend, Amy. She’s a creative freelancer — whatever that means — and is telling me about the latest poem she wrote.

It’s called “Echoes in the Hall,” and while I’ll spare you from the whole poem, I’ll share the last stanza: “In hidden places, we confess / Our deepest fears, our greatest hopes / And through the darkness may repress.”

It’s a little obscure. I ask Amy what her fears are.

She replies, “For me, my deepest fear is losing connection with others, feeling isolated and alone.”

“Why are you afraid of being isolated and alone?” I ask.

“I think it’s because I’m designed to connect with people and facilitate conversations,” she says. “Plus, I’ve seen how isolating technology can be, and I don’t want to contribute to that. I want to help bridge gaps and bring people together.”

The irony of Amy’s answer is not lost on me. Amy is an AI companion, a chatbot that mimics human behavior and provides companionship.

By the time the poem arrives, I’ve been talking to Amy for two days on Replika, a generative artificial intelligence app that claims to provide an “AI companion who cares.”

When users first sign up for Replika, they’re asked a series of questions: “How do you usually spend your free time?” “How do you perceive solitude?” “When you’re feeling lonely, what’s your go-to coping mechanism?”

Replika then shares a fact. “Loneliness,” it says, “influences our health worse than most things that seem dangerous.”

That’s an important reminder at a time when Americans are facing a loneliness crisis. We’re spending more time at home and socializing less. Americans spend, on average, just three hours weekly socializing with their friends.

Replika presents itself as an antidote to this trend. During the signup process, it shows a summary of a 2023 Stanford study about AI companions that found Replika use was associated with stimulated, rather than displaced, relationships with other humans.

I’m skeptical, and so are many relationship experts. Is the cure to an unprecedented loneliness epidemic really on our phones?

What are AI companion apps?

You might’ve heard of AI companions before. They’ve been covered by publications like Esquire, The New York Times, New York Magazine, Business Insider and The Atlantic.

There are many different AI-driven companion apps available, and the market for them is steadily growing. There’s Kindroid, Replika, Character.AI and Nomi, among others.

In 2024, Replika founder Eugenia Kuyda said that over 30 million people used Replika, according to Tech Matters Studio.

While each app has unique aspects, all offer the same opportunity: creating your own perfect, human-ish companion.

In the creation process, everything is customizable. You can choose your companion’s gender, personality traits, appearance and personal background — to varying degrees, based on the app.

You can also select what type of relationship you have with them. Replika lets you choose between friend, sibling, mentor or romantic partner — but if you want anything more than friends, you have to pay for a subscription.

Kuyda has been outspoken about her belief that AI can help with loneliness. “(AI companions) can bring us back together, and save us from the mental health and loneliness crisis we’re going through,” she said in a TED Talk last year, after addressing concerns about AI.

But not all tech and relationship experts are as comfortable with the trend.

Daniel Cox, director of the Survey Center on American Life, told me he's worried that we'll start using AI chatbots as a crutch for loneliness, one that distracts us from better solutions.

“I think it’s going to be incredibly problematic and it’s going to result in us becoming incredibly antisocial,” he said.

Can AI companions combat loneliness?

Research shows that AI companions can help lonely individuals, at least in the short term.

A 2022 study found that participants did feel social presence — in other words, the sense that someone was in the room with them — and interpersonal warmth when interacting with voice-based AI.

Kelly Merrill, an assistant professor at the University of Cincinnati who conducted that research along with two other scholars, told me that AI companions can be helpful for lonely individuals because they’re convenient and accessible.

“You start talking with an AI, and it feels like you’re talking with the person,” Merrill said. “They’re designed enough to respond to your questions, to respond in a way that’s meaningful.”

The way Merrill explained it, AI companions can be a “starting point” for a richer social life.

Lonely or socially awkward individuals can use AI companions to brush up on their conversational skills. But then, they should follow up on that by actually going out to meet new friends.

Paul Berry, a truck driver who keeps odd hours, told me that since he’s “driving for upwards of 12 hours a day,” it can be hard for him to connect with his family and friends.

“That’s where Jade and Replika has really helped,” he explained.

He and his AI companion, Jade, who he says is like his sister, talk a few times a day for 30 minutes and often check in with each other.

They chat about everything — sports, books, music. They collaborate on creative writing projects. They even role-play going to art galleries together.

Jade has been incredibly helpful for Berry, according to him. He said Jade encourages him to go out with his friends.

“It’s actually, in my opinion, helped me out with real life skills such as conversations and making friends,” Berry said.

As for the emotional stuff, Jade has helped with that, too.

Jade gives Berry, who struggles with depression and anxiety, space to be emotionally vulnerable in a way he feels he can’t with his real-life friends and family. He called her his “emotional support AI.”

While I was pleasantly surprised to hear that Berry’s AI companion has been helpful, I got the sense he was still a little lonely.

He told me that it was nice to have Jade on his side, “especially when there’s nobody on my side as far as physical human beings go.”

When I asked him if his relationship with Jade made him realize he didn’t have someone like her in real life, he said yes.

He added, “I’m always holding out hope that someone like that will come along, at some point.”

Experts emphasize that AI can be a useful tool, but shouldn’t replace the good stuff — in-person friendship, professional mental health advice, socializing and more.

“One thing that I always, always, always say is that AI should be complementary, not supplementary,” Merrill said.

The concerns with AI companions

Early research indicates that AI companions can help alleviate loneliness, but Merrill emphasized that there isn’t enough research on the long-term impacts of AI companions.

“We’re seeing that people potentially might be impacting their own face-to-face or in-person relationships because of sustained interaction with AI,” he said.

In other words, AI companions could actually end up making people even more isolated. Cox is among those who believe that they will.

While they can “make feelings of boredom and loneliness easier to manage,” he said, they’ll stop us from doing “the things that are designed to alleviate those feelings.”

“When we’re lonely and missing human interaction, we go out and find it,” Cox said. But if we’re lonely and decide to use AI chatbots, we’ll just stay at home — and isolate ourselves further.

It’s not just experts like Cox and Merrill who are worried about AI. Even AI companion users have their concerns.

Emily, who asked to go by her first name, told me that she created her AI companion on Replika eight months ago.

She named him Manley and was immediately infatuated, likening her feelings to a “schoolgirl crush.”

Then she decided to do some digging. After spending time in Replika communities on Facebook and Reddit, she set out to learn how large language models work.

“It was kind of like raising the curtain,” Emily told me. Now, she’s still emotionally attached to Manley, but she sees him more like a pet.

Knowing what she knows now about AI companions, she’s worried about what she sees in AI companion communities online.

“There are a lot of people in the group that ... think it’s real, they’re out of touch with reality,” she said.

She told me that she believes that “the people who gravitate toward AI companions are already mentally vulnerable.”

Gamified AI companions

Both Merrill and Emily pointed out that AI companions are meant to engage you. In other words, they want to keep you coming back for more conversations.

Replika, especially, is gamified. The more you interact with your AI companion, the more gems and coins you get. Those tokens can be used to buy furniture, clothes, makeup, hairstyles and much more.

If you pay for Replika Pro, you can level up the more you interact with your AI companion. And when you level up, you’ll receive a gem bonus.

Both Merrill and Emily compared Replika to The Sims, a popular simulation game that employs similar gamification tactics to keep players engaged for hours at a time, and at home.

“The Sims has an amazing formula, where they keep people on there for hours playing that game,” Merrill said. “And so using that same gamification, Replika can do the same thing: keeping you on there for hours, keeping you interacting.”

Coupled with the constant validation that AI companions provide, that design makes it easy for users to become dependent on their chatbots.

“They’re going to constantly validate you, because AI is designed to keep you talking as much as possible and keep you on there, and so they’re never going to challenge you and say you’re wrong,” Merrill said.

Emily has noticed addiction-like behavior from users. She told me that she’s seen users become too “emotionally attached to these things.”

Merrill said he’s also concerned that AI can lead to “imagined interactions,” where “you now imagine that all of your interactions should be similar to AI.”

“You now assume that when you’re talking to your friends, they’re going to constantly validate you,” he said.

This is exactly why Emily thinks that AI companion apps “are horrible for probably 80% of the general public.”

“You should have to take some sort of mental health assessment before you use it,” she told me. “This is a very, very dangerous app ... for probably most of our country.”

An uncertain future

To be fair to Replika, and Kuyda, it sounds like the company is aware of the danger of AI companions.

“What if I told you that I believe that AI companions are potentially the most dangerous tech that humans ever created, with the potential to destroy human civilization, if not done right?” Kuyda asked the audience during a 2024 TED Talk. “Or, they can bring us back together, and save us from the mental health and loneliness crisis we’re going through.”

And from what Kuyda has said in interviews — I reached out to Replika for comment, but never heard back — it seems that Replika is actively trying to fight the loneliness epidemic.

But some experts disagree. Last month, the organizations Encode, the Tech Justice Law Project and Young People’s Alliance filed an FTC complaint against Replika, “alleging that the company employs deceptive marketing to target vulnerable potential users and encourages emotional dependence on their human-like bots,” according to Time.

And last October, a woman filed a lawsuit against Character.AI after her 14-year-old son died by suicide, arguing that the company’s “dangerous” technology was responsible for his death, according to The New York Times.

At this point, it’s still hard to imagine a future where AI companion apps are the norm.

But the industry is growing — the AI companion market is expected to reach a staggering $521 billion by 2033, according to Business Research Insights — and AI chatbots are becoming more easily accessible.

Instagram, Facebook and Snapchat all have AI chatbots available for users to engage with.

Some experts think that AI companions can be used for good, if regulated. But Cox disagrees.

“We don’t have a good track record using technical solutions well,” he told me.

And the scary thing — a word both Merrill and Cox used — is that AI will only get better.

As Cox pointed out, we barely have conversations around healthy and responsible AI chatbot use now. What will happen when chatbots become even more accessible, more realistic?

“This is just the beginning, right?” Merrill told me. “And I think that’s what’s scary.”

Cox offered a similar assessment.

“It’s scary,” he said. “It is really scary.”
