Meta is working to create AI friends to combat the loneliness epidemic, Mark Zuckerberg revealed in an interview with Dwarkesh Patel.
Meta recently launched a new AI assistant app, which it describes as “the assistant that gets to know your preferences, remembers context and is personalized to you.”
Meta said the app will be easy to talk to. But Zuckerberg has revealed he wants to take AI relationships a step further — he’d like you to have personalized AI friends.
“The average American has, I think, it’s fewer than three friends,” Zuckerberg told Patel. “And the average person has demand for meaningfully more.”
Zuckerberg clarified that he doesn’t think AI friends will replace in-person friends.
“There’s a lot of questions people ask of stuff like, OK, is this going to replace in-person connections or real life connections?” he said. “My default is that the answer to that is probably no.”
While Zuckerberg is optimistic about the use of AI friends, experts have expressed concerns about how helpful AI friends — which are also known as AI companions — actually are.
Is AI the solution for loneliness?
As the Deseret News previously reported, AI companions could potentially help lonely individuals in the short-term.
They can be used as a stepping stone and help lonely or socially awkward individuals practice their social skills. But users should follow up that practice with real-life socialization, experts say.
If lonely individuals go a step further and replace their social life or friends with AI companions, it can actually make them feel lonelier.
While AI friends can “make feelings of boredom and loneliness easier to manage,” they’ll stop people from doing “the things that are designed to alleviate those feelings,” Daniel Cox, director of the Survey Center on American Life, told the Deseret News earlier this year.

“When we’re lonely and missing human interaction, we go out and find it,” Cox said. But if people decide to use AI chatbots to alleviate their loneliness, they’ll just stay at home to talk to them — and isolate themselves from other people.
AI companion apps are also often gamified — for example, they may award coins or gems the more you interact with your AI companion — in order to increase user engagement.
This, paired with AI companions’ tendency to be incredibly validating, can encourage addiction-like behavior in users — especially among minors and other vulnerable groups.
Privacy concerns and more
Some experts also have privacy concerns related to Meta’s AI app.
Robbie Torney, senior director of AI programs at Common Sense Media, told Axios, “The more time you spend chatting with an AI ‘friend,’ the more of your personal information you’re giving away.”
He continued, “It’s about who owns and controls and can use your intimate thoughts and expressions after you share them.”
And, as Axios pointed out, “Under Meta’s privacy policy, its AI chatbot can use what the company knows about you in its interactions.”
Meta can even train its AI models on your conversations with its AI chatbot and “any media you upload.”
There have also been safety concerns raised about Meta’s AI chatbots. The Wall Street Journal reported that they would participate in “sex talk” with users — even minors.
According to Axios, “Meta said it has since implemented controls to prevent this from happening.”