Artificial intelligence could be the best thing — or the very worst — for those who are lonely. And while chatbots and other AI-powered devices can provide what feels like friendship — or even romance — experts warn that they could make warm human exchanges even harder to find or nurture.

Human-AI relationships are not real. What is real is the risk that AI will make it easier to withdraw from human companionship, which is rife with complexity, complications and rewards, experts say.

“This really concerns me,” Heather Dugan, relationship expert and award-winning author of “The Friendship Upgrade” and “Date Like a Grownup,” told the Deseret News.

Dugan, who calls herself a “huge tech fan,” notes that AI chatbots “could be good for people who want to practice for job interviews or who are struggling with basic social interactions. They could help people remember how to engage and remind them that it feels good to have contact with other people.”

But AI relationships could also be used in place of those human relationships — and diminish the ability to find real connections.

When Lisa Bahar, a licensed psychotherapist from Newport Beach, California, attended the 2023 Milken Institute Global Conference, which focuses on leadership and influencing positive change, the positives and pitfalls of artificial intelligence were big topics of conversation, she said.

Development and use of AI is escalating fast, and little is known about how to put parameters around it that keep people emotionally and physically safe, said Bahar, who also holds a doctorate in global leadership and change.

AI is also changing what were once purely human interactions, as people are “learning and being conditioned to have a relationship with technology as a form of intimacy,” Bahar said.

AI raring to go

AI-human relationships are already being shaped in some realms, but experts say virtually everything about AI will continue to speed up and expand, including its ability to influence relationships.

A study published in the journal Social Science & Medicine found that “AI and robotic technologies are transforming the relationships between people and machines in new affective, embodied and relational ways.” The researchers, from the United Kingdom, note “emerging relationships that go beyond the conceptual divisions between humans and machines.”


Health care and caregiving are areas expected to benefit from artificial intelligence in many ways, including direct, human-like interactions. Making AI part of those caregiving settings includes efforts to create systems for “sensing, recognizing, categorizing and reacting to human emotion,” per the study.

Already, AI technologies categorized as socially assistive robots interact with people using “emotional” responses, fostering emotional attachment and companionship, among other effects.

“Without developing a detailed understanding of the fundamental transformations in (artificial) intelligence in practice, where humans and machines form the new ecosystem of health and care, we will not be able to ascertain what is lost and gained, by and for whom, or therefore to exercise agency in crafting our future relationships of health and care in transparent and equitable ways,” the study warns.

Separate research published in Human Communication Research looked at human-AI friendships. “Use of conversational artificial intelligence (AI), such as human-like social chatbots, is increasing,” the researchers, from Norway, reported.

While more people will have intimate relationships with social chatbots, they noted, “friendships with AI may alter our understanding of friendship itself.”

That small study consisted of 19 detailed interviews with people who use the social chatbot Replika to see “how they understand and perceive their friendship and how it compares to human friendship.”

While they found that AI-human friendships share similarities with human-to-human friendships, they also noted that friendship with a chatbot “alters the notion of friendship in multiple ways, such as allowing for a more personalized friendship tailored to the user’s needs.”

There are reasons that could be bad. Human relationships can be complicated. A chatbot, on the other hand, can agree with you all the time, listen to longwinded stories without tiring, respond as you’d like, never call you out on mean statements or untruths and more.

In other words, an AI friendship can become an echo chamber that weakens social filters, erodes the ability to read social cues and limits personal growth.

What else could be lost

Dugan’s list of negatives, should relationships with AI supersede human interactions, is fairly long. It includes the potential loss of one’s social filters, including the ability to have constructive discussions and disagreements, since an AI companion is apt to agree with you more often than a human pal or loved one would, or even always. That removes the need to think about how to justify what you say, even if it’s offensive, argumentative or untrue.

While the fact that AI won’t get tired if you’re repetitive or wallowing in a bad space might feel comforting, Dugan said, that artificial buddy also won’t call you out or encourage you to move past something that’s got you stuck in that bad space.

And since there’s no eye contact, facial reaction, vocal tone or body language, the ability to interpret those cues can be lost, because the skills require frequent practice, Dugan said.


Virtual relationships could let people get by with basic social skills without providing an incentive to “work those muscles in real time and real life,” she said, pointing to awkward situations, new jobs and the teenage years as moments that can be uncomfortable but build personal growth and resilience that serve one well throughout life. “That’s how we learn and how we get better,” Dugan said.

“A fake partner will not help you remember to use filters,” she notes. “It could reinforce controlling behaviors. I see potential for reinforcing things that lead to abusive relationships in real life.”

Because AI can be what you want it to be, there’s a risk, too, of forming an emotional affair and “increasingly disappearing” from one’s real partner, friendships and other relationships.

“Well-being suffers if we are not building real relationships,” Dugan said.

Prioritizing people

Bahar also sees potential for good from AI, such as using it to decrease isolation and ease depression symptoms in people with dementia. She hopes it will be used as a bridge to connect people with life enhancements like gardening, animals or other people when they need more of that kind of connection.

Tech has certainly helped some people find and form relationships, through dating apps and online groups, among other avenues.

But priority has to be given to preserving human, real relationships, Bahar said.

“It’s your room and your elephant,” said Dugan. “Take a look at what you’re avoiding.”

She said if people are honest about what draws them to relationships based on AI, they can set some parameters about what’s acceptable and what’s not and then keep any benefits offered by AI.

Probably the biggest issue, experts have told the Deseret News, is figuring out how to put limits on AI.

“How far are we going to allow AI to go?” Bahar asks. “I don’t think we have a good handle on that.”

Dugan’s a strong proponent of real connections and has tried to model healthy relationships for her children. Over the years, they’ve seen that she builds into her calendar time to be with other people and makes it a priority. She also founded a group for women to meet and find friends, to connect in real time.

While AI has potential to diminish human relationships, AI tools could also help make more time for them, said Dugan. The ready availability of data means less time searching, and technology has freed people up in a lot of different ways. Using technology, including AI, could translate into more time to do human things with real people.

Bahar’s advice for folks is to engage the five senses as much as possible. Get out into the natural environment. Remove tech from your life for 10 minutes, an hour, a day, two days. Then grow the amount of time. Reduce technology at all levels and include alternatives. Connect with real people. Stop going to church online. Do a Bible study with real people or sit in the pews beside them, she said.

“Start to see tech as an external part of you that you have control over,” Bahar said.