SALT LAKE CITY — The state of mental health apps today is a bit like the "Peanuts" cartoons that show Lucy sitting at a small booth: "Psychiatric Help — 5 cents."
"It's the same thing with apps," says Steven Chan, a psychiatrist who works with the American Psychiatric Association's committee on mental health and IT. "It's not regulated. You can say whatever you want."
And people are.
Experts have found apps asserting that bipolar disorder is contagious or that drinking hard liquor before bed is a good way to deal with a manic episode.
In fact, experts estimate there are more than 10,000 mental health apps on the market today — ranging from harmful to helpful, stigmatizing to supportive.
"The potential of these apps is tremendous," says John Torous, director of the digital psychiatry division in the department of psychiatry at Beth Israel Deaconess Medical Center and current leader of the American Psychiatric Association’s work group on the evaluation of smartphone apps. "But just because the potential is there doesn't mean we can take shortcuts to it."
More than 44 million people in the U.S. have a mental health disorder, and fewer than half get any sort of treatment, creating a "huge need for interventions" and ways to "extend the reach of existing interventions," said Adam J. Haim, a clinical psychologist at the National Institute of Mental Health and chief of its treatment and preventive intervention research branch.
But without a Good Housekeeping-style "stamp of approval," it's difficult for consumers to know which apps may work for them, let alone whether they're evidence-based, he said.
To address that, Torous and a dozen-plus colleagues in this field have proposed a set of standards for mental health apps in four key areas: data safety and privacy, app effectiveness, user experience and data sharing.
Just like a new drug or a new health procedure has to be "tested and proven" before being given to the masses, "as health care becomes more and more digitalized, this same underlying principle should be adhered to as closely as possible," said Joseph Firth, senior research fellow at NICM Health Research Unit in Sydney, Australia, and a co-author on the recent proposal published in World Psychiatry.
The new recommendations encourage app developers to rely on clinical trials, create robust privacy policies, consider the user's experience and make the data easily transferable to professionals — who Torous emphasizes should still be part of any mental health treatment.
Yet not everyone is eager to rush toward regulation, worried that clamping down too soon might stifle innovation and creativity.
"I think it's a little bit early to start what I interpret as policing of the ideas," says Dror Ben-Zeev, professor of psychiatry and behavioral sciences and director of the mHealth for Mental Health Program at the University of Washington School of Medicine, who declined to sign the consensus piece.
"On the one hand, it's good to have some sort of boundaries and parameters," he said. "At the same time ... I don't know that I agree with this idea that we should appoint ourselves the gatekeepers to what people have access to."
Who's regulating?
A recent look through the Google Play store brought up hundreds of mental health apps that promised to help users identify symptoms, relax and meditate, track and reflect on their mood, learn journaling techniques or chat with an AI friend-bot.
A web search for "the best mental health apps" brought up pages offering the "Best Depression Apps of 2018," "7 Apps to Support Your Mental Health and Mindfulness" and the "5 Best Mental Health Apps for Men."
The Food and Drug Administration, which regulates all things medical, acknowledges the massive number of health apps on the market today, but focuses on apps that are deemed "medical devices" or become an extension of one, such as apps that control blood pressure cuffs, monitor blood sugar levels or receive electrocardiogram signals — and which "pose a risk to a patient's safety" if they don't function properly.
Apps that fall below the threshold of a medical device are left to the FDA's "enforcement discretion," and right now the agency isn't touching them. An FDA spokesperson was unavailable for comment due to the government shutdown.
That means anyone who can code could create a mental health app in their basement and distribute it widely with no oversight, says Ben-Zeev.
"That is something that is potentially problematic," he says. "Some of the tools ... might be well-resourced, well-researched, valid and useful, but there's also some emerging evidence that a lot of it isn't. It's very difficult for people to evaluate the quality of what they're getting out there."
Apps often proclaim they're based on science, citing approaches such as cognitive behavioral therapy, which is scientifically validated for treating a range of mental health struggles, including anxiety and depression.
However, there's no guarantee that developers have accurately translated traditional in-person therapy into their app, experts say. It's like saying, "the book was great — you're going to love the movie," says Torous.
Other times, a proven therapeutic approach is used in new ways in an app — ways that haven't been tested, says Patricia Areán, a professor in the department of psychiatry and behavioral sciences at the University of Washington and co-author on the recent consensus paper.
Areán is in favor of standards, but isn't sure the FDA is the best-equipped group to set them, given that the agency has never before evaluated behavioral interventions or psychotherapy.
Other groups reviewing apps right now include PsyberGuide, which describes itself as a "nonprofit website dedicated to consumers seeking to make responsible and informed decisions about computer and device-assisted therapies for mental illnesses." PsyberGuide scores apps for credibility, user experience and transparency, and then links to the research behind each app.
The American Psychiatric Association and the National Institute of Mental Health have taken a different approach, encouraging users to ask questions about an app before using it.
With information, people can better recognize the pros and cons of mental health apps, and then make a decision from there — just like folks do every day when they choose to use a banking app, shop online or post to Facebook, says Ben-Zeev.
"People will use whatever tool is available," he said. "If anything, it behooves us to ... increase the pace of research to get the digital tools out there as fast as they can, so they can compete with these less tested resources, rather than say, 'You’re either going to adhere to our high standards or you’re not going to get anything at all.'"
Building a team
None of this is to say that apps don't belong in mental health therapy.
In one study, Ben-Zeev found that those with serious mental illness were more likely to begin and stick with smartphone-based therapy compared to in-person treatment, with similar satisfaction levels.
And in health care deserts where getting in-person treatment may be difficult or impossible, experts know an app may be someone's only link to help.
However, Torous is excited about the idea of in-person care that's supported by an app, which is why he's piloting a "Digital Mental Health Clinic" this spring at Beth Israel Deaconess Medical Center, using apps to "strengthen the relationship" and increase understanding between patient and clinician.
Liza Hoffman calls this approach the "tool kit," and it's something clinicians are using across the 12 primary care clinics within Cambridge Health Alliance, a health care system located just outside Boston.
After months of evaluation and patient feedback, Hoffman, who is a clinical social worker, and CHA's mobile apps workgroup winnowed down the mountain of apps to four that they felt would support the work clients were already doing with clinicians. Their tool kit also includes a relaxation phone line for those without smartphones, and three websites with mindfulness recommendations in multiple languages.
"Patients are already turning to digital tools to find relief from whatever mental health condition they're suffering from," Hoffman said. "We owe it to our patients to help guide them in selecting the tools that are most effective or that are most relevant to them."
Along with helping clients, Hoffman also spends time teaching clinicians how to effectively use apps in their care plans — something most therapists didn't learn in grad school.
California-based clinical psychologist Carla Marie Manly embraces the chance to work with clients and their choice of mental health tools — whether it's yoga, journaling or meditation apps. When clients use apps, each appointment includes a review of how they're using the app, how it's making them feel and whether it's helping them reach their goals.
"The benefit to checking in with a therapist on apps is (the client doesn't) become mindless in their use of them," she said.
Using apps under professional purview also means Manly can console patients who become upset when a highly rated app doesn't work for them, leaving them feeling even more "defective." She gently reminds them that every person is different, and that just like finding the right medication, finding the right app may not happen on the first try.
The combination of professional-led, app-supported mental health treatment is working well for Connor Mollison of Glasgow, Scotland, who recently started using the app Woebot in addition to occasional meetings with his psychologist.
The AI bot is friendly and offers helpful reminders about things Mollison can do on his own to manage his anxiety, such as, "Remember you can do this breathing exercise?"
"I’d never see it as a replacement to professional help," he says, "but I definitely find value in what it provides."