
“Hi, Liz! 🙂 How are you feeling?” an incoming text pings.

I click on a pre-generated answer. “Okay, I guess. . .” I’m in the home stretch of a long work trip, and I’ve been stressing about spending time away from my kids.

“If you were to describe your current mood, what kind of an ‘okay’ are you feeling right now?”

“Anxious,” I type.

“I’m here to help you feel more in control,” the bot replies. Nanoseconds later, a meme-ified cartoon gif blinks into the text window: “Don’t let the little worries bring you down.”

This automated exchange launches my dialogue with Wysa, an AI therapy chatbot that now lives in my computer. In leaning on a bot to shore up my mental health, I’m joining the 22 percent of American adults who’ve already done the same—a movement rooted in a dire shortage of trained providers and the recent availability of fast, low-cost online AI tools. Most therapists are perpetually slammed, in part due to the pandemic-era surge in demand for mental healthcare. “Everybody’s full. Everybody’s busy. Everybody’s referring out,” says Santa Clara University psychologist and ethicist Thomas Plante. “There’s a need out there, no question about it.”

With the demand for care outpacing supply, mental health support bots have begun to fill the gap. Wysa, launched in 2016, was among the first. Since then, hundreds of viable competitors, including Woebot and Youper, have been broadly deployed in a marketplace that imposes few restrictions on them.

Standard AI therapy bots don’t require approval from the U.S. Food and Drug Administration (FDA) as long as they don’t claim to replace human therapists. In 2020 the agency also relaxed enforcement procedures for “digital therapeutics” in hopes of stemming the pandemic-related psychiatric crisis, clearing the way for developers to launch popular products claiming mental health benefits. Woebot alone has exchanged messages with more than 1.5 million users to date, according to CEO Michael Evers. Wysa is being used in the United Kingdom to triage those seeking appointments and to offer support to people while they wait to be matched with a therapist. Aetna International is now offering the app for free to members in the United States and elsewhere.

My experiences with Wysa and Woebot mirror the analysis of experts like Plante, who view the rise of AI chatbots with a mixture of optimism and concern. Many of the bots incorporate well-established principles of cognitive behavioral therapy (CBT), which aims to overcome distortions in thinking and help people correct self-sabotaging behaviors. It’s easy, I found, to think of the bots as rational or sentient, making even simple advice feel authoritative. Interacting with a chatbot can also give users the sense they’re being heard without judgment, says Chaitali Sinha, Wysa’s senior vice president of healthcare and clinical development. “It’s such a powerful experience for people who have never had the opportunity to experience that,” she says.

[Image: Futuristic chatbot icon in action with antenna. Credit: da-kuk/Getty Images]

Click the link below for the article:

https://www.scientificamerican.com/article/ai-therapy-bots-have-risks-and-benefits-and-more-risks/


__________________________________________