Can Artificial Intelligence Replace Human Therapists?

Three experts discuss the promise—and problems—of relying on algorithms for our mental health.

By Lisa Ward
Mon, Mar 29, 2021 1:13pm | 6 min read

Could artificial intelligence reduce the need for human therapists?

Websites, smartphone apps and social-media sites are dispensing mental-health advice, often using artificial intelligence. Meanwhile, clinicians and researchers are looking to AI to help define mental illness more objectively, identify high-risk people and ensure quality of care.

Some experts believe AI can make treatment more accessible and affordable. There has long been a severe shortage of mental-health professionals, and since the Covid pandemic began, the need for support is greater than ever. For instance, users can have conversations with AI-powered chatbots, allowing them to get help anytime, anywhere, often for less money than traditional therapy.

The algorithms underpinning these endeavours learn by combing through large amounts of data from social-media posts, smartphone sensors, electronic health records, therapy-session transcripts, brain scans and other sources to identify patterns that are difficult for humans to discern.
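To make the idea concrete, here is a minimal, hypothetical sketch of one common pattern-finding approach: training a text classifier on labelled examples so it learns which word patterns co-occur with an outcome of interest. The data, labels and model choice are illustrative assumptions, not a description of any product or study mentioned in this article.

```python
# Hypothetical sketch: pattern-finding on labelled text, e.g. de-identified
# session excerpts. Data, labels and model choice are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for a real, far larger training set.
texts = [
    "I haven't slept well in weeks and nothing feels worth doing",
    "Work was stressful but the weekend hike really helped",
    "I keep cancelling plans; I just want to stay in bed",
    "Feeling pretty good lately, back to my usual routine",
]
labels = [1, 0, 1, 0]  # 1 = elevated-risk language (made-up label)

# TF-IDF turns text into word-frequency features; logistic regression then
# learns which of those features tend to co-occur with the label.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

new_text = "I can't get out of bed and I've stopped eating"
print(model.predict_proba([new_text])[0][1])  # estimated probability of label 1
```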

Despite the promise, there are some big concerns. The efficacy of some products is questionable, a problem only made worse by the fact that private companies don’t always share information about how their AI works. Questions about accuracy raise fears of amplifying bad advice to people who may be vulnerable or incapable of critical thinking, as well as of perpetuating racial or cultural biases. Concerns also persist about private information being shared in unexpected ways or with unintended parties.

The Wall Street Journal hosted a conversation via email and Google Doc about these issues with John Torous, director of the digital-psychiatry division at Beth Israel Deaconess Medical Center and assistant professor at Harvard Medical School; Adam Miner, an instructor at the Stanford School of Medicine; and Zac Imel, professor and director of clinical training at the University of Utah and co-founder of LYSSN.io, a company using AI to evaluate psychotherapy. Here’s an edited transcript of the discussion.

Leaps forward

WSJ: What is the most exciting way AI and machine learning are being used to diagnose mental disorders and improve treatments?

DR. MINER: AI can speed up access to appropriate services, like crisis response. The current Covid pandemic is a strong example where we see both the potential for AI to help facilitate access and triage, while also bringing up privacy and misinformation risks. This challenge—deciding which interventions and information to champion—is an issue in both pandemics and in mental health care, where we have many different treatments for many different problems.

DR. IMEL: In the near term, I am most excited about using AI to augment or guide therapists, such as giving feedback after the session or even providing tools to support self-reflection. Passive phone-sensing apps [that run in the background on users’ phones and attempt to monitor users’ moods] could be exciting if they predict later changes in depression and suggest interventions to do something early. Also, research on remote sensing in addiction, using tools to detect when a person might be at risk of relapse and suggesting an intervention or coping skills, is exciting.
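As an illustration of the passive-sensing idea Dr. Imel describes, the sketch below compares a week of phone-derived activity against a user’s own earlier baseline and flags a sustained drop that might prompt an early check-in. The mobility metric, threshold and numbers are assumptions made up for this example, not a validated screening rule.

```python
# Hypothetical sketch of passive phone sensing: flag a sustained drop in a
# daily activity score relative to the user's own baseline. The metric,
# threshold and data are illustrative assumptions, not a clinical tool.
from statistics import mean, stdev

# Toy daily "mobility" scores (oldest first); a real app would derive these
# in the background from sensors such as GPS or step counts.
daily_mobility = [
    8.2, 7.9, 8.5, 8.0, 7.7, 8.3, 8.1,   # baseline, week 1
    8.0, 7.8, 8.2, 7.9, 8.4, 8.1, 7.6,   # baseline, week 2
    4.1, 3.8, 4.4, 3.9, 4.2, 3.7, 4.0,   # most recent week
]

baseline, recent = daily_mobility[:-7], daily_mobility[-7:]
z = (mean(recent) - mean(baseline)) / stdev(baseline)

# Flag when the recent week sits far below the personal baseline.
if z < -2.0:
    print(f"Sustained drop detected (z = {z:.1f}); suggest an early check-in")
```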

DR. TOROUS: On the research front, AI can help us untangle some of the complexities of the brain and work toward understanding these illnesses better, which can help us offer new, effective treatments. We can generate a vast amount of data about the brain from genetics, neuroimaging, cognitive assessments and now even smartphone signals. We can use AI to find patterns that may explain why people develop mental illness, who responds best to certain treatments and who may need help immediately. Combining this new data with AI will likely open the way to new personalized and even preventive treatments.

WSJ: Do you think automated programs that use AI-driven chatbots are an alternative to therapy?

DR. TOROUS: In a recent paper I co-authored, we looked at the more recent chatbot literature to see what the evidence says about what they really do. Overall, it was clear that while the idea is exciting, we are not yet seeing evidence matching marketing claims. Many of the studies have problems. They are small. They are difficult to generalize to patients with mental illness. They look at feasibility outcomes instead of clinical-improvement endpoints. And many studies do not feature a control group to compare results.

DR. MINER: I don’t think it is an “us vs. them, human vs. AI” situation with chatbots. The important backdrop is that we, as a community, understand we have real access issues and some people might not be ready or able to get help from a human. If chatbots prove safe and effective, we could see a world where patients access treatment and decide if and when they want another person involved. Clinicians would be able to spend time where they are most useful and wanted.

WSJ: Are there cases where AI is more accurate or better than human psychologists, therapists or psychiatrists?

DR. IMEL: Right now, it’s pretty hard to imagine replacing human therapists. Conversational AI is not good at things we take for granted in human conversation, like remembering what was said 10 minutes ago or last week and responding appropriately.

DR. MINER: This is certainly where there is both excitement and frustration. I can’t remember what I had for lunch three days ago, and an AI system can recall all of Wikipedia in seconds. For raw processing power and memory, it isn’t even a contest between humans and AI systems. However, Dr. Imel’s point about conversation is crucial: things humans do without effort when talking to each other are currently beyond even the most powerful AI systems.

An AI system that is always available and can hold thousands of simple conversations at the same time may create better access, but the quality of the conversations may suffer. This is why companies and researchers are looking at AI-human collaboration as a reasonable next step.

DR. IMEL: For example, studies show AI can help “rewrite” text statements to be more empathic. The AI isn’t writing the statement, but it is trained to help a potential listener tweak it.

WSJ: As the technology improves, do you see chatbots or smartphone apps siphoning off any patients who might otherwise seek help from therapists?

DR. TOROUS: As more people use apps as an introduction to care, it will likely increase awareness of and interest in mental health, along with the demand for in-person care. I have not met a single therapist or psychiatrist who is worried about losing business to apps; rather, app companies are trying to hire more therapists and psychiatrists to meet the rising need for clinicians supporting apps.

DR. IMEL: Mental-health treatment has a lot in common with teaching. Yes, there are things technology can do to standardise skill building and increase access, but as parents have learned in the last year, there is no replacing what a teacher does. Humans are imperfect; we get tired and are inconsistent, but we are pretty good at connecting with other humans. The future of technology in mental health is not about replacing humans; it’s about supporting them.

WSJ: What about schools or companies using apps in situations when they might otherwise hire human therapists?

DR. MINER: One challenge we are facing is that the deployment of apps in schools and at work often lacks the rigorous evaluation we expect in other types of medical interventions. Because apps can be developed and deployed so quickly, and their content can change rapidly, prior approaches to quality assessment, such as multiyear randomized trials, are not feasible if we are to keep up with the volume and speed of app development.

Judgment calls

WSJ: Can AI be used for diagnoses and interventions?

DR. IMEL: I might be a bit of a downer here—building AI to replace current diagnostic practices in mental health is challenging. Determining if someone meets criteria for major depression right now is nothing like finding a tumour in a CT scan—something that is expensive, labour-intensive and prone to errors of attention, and where AI is already proving helpful. Depression is measured very well with a nine-question survey.
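The nine-question survey Dr. Imel refers to is typically the PHQ-9, and its scoring underlines his point: it is simple, transparent arithmetic rather than a problem that calls for AI. The severity bands in the sketch below are the standard published cutoffs; the example answers are invented.

```python
# PHQ-9 scoring: nine items, each rated 0-3; the total (0-27) maps to a
# standard severity band. The example answers below are made up.
def phq9_severity(item_scores: list[int]) -> tuple[int, str]:
    assert len(item_scores) == 9 and all(0 <= s <= 3 for s in item_scores)
    total = sum(item_scores)
    bands = [(4, "minimal"), (9, "mild"), (14, "moderate"),
             (19, "moderately severe"), (27, "severe")]
    label = next(name for cutoff, name in bands if total <= cutoff)
    return total, label

print(phq9_severity([2, 1, 3, 2, 1, 2, 1, 0, 1]))  # -> (13, 'moderate')
```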

DR. MINER: I agree that diagnosis and treatment are so nuanced that AI has a long way to go before taking over those tasks from a human.

Through sensors, AI can measure symptoms, like sleep disturbances, pressured speech or other changes in behaviour. However, it is unclear if these measurements fully capture the nuance, judgment and context of human decision making. An AI system may capture a person’s voice and movement, which is likely related to a diagnosis like major depressive disorder. But without more context and judgment, crucial information can be left out. This is especially important when there are cultural differences that could account for diagnosis-relevant behaviour.

Ensuring new technologies are designed with awareness of cultural differences in normative language or behaviour is crucial to engender trust in groups who have been marginalised based on race, age, or other identities.

WSJ: Is privacy also a concern?

DR. MINER: We’ve developed laws over the years to protect mental-health conversations between humans. As apps or other services start asking to be a part of these conversations, users should be able to expect transparency about how their personal experiences will be used and shared.

DR. TOROUS: In prior research, our team identified smartphone apps [used for depression and smoking cessation that] shared data with commercial entities. This is a red flag that the industry needs to pause and change course. Without trust, it is not possible to offer effective mental health care.

DR. MINER: We undervalue trust, and design poorly for it, in AI for healthcare, especially mental health. Medicine has developed processes and policies to engender trust, and AI systems are likely following different rules. The first step is to clarify what matters to patients and clinicians in terms of how information is captured and shared for sensitive disclosures.

 

Reprinted by permission of The Wall Street Journal, Copyright 2021 Dow Jones & Company, Inc. All Rights Reserved Worldwide. Original date of publication: March 27, 2021.



The Uglification of Everything

Artistic culture has taken a repulsive turn. It speaks of a society that hates itself, and hates life.

By Peggy Noonan
Fri, Apr 26, 2024 | 5 min read

I wish to protest the current ugliness. I see it as a continuing trend, “the uglification of everything.” It is coming out of our culture with picked-up speed, and from many media silos, and I don’t like it.

You remember the 1999 movie “The Talented Mr. Ripley,” from the Patricia Highsmith novel. It was fabulous—mysteries, murders, a sociopath scheming his way among high-class expats on the Italian Riviera. The laid-back glamour of Jude Law, the Grace Kelly-ness of Gwyneth Paltrow, who looks like a Vogue magazine cover decided to take a stroll through the streets of 1950s Venice, the truly brilliant acting of Matt Damon, who is so well-liked by audiences I’m not sure we notice anymore what a great actor he is. The director, Anthony Minghella, deliberately showed you pretty shiny things while taking you on a journey to a heart of darkness.

There’s a new version, a streaming series from Netflix, called “Ripley.” I turned to it eagerly and watched with puzzlement. It is unrelievedly ugly. Grimy, gloomy, grim. Tom Ripley is now charmless, a pale and watchful slug slithering through ancient rooms. He isn’t bright, eager, endearing, only predatory. No one would want to know him! Which makes the story make no sense. Again, Ripley is a sociopath, but few could tell because he seemed so sweet and easy. In the original movie, Philip Seymour Hoffman has an unforgettable turn as a jazz-loving, prep-schooled, in-crowd snob. In this version that character is mirthless, genderless, hidden. No one would want to know him either. Marge, the Paltrow role in the movie, is ponderous and plain, like a lost 1970s hippie, which undercuts a small part of the tragedy: Why is the lovely woman so in love with a careless idler who loves no one?

The ugliness seemed a deliberate artistic decision, as did the air of constant menace, as if we all know life is never nice.

I go to the No. 1 program on Netflix this week, “Baby Reindeer.” People speak highly of it. It’s about a stalker and is based on a true story, but she’s stalking a comic so this might be fun. Oh dear, no. It is again unrelievedly bleak. Life is low, plain and homely. No one is ever nice or kind; all human conversation is opaque and halting; work colleagues are cruel and loud. Everyone is emotionally incapable and dumb. No one laughs except for the morbidly obese stalker, who cackles madly. The only attractive person is the transgender girlfriend, who has a pretty smile and smiles a lot, but cries a lot too and is vengeful.

Good drama always makes you think. I thought: Do I want to continue living?

I go to the Daily Mail website, once my guilty pleasure. High jinks of the rich and famous, randy royals, fast cars and movie stars, models and rock stars caught in the drug bust. It was great! But it seems to have taken a turn and is now more about crime, grime, human sadness and degradation—child abuse, mothers drowning their babies, “Man murders family, self.” It is less a portal into life’s mindless, undeserved beauty than a testimony to its horrors.

I go to the new “Cabaret.” Who doesn’t love “Cabaret”? It is dark, witty, painful, glamorous. The music and lyrics have stood the test of time. The story’s backdrop: The soft decadence of Weimar is being replaced by the hard decadence of Nazism.

It is Kander and Ebb’s masterpiece, revived again and again. And this revival is hideous. It is ugly, bizarre, inartistic, fundamentally stupid. Also obscene but in a purposeless way, without meaning.

I had the distinct feeling the producers take their audience to be distracted dopamine addicts with fractured attention spans and no ability to follow a story. They also seemed to have no faith in the story itself, so they went with endless pyrotechnics. This is “Cabaret” for the empty-headed. Everyone screams. The songs are slowed, because you might need a moment to take it in. Almost everyone on stage is weirdly hunched, like a gargoyle, everyone overacts, and all of it is without art.

On the way in, staffers put stickers on the cameras of your phone, “to protect our intellectual property,” as one said.

It isn’t an easy job to make the widely admired Eddie Redmayne unappealing, but by God they did it. As he’s a producer I guess he did it, too. He takes the stage as the Emcee in a purple leather skirt with a small green cone on his head and appears further on as a clown with a machine gun and a weird goth devil. It is all so childish, so plonkingly empty.

Here is something sad about modern artists: They are held back by a lack of limits.

Bob Fosse, the director of the classic 1972 movie version, got to push against society’s limits and Broadway’s and Hollywood’s prohibitions. He pushed hard against what was pushing him, which caused friction; in the heat of that came art. Directors and writers now have nothing to push against because there are no rules or cultural prohibitions, so there’s no friction, everything is left cold, and the art turns in on itself and becomes merely weird.

Fosse famously loved women. No one loves women in this show. When we meet Sally Bowles, in the kind of dress a little girl might put on a doll, with heavy leather boots and harsh, garish makeup, the character doesn’t flirt, doesn’t seduce or charm. She barks and screams, angrily.

Really it is harrowing. At one point Mr. Redmayne dances with a toilet plunger, and a loaf of Italian bread is inserted into and removed from his anal cavity. I mentioned this to my friend, who asked if I saw the dancer in the corner masturbating with a copy of what appeared to be “Mein Kampf.”

That’s what I call intellectual property!

In previous iterations the Kit Kat Club was a hypocrisy-free zone, a place of no boundaries, until the bad guys came and it wasn’t. I’m sure the director and producers met in the planning stage and used words like “breakthrough” and “a ‘Cabaret’ for today,” and “we don’t hide the coming cruelty.” But they do hide it by making everything, beginning to end, lifeless and grotesque. No innocence is traduced because no innocence exists.

How could a show be so frantic and outlandish and still be so tedious? It’s almost an achievement.

And for all that there is something smug about it, as if they’re looking down from some great, unearned height.

I left thinking, as I often do now on seeing something made ugly: This is what purgatory is going to be like. And then, no, this is what hell is going to be like—the cackling stalker, the pale sociopath, Eddie Redmayne dancing with a plunger.

Why does it all bother me?

Because even though it isn’t new, uglification is rising and spreading as an artistic attitude, and it can’t be good for us. Because it speaks of self-hatred, and a society that hates itself, and hates life, won’t last. Because it gives those who are young nothing to love and feel soft about. Because we need beauty to keep our morale up.

Because life isn’t merde, in spite of what our entertainment geniuses say.

 
