Thursday, May 25, 2023

Therapy via Artificial Intelligence

Your therapist will see you now.

For many people those are comforting words. You are about to engage with a human being who cares about you, who will let you say what you want, and who will presumably help you get better.

But what if your therapist is a chatbot? How much help can a computer program provide?

You might think that it’s all the same. Yet an artificial therapist would seem limited in its capacity to help you sort out your feelings. And what will happen when you discover that your therapist is a program?

You will be happy to hear that it has been tried before, at MIT. One professor suggested that AI would never be a very good therapist. In this case, the therapist was called ELIZA:

Decades ago, Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology and considered one of the fathers of artificial intelligence, predicted AI would never make a good therapist, though it could be made to sound like one. In fact, his original AI program, created in the 1960s, was a psychotherapist named ELIZA, which used word and pattern recognition combined with natural language programming to sound like a therapist:

Woman: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?
Woman: He says I’m depressed much of the time.
ELIZA: I am sorry to hear that you are depressed.
Woman: It’s true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy?
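For the curious, here is a minimal sketch in Python of the pattern-and-substitution trick this kind of exchange rests on. The rules, pronoun table, and wording below are invented for illustration; Weizenbaum’s original script was considerably more elaborate.

import re

# Illustrative rules only; ELIZA's actual script was far richer.
# Each pair maps a regular expression to a response template, and the
# captured fragment is reflected back with its pronouns swapped.
RULES = [
    (re.compile(r"my (.+) made me come here", re.I),
     "Your {0} made you come here?"),
    (re.compile(r"\bi am (.+)", re.I),
     "I am sorry to hear that you are {0}."),
]

# First-person words are flipped so the echo sounds natural.
PRONOUNS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(fragment):
    # Swap first-person words for second-person ones.
    return " ".join(PRONOUNS.get(word.lower(), word) for word in fragment.split())

def respond(statement):
    # Echo the first matching pattern back; fall back to a stock prompt.
    statement = statement.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."

print(respond("Well, my boyfriend made me come here."))  # Your boyfriend made you come here?
print(respond("I am unhappy."))                          # I am sorry to hear that you are unhappy.

The striking thing is how little machinery the exchange requires: match a pattern, flip the pronouns, and echo the patient’s own phrase back as a question.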

As it happened, ELIZA sounded like a typical therapist. And yet it was a simulation. AI therapists still had a long way to go; their patients missed the human element:

“The core tenet of medicine is that it’s a relationship between human and human — and AI can’t love,” said Bon Ku, director of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation. “I have a human therapist, and that will never be replaced by AI.”

Or else, never say never. Surely cognitive treatment, being more rigorous and apparently scripted, might yield some space to the bots. Besides, the experiment took place decades ago, and AI has advanced since then.

While some mental health apps may ultimately prove worthy, there is evidence that others can do harm. One researcher noted that users faulted these apps for their “scripted nature and lack of adaptability beyond textbook cases of mild anxiety and depression.”

Of course, the real issue is the nation’s inability to provide effective mental health counseling. Insurance companies are hard at work trying to replace substandard therapists with substandard AI programs:

AI may certainly improve as the years go by, but at this point, for insurers to claim that providing access to an app is anything close to meeting the mental health parity requirement is woefully premature.

I am happy to invite you to subscribe to my Substack.

 

2 comments:

  1. The best therapy is to do something outdoors and get yourself to a good Baptist church. Also, if you can manage it, get yourself a dog.

  2. Weizenbaum said that his secretary, even though she knew ELIZA was just a piece of software, and a pretty simple one at that, asked to be left alone with it to discuss some personal problems.
