Wednesday, December 6, 2017

Your Robotic Therapist

Are therapists about to be replaced by robots? Will artificial intelligence prove to be a superior therapeutic instrument to the empathy on offer from credentialed, licensed practitioners? Come to think of it, how many years of advanced professional training does it take to ask: How does that make you feel? And how much training does it take to recognize that the standard therapist question allows the therapist to disengage from the patient, while forcing the patient to get lost in his mind?

The questions might not brighten up your day, but I find considerable amusement in the notion that a robot can do as good a job as, if not a better job than, a real live therapist. I trust that we all take the results with a few grains of skepticism, but still, the experiments show us that the average therapist is not addressing serious issues and is not engaging with his or her patients.

The Washington Post offers a sample therapeutic exchange:

My therapist wanted to explain a few things during our first online session:

“I'm going to check in with you at random times. If you can't respond straight away, don't sweat it. Just come back to me when you're ready. I'll check in daily.”

“Daily?” I asked.

“Yup! It shouldn't take longer than a couple minutes. Can you handle that?”

“Yes, I can,” I answered.

There was a little more back-and-forth, all via Messenger, then this statement from my therapist:

“This might surprise you, but ... I am a robot.”

It wasn't a surprise, of course. I'd downloaded “Woebot,” a chatbot recently created by researchers, and it was trying to establish our therapeutic relationship.

“Part of the value of Woebot is you can get things off your chest without worrying what the other person thinks, without that fear of judgment,” said Alison Darcy, founder and chief executive of Woebot Labs. “We wanted it to make an emotional connection.”

Now, we are supposed to think that it’s therapeutic to make an emotional connection with a robot. But this really seems to tell us that today’s therapists are robotic in their approach, that they do not know how to engage with their patients. It’s not so much that the robots mimic human conversation—they don’t—but that therapists are functioning like robots:

Convenient, easy to use and anonymous, these chatbots are programmed to mimic human conversation and decision-making and primarily give advice, offer self-help guidance and companionship.

It turns out that it’s all very superficial.

“These things can work well on a superficial level with superficial conversations,” Torous said. “[But] are they effective tools, do they change outcomes and do they deliver more efficient care? It's still early.”

Of course, the robots are not real people. They are fake people. And they cannot, I presume, go into detail about your specific issues and problems. They cannot guide you through a difficult life situation because they do not know the specific reality of your life. They know how to talk feelings, even though they have no feelings themselves.
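For anyone curious about the machinery behind that kind of talk, the basic technique has not changed much in spirit since Weizenbaum's ELIZA, which a commenter mentions below: match a keyword, fill in a canned template. The sketch below is a hypothetical Python illustration of that general pattern-matching approach, not Woebot's actual code; the rules and phrasings are invented for the example.

```python
import random
import re

# A minimal, ELIZA-style responder: keyword patterns mapped to canned templates.
# Illustrative sketch only; the rules below are invented, not Woebot's implementation.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.+)", re.I),
     ["How does being {0} make you feel?"]),
    (re.compile(r"\bbecause (.+)", re.I),
     ["Is that the real reason?"]),
]
FALLBACKS = ["Tell me more.", "How does that make you feel?"]

def reply(message: str) -> str:
    """Return a templated response based on simple keyword matching."""
    for pattern, templates in RULES:
        match = pattern.search(message)
        if match:
            # Drop trailing punctuation before slotting the phrase into a template.
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(reply("I feel stuck at work"))  # e.g. "Why do you feel stuck at work?"
    print(reply("It's complicated"))      # e.g. "How does that make you feel?"
```

A loop like that can sound attentive for an exchange or two, which is more or less what the Torous quote above concedes: it works on a superficial level, with superficial conversations.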

I cannot imagine a more effective demonstration of the superficiality of most of today’s therapy, of therapists’ failure to engage with their patients, and of their failure to help their patients to learn how better to conduct their lives.

5 comments:

  1. By what fraudulent means does a patient come to hire lines of code to provide therapy? What was on the bot's CV?

    And how long can this charade possibly last? For that matter, how long does it take for anyone reading this blog to spot the bot-responder? Maybe two postings?

  2. Weizenbaum's ELIZA (1964-66) was the first.

  3. ELIZA... Weizenbaum remarked that his secretary asked to be left alone with it to discuss some things, even though she knew very well what it actually was.

  4. Just another Google, Inc. population tracking/control mechanism.


  5. We're certainly in the infancy of some future "virtual assistant" which, in addition to keeping your calendar, might offer psychologically helpful feedback as well.

    I've long used journal writing to help me pay attention and reflect, but I suppose there can be periods of days, weeks, or months where I don't do any serious journal writing. I would say I'm more likely to fall into denial over things I don't want to deal with when I'm not writing regularly. Often my journal writing overlaps with my to-do lists, to get me to deal with things.

    But I'm unsure how someone else listening would help. Perhaps writing forces me to hear myself, and if I make an intuitive assertion (i.e., an assumption) about something, I recognize it as something that may be true or false, and if it might be false, I'd better find a way to find out before I bother anyone else over it. Certainly a therapist couldn't tell me if I have my facts wrong if he doesn't know either.

    I suppose a live therapist or coach could at least ask me what I'm going to do about some situation, and if I know he's going to ask later whether I followed through, then, to avoid embarrassment, I might be more likely to at least put in an honest effort. I wonder if a "Bot" will remember what I say and ask about it later? I wonder if I'd be equally embarrassed in front of a "Bot" if I had to confess that I wasn't walking my talk?
