
2 Ways AI Fails at Therapy, and Why You Should Care

Corporations are using AI to replace humans at the expense of good care.

Key points

  • AI in mental health care offers some promise, but it lacks human creativity and fails with empathy-based care.
  • A therapist must first develop trust with a client through empathy in order to be effective.
  • Lack of shared empathy can make an interaction with AI feel inhuman and disordered.

The Writers Guild of America (WGA) screenwriters strike is urgently drawing our attention to the challenge AI poses to the human experience of writing. Can AI replace human writers?

And now, as NPR's Wells reports, the National Eating Disorders Association has fired its national helpline staff and replaced them with a chatbot.1 The timing of this move says it all: Staff members were fired after unionizing.

One helpline staffer told Wells, “I was able to set (a caller) up with some treatment options and, you know, talk her into believing that this is real, and this is important.” A chatbot could never offer such care to a human, since the staffer was able to use empathy in her intervention.

As a therapist, I have watched this conflict heat up in mental health: Can AI be a good therapist? I suggest that the answer, as with screenwriting, comes back to two fundamental human traits that cannot be simulated by a digital mind: creativity and empathy.

These two traits offer a useful frame for the AI debate: AI lacks creativity and empathy because it can only ever rearrange the information it can access, in ways that may feel real at first but quickly turn lifeless and unhelpful.

My work as a psychotherapist is by definition a creative act based in empathy. When a client comes to work with me, I help them develop trust through empathy—the feeling that I care. As the therapy goes on, through trial and error, I use my imagination to translate my client’s emotions, thoughts, history, and actions into new and different words, images, and metaphors. These help my client to see their troubles from a new perspective. The German psychoanalyst Rainer M. Holm-Hadulla called this translation process between therapist and client “creative shaping.”

Creative shaping involves empathetic improvisation, combined with creative invention. Creative invention means including a new symbol in the dialogue, such as an unexpected but useful word or image. As the psychoanalyst Adam Phillips has said, therapeutic change sometimes means finding a new word for something.

For example, if a client comes to me sharing their recurring dreams about being inside a house, I may improvise and work with the metaphor of “house” as a way of understanding their childhood trauma. Which rooms in the house are scary? Which are safe? Who is in them? My client and I start with their symptoms, anxieties, and fears, then creatively shape them into an image system we then use as a tool. Through a creative exchange based on trust, we bring new images into each other’s minds, adding new characters to the house. We may use anyone from Fred Rogers to characters from Succession or whatever we find useful. We are on a creative journey together.

Without the empathetic relationship we had built together over time, this kind of play wouldn’t hold lively excitement and meaningfulness.

While an AI could introduce new metaphor families into a therapeutic discussion, it couldn't do so in a way that resonates personally with the client. AI can only introduce new terms at random or rearrange information already fed to it. In this way, ChatGPT is not much different from the original Eliza therapy computer program from the mid-1960s. When Eliza was introduced, a staffer was so excited by it that she spent time venting to it while it repeated certain language arrangements back to her. (You can try it yourself.2) Such an initial thrill would certainly turn flat once the pattern of the program gave itself away through repetition. The privacy of talking to a computer would soon be replaced by the feeling of speaking alone in a room. This is why AI in health and mental health care offers promise in terms of efficiency and managing information but fails with empathy-based care.

You may think the argument that AI can replace a screenwriter is easier to make. After all, every writer learns to work with inspiration, or to “steal” from favorite and famous authors, just as an AI pulls material from the internet and rearranges it. With genre writing especially, the writer must work with a formula and a set of rules and tropes the audience expects; hit all of those beats, and you have a screenplay. But writing is more than simply rearranging tropes and following rules. Any genre writer will tell you it's about creatively shaping those conventions, rules, and what came before into familiar but new and fresh material. While AI offers an initial thrill of “I can't believe what it can do!” that thrill is soon overshadowed by bizarre repetitions and word arrangements that feel inhuman and disordered. Again, the thrill of a digital mind quickly turns emotionally flat when there is no shared empathy.

Many of the eating disorder helpline workers have themselves struggled with an eating disorder, Wells points out, which adds a crucial level of empathy when engaging callers. “When you know what it's been like for you and you know that feeling, you can connect with others,” staffer Abbie Harper told Wells.

Whenever you try to get an AI to replace creativity and empathy, there’s a hangover of emptiness. To promote the idea that AI can replace human practice is to cynically imply that people don’t feel deeply and don’t need to feel deeply at a relational level.

On the part of tech companies, this leads to a cynical question: Will people notice, or care about what they’re missing? And the threat of an even more cynical response: It won’t matter whether people notice or care. They will get what they get. The conflict may come down to what corporate boards are able to get away with.

To find a therapist, please visit the Psychology Today Therapy Directory.

References

1. https://www.npr.org/2023/05/24/1177847298/can-a-chatbot-help-people-wit…

2. https://psych.fullerton.edu/mbirnbaum/psych101/eliza.htm
