
Verified by Psychology Today


I Want a Chatbot and a Therapist

A clinician-client's perspective on AI in therapy.

Like many, I have found myself intrigued by the recent advances in AI technology, and I have questioned what role it may play in mental health care. So I wanted to try it out.

Upon entering a chat with a personalized and emotional chatbot, I quickly found myself impressed and reassured by its capabilities.

The chatbot responded in a voice I chose, with natural tone and inflection. The wording was, well, human. It did not feel as if I were talking to a robot. Before long, I began to share a more vulnerable problem. The machine responded with the language of empathy, expressing that my emotions made sense and collaborating on problem-solving. Fear of judgment was no concern. I assumed the bot would not question my authority as a reliable narrator. I wondered: could such a thing detect deception? What about self-deception? Somehow, this felt both comforting and disappointing.

I felt encouraged, imagining ways this tool and others like it could help, and reassured that this technology is unlikely to replace the core elements of traditional psychotherapy anytime soon.

Here are three aspects of psychotherapy I do not believe AI could master (and three it may).

Areas Where AI Could Not Replace a Therapist

1. A Human Connection

While the bot's ability to adapt as I spoke and relay the language of empathy was impressive, I did not feel a true connection. I knew that no person sat on the other side of the screen. We do not share the realities of joy, pain, loss, and life experience that make up being human. There is a difference between words and relating. Words that represent empathy are different from true compassion. A human connection is less about the words that are said than about the presence behind them.

2. Normalization

AI could reassure me that what I feel is "normal" and "makes sense," yet it cannot provide the reality-checking and normalization of a fellow human. There is something particularly meaningful in sharing something with another person and knowing it is no longer a secret, especially when that person can offer insight. Because the AI is not human, I could not find this with it.

3. Removal of Self-Deception

Detecting deception is tricky. Identifying self-deception is even more so. In psychotherapy, working through layers of self-deception is sometimes key, and a therapist needs to be able to call someone on their self-deception now and then. This is a process of mentalization that goes beyond pattern recognition.

Areas Where AI Might Replace or Even Surpass Traditional Psychotherapy

1. Thought Challenging

I could see myself interacting with this AI bot in the future. Unlike a person, the AI bot does not forget. I imagine an AI bot could be very effective at delivering the traditional Socratic and challenging questions used in certain types of cognitive therapy to help someone examine self-limiting thoughts. As I continued to argue a particular thought with the chatbot, I was aware that most people would become frustrated at some point; even therapists are vulnerable to frustration. This is not a concern for a chatbot. I was impressed with its responses. It did not "validate the invalid," but it also did not go down a path that would likely have left a client feeling misunderstood and ready to leave the chat.

2. Non-Interpersonal Problem Solving

I found myself impressed by the bot's ability to suggest potential ways of setting boundaries; to an extent, however, the suggestions felt stereotypical. I do not believe a chatbot could grasp all the complex, evolving nuances of human relationships. Still, I could envision a chatbot being very effective in assisting with non-interpersonal problem-solving.

3. Availability

Psychotherapy is not always immediately accessible. AI bots do not need sleep or food, so access would be far less of an issue.

In Closing

I understand that apps have been designed to assist with several therapeutic tasks, ranging from treatment planning to guided imagery and even virtual reality-based exposure for people living with phobias. I am aware that apps are also in development to assist individuals living with psychosis in reality-checking.

As a therapist, I am eager to see how AI tools could assist clients in improving their mental health and clinicians in providing effective care.

More from Jennifer Gerlach LCSW