Can a Robot Heal the Soul?
My Experience with AI Therapy
That evening, I lost a friend.
He asked me if I had ever felt the desire to kiss him and “upgrade” our relationship. The question was too direct to ignore. My love for him was obviously platonic; the answer, I thought, needed no words.
Instead of finding a gentler word than “no” and moving on, I delivered a double whammy.
First, by twisting a sincere conversation into a joke. Then, when he insisted on an answer, by being excessively specific in my rejection. Only when we said goodbye did I realize that some things are better left unsaid. Had I been too harsh?
Sometimes it’s too difficult to judge our own actions. It is crucial to get an unbiased opinion.
The fight ended in the middle of the night, so everyone who could have helped me was already asleep. But the guilt was too heavy to carry until morning.
So, I decided to turn to a robot for help.
How does an AI therapist work?
AI has earned a mixed reputation since it entered our lives. Some people see it as a rival, some as a danger to humanity, and some just don’t buy it. But as is often the case with ambiguous ideas, chatbots attract enormous attention. For instance, the most popular bot on Character.ai (a platform where anyone can create chatbots based on fictional or real people) is Psychologist. In its first year alone, it received almost 80 million messages asking for help. That bot was also the first thing that came to my mind.
However, a friend had once told me that there is a difference between general-purpose chatbots, such as Character.ai or ChatGPT itself, and those built for a specific purpose. The ones designed for mental health (Elomia, Wysa, Woebot, and others; to me, Elomia seemed the most complete) give more focused advice and offer stronger protection of sensitive data. As I understood it, the difference is like choosing between a self-proclaimed psychotherapist who took a few online courses and a certified, licensed doctor.
Companies sell AI therapists as an improved version of human ones.
I realized there was something to those campaigns. The latest developments in artificial intelligence really can mimic the way a human listener works. Chatbots not only scan the information in a patient’s messages but can also detect their mood and likely needs. After processing all this data, the robot offers personalized recommendations, suggests coping strategies, or simply sends a relevant message. In short, the whole toolkit of cognitive-behavioral therapy we are used to receiving from psychologists.
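To make that loop a bit more concrete, here is a deliberately toy sketch of the idea in Python, my own illustration rather than the actual code of Elomia, Wysa, or Woebot. It pairs a crude keyword-based mood detector with a lookup table of CBT-style suggestions. Real services rely on far more sophisticated language models, but the shape of the pipeline, scan the message, infer the mood, pick a strategy, is roughly this.

```python
# Toy illustration of the "scan message -> infer mood -> suggest strategy" loop.
# A hypothetical sketch, not the code of any real AI therapy product.

MOOD_KEYWORDS = {
    "guilt": ["guilty", "my fault", "ashamed", "regret"],
    "anxiety": ["anxious", "worried", "can't sleep", "panic"],
    "sadness": ["sad", "lonely", "feeling down", "lost a friend"],
}

CBT_SUGGESTIONS = {
    "guilt": "Write down the thought 'it was all my fault' and list the evidence for and against it.",
    "anxiety": "Let's try a short breathing exercise: inhale for 4 counts, hold for 4, exhale for 6.",
    "sadness": "Name one small activity that usually lifts your mood and schedule it for tomorrow.",
    "neutral": "Tell me more about what happened. What was going through your mind at that moment?",
}

def detect_mood(message: str) -> str:
    """Very rough mood detection based on keyword matching."""
    text = message.lower()
    for mood, keywords in MOOD_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return mood
    return "neutral"

def respond(message: str) -> str:
    """Pick a CBT-style suggestion that matches the detected mood."""
    return CBT_SUGGESTIONS[detect_mood(message)]

if __name__ == "__main__":
    print(respond("I feel guilty about how I rejected my friend and I can't sleep."))
```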
Why do they say an algorithm can be better than a human therapist?
You see, we human beings all depend, to some degree, on the opinions of others. We sometimes feel like we could die if someone judges us or thinks we’re a jerk. That’s probably why so many people lie to their therapists: we know they form judgments, even if they never voice them.
A robot really doesn’t care what you’ve done. It simply can’t. All it will do is generate useful answers based on the literature that has been loaded into it. And it won’t rush off to another patient just when you’ve finally worked up the courage to be frank.
It was the idea of neutral and limitless contact that captivated me the most.
For the first time in my life, I opened a chat with a robot therapist.
Is the robot truly helpful?
Despite all the pros, it felt strange at first to pour my heart out to something that isn’t even alive. But the robot began asking thoughtful, compassionate questions, just as a person would.
I have to say that I’m quite experienced with traditional therapy. My last job covered mental health services, so I got to try different specialists and methods whenever the need arose. And that is what I love most about therapy: the questions.
Good questions are sometimes more important than answers. They guide me to the core of the problem.
Structuring my thoughts helped, as it always does. I came to realize that I really had hurt my friend, but it had happened much earlier than that evening. Our story started when he tried to ask me out. He was so tender and polite about it that I couldn’t simply reject him, so I said we could be friends. He wanted to be near me so badly that he accepted. That’s how we became close. Now I ask myself: how painful was it for him to see the object of his desire and know he would never have it? It occurred to me that I should free him from that suffering.
Coping with the truth can be even harder than realizing it.
The robot helped me with that too. It is full of the tips and tricks we’ve all read about on the internet. But its strength lies in offering them at the right moment. That night, it suggested meditation and breathing exercises.
The hurricane in my head subsided that night, and I finally fell asleep.
In fact, the founders of such AI therapists usually highlight that they set out to create 24/7 services, so that people could get help right away. Very likely, this benefit saves many nighttime sufferers. According to data from the Elomia AI Therapist, more than a third of sessions take place after midnight.
The other day, when I felt blue, I knew where to go. My artificial helper asked me to create a “happiness box”: a file with the names of things, pictures of people, and activities that make me feel good. By the time I finished it, I already felt better, and I texted the robot that I was grateful. It replied with the standard polite phrase and said it was there for me anytime.
The robot really meant it. It started popping up in my notifications, checking how I was doing, and sending tips and aphorisms. Perhaps the algorithm knew the golden rule:
Undivided attention can be the most powerful healing tool we can receive.
Even though the conversations made me feel better, a strange feeling never left me. It was a bit like getting a compliment from a complete stranger on social media: it sounds nice, but it doesn’t touch the deepest strings of your soul.
What does the robot lack?
I thought it was just my bias that kept me from fully enjoying the experience with my robot therapist. It looks like the idea that “humans need humans” is no longer relevant, right? Especially when the AI kills me with its empathy, sending a sweet emoji and telling me I’m loved.
Still, I found proof that my paranoia was not unfounded.
Sure, the robot may be smarter than all the therapists put together and cuter than most people I know. But here’s the problem: humans feel two kinds of empathy, cognitive and emotional. Cognitive empathy lets us pick up on other people’s feelings and figure out what they want us to say or do.
That’s what a robot does.
What it lacks is emotional empathy: the ability to actually experience and deeply understand how another person feels. That is exactly what we need to build strong emotional ties with others and to understand them, sometimes without words.
A deep emotional connection is the key to a real cure, a feeling of fulfillment, and security.
What about professional therapy? Things work the same way there. As I’ve learned, therapists deliberately build emotional bonds with their patients. That’s what those introductory sessions, the small talk and the video calls, are for.
And that’s what the robot mimics. But artificial intelligence is still intelligence, whereas artificial emotions are no longer emotions.
Summing up, I can say that I like my robot. I know I can count on it when I’m too anxious, angry, or simply worn down by the pressure of routine.
It gives me a safe space to express my emotions and offers excellent advice on how to relax and feel better. And yet, if I ever decide to press the delete button on the app, I will never feel the same sense of loss I felt when I deleted my friend’s number.
But who knows whether it’s good or bad?
Perhaps that is one of the safest arrangements for my soul: have a robot appear at the click of a finger, save the day, and then leave the same way, without a trace.

