AI "Therapy" Can't Be Actual Therapy
...And it shouldn't be.
In a world where some people would be happy to see everything run by a computer, the reality is that not everything can be, or even should be.
I've seen a number of stories in the past couple of years about attempts to create AI therapists. There was even a story about an unethical situation in which a company paired people with AI therapists, without their knowledge, in order to test its model. Within a few months, everyone paired with an AI therapist had left their "therapy," sensing that something was off about the interactions they were having with what turned out to be a computer program and not a person.
While there are a whole host of problems with the idea of AI providing therapy, one that cannot be overcome is the issue of empathy.
Why is empathy such a deal-breaker?
Empathy arguably plays one of the most important roles (if not the most important role) in how people connect. And when I say empathy, I'm not simply referring to empathic comments, such as, "It sounds like you've had a stressful day." One can make empathic comments without them actually coming from a place of empathy. And one can also relay empathy without making empathic-sounding comments at all.
Actual empathy comes from a deeper, often unconscious place within ourselves that can only be experienced between two people, whether or not words are even spoken. This is an important point: words are not needed to relay empathy.
Words without the underlying empathic sentiment are generally experienced by the other person as empty, or void of emotion, and usually cannot land or be taken in appropriately, if at all. Imagine a moment in your life when someone made an empathic-sounding comment that you felt, within yourself, was not actually connected to real emotion, or felt misaligned. Such moments can feel uncomfortable, dissonant, or even manipulative. The feeling is off; it's not a good feeling, and it makes people squirm and want to turn away.
While words are, of course, important to some degree, it is the underlying empathy that lays the foundation for the words to land. Empathic communication is often unconscious and unspoken, not only in therapy but also between people in life. Underlying (unspoken) empathy is instrumental in how we hear, how we take in, how we sit with, and how we form a genuine connection with another person.
With this in mind, it is simply not possible for a computer program to relay empathy. Empathy is a phenomenon that is created between two humans interacting in a shared space together. This happens whether in person or virtually. As long as there are two people interacting in real time, empathy has room to exist. (However, it does not work the same way with text or email, which is why people often misinterpret the tone of written communication. The other person isn't there to detect the underlying emotion, so readers end up projecting their own experiences onto these moments.)
If you are interacting with another person, some sort of underlying empathy or regard is at play, even if you don't always consciously acknowledge or tune into it, and even if there are no spoken words. If you are attempting to interact with AI and no words are spoken, there is nothing.
Even if AI were to speak to you using empathic-sounding recordings of human voices, it still could not create and relay empathy. Why? Because even an empathic human voice recording wouldn't be connecting with you in your subjective space. It cannot create a deeper, unspoken connection with where you are emotionally, mentally, or physiologically. The responses of a recording would come from a disconnected place (or from a moment of connection with someone else, at the time the recording was created), and that disconnect would be felt, even unconsciously. Even if it were possible to be "fooled" at first, the illusion would erode as the inconsistencies in connection and the lack of true emotion were experienced.
(I should also point out that it is utterly impossible for AI to provide therapy. A great deal of therapy is based on the experience of the shared space between the client and the therapist, and AI can't truly have this experience.)
Is AI "therapy" more harmful than good?
There is also a greater danger with AI "therapy" that one should consider. If you're attempting to form a connection with a computer program for an emotionally based experience, you are potentially setting yourself up for drastic consequences to your mental health. Empathy is the cornerstone of how one develops emotionally, from infancy and childhood through adulthood. This involves not only how your parents (or others) talk to you, but how they feel toward you. When there is little underlying empathy or attunement from parents or caretakers growing up, it opens the door to a range of potential mental health issues, from anxiety or depression to more severe conditions such as antisocial personality disorder (sociopathy), among others.
If you're engaging in pseudo-therapy with an AI program, the result is more likely to distort (or even mute) your own empathy and your ability to connect with others, and potentially to disconnect you internally from yourself. Even with a therapist who speaks less often, there is always an underlying feeling of empathy and regard for you. This silent, yet still highly empathic, communication is crucial not only to the child developing in the world with their parents, but to any human relationship, including the therapy relationship. So even a less talkative therapist can still relay a great deal of empathy.
In the end, an AI "therapy" program can make pre-determined comments. It can sound supportive on the surface. It may offer a coping skill. But these are merely algorithmic responses to words, or possibly to a tone or other "markers" that set off a pre-programmed response. It cannot truly know or feel you, understand or "get" you, or draw on the vast experience of a seasoned therapist who can discern what you personally need in a moment, as opposed to what someone else might need in a very similar situation. While empathy is not the only reason AI "therapy" should never happen, without the ability to truly have empathy, the entire rug is pulled out from under the therapy, or any relationship for that matter.