ChatGPT Isn’t Therapy: The Quiet Risks of Replacing Human Help with AI

If you are like me, you use AI for many things. I personally use it for practical questions like “Why has my phone frozen again?” or for hints on how to improve my YouTube channel. One thing I would never do is share any mental health issues I have and expect an answer that isn’t just what I wanted to hear. Yet a surprising number of people are doing just that, with some well-publicised and disastrous results.

Therapy (with a human) is not just a chat amongst friends or a set of coping methods. Proper therapy is a clinical process that sits inside a framework of underpinning methods, an ethical code, a professional relationship and a clear duty of care. To believe that AI can replace this is dangerous. Not because AI is evil or because it shouldn’t have some place in mental health (it could easily replace life coaching), but purely because it does not and cannot provide what therapy does, and that difference matters when people are really vulnerable.

AI is becoming increasingly efficient at producing language that sounds organized, calm, empathetic and plausible. It can offer explanations (like my frozen phone), steps, scripts and reassurance. It can help someone feel less lonely (especially as many therapists work 9 to 5 and don’t encourage contact between sessions), summarize thoughts and offer up reflection points. The problem is that usefulness can be mistaken for clinical care. In some well-publicised cases, individuals have started treating AI as a substitute for therapy. There have already been suicides linked to this, and today I read of a killing in Scandinavia where phone records suggested that AI input had made the killer paranoid about his victim. What “sounds right” is not the same as “is right”, and the more serious a person’s condition or situation, the wider the gap between the two can become.

The first distinction is relationship. Therapy is not just giving information; it is a relationship with a therapist whose job is to notice patterns, track change and remain grounded, even when the client isn’t. They remember not only what was said but also what you avoided saying, minimised, played down or were not yet ready to talk about. They notice changes in story, mood shifts and how your language holds the key to what is going on in your inner world. They notice the things that happen between you, because the relationship itself is part of the work.

AI does not relate to you. It can simulate warmth but it cannot attune to you as a person. It has no emotional stake in you, your history or a lived sense of who you are. It cannot imitate the nuances of therapy that are primarily human. It can’t sense when the process needs to slow down or when to bring challenge. It cannot repair ruptures in the process because it doesn’t experience them. It can continue to speak indefinitely based on its input without really “feeling” what is truly going on.

The second major distinction is clinical judgment. Human therapists are trained to work with uncertainty. People very rarely present in the first session with a neat package of issues ready to go. They present with caution, blind spots, defenses and the effects of conditioning and trauma. A therapist’s job is to formulate, test and revise as new information emerges. That requires training and education as well as experience, which includes recognising when someone in distress is not speaking their full truth. AI will not recognise someone in an abusive relationship who describes it as “relationship issues”.

AI’s primary function is to generate plausible text from what it’s given. If the user’s input is incomplete, distorted or driven by distress (or someone is typing exactly what they need to get a preferred response), AI may well respond in a way that reinforces these issues rather than challenging them. It may validate something that should be questioned. It may offer certainty where uncertainty would be safer (we all need to learn to sit in uncertainty and discomfort longer than we do). It may also offer broad explanations of treatment modalities that might not be right for the individual, and it is very good at pathologizing normal human pain. The risk is not just “wrong” answers; it’s the risk of “confident” answers, taken at face value by the individual, where a therapist would weigh risk factors, support, current functioning and the person’s ability to regulate.

AI has no duty of care. It cannot call you up to check on you, coordinate care or judge whether your situation is sliding into a dangerous state. When people replace therapy with AI, they often do it because of availability, price and the fact that it feels safer than being exposed to another person. While this is understandable, it also consolidates the risk. The people most likely to use AI are the ones who struggle with relational safety, shame or distrust, and these are precisely the people who would benefit from professional therapy. In other words, AI input can reinforce the very pattern that keeps them stuck: being alone and managing distress in private.

AI also offers no meaningful boundaries in an ethical sense. It is available at any hour for as long as you want and can become an emotional dependency tool. Yet dependence is not defined only by a relationship with a “tool”; it is also defined by what it does to the personal autonomy of the individual. If someone starts to outsource emotional regulation, decisions and self-trust to AI, it will eventually weaken the main goals of therapy: internal authority, tolerance of discomfort and the ability to stay with difficult feelings without seeking an immediate source of external regulation.

None of this means that AI has no place. It can be useful as a support. It can help people prepare for sessions by organizing thoughts, identifying themes and drafting questions for therapy. It can generate journalling prompts and offer education by citing research and studies. It can offer coping strategies for someone who is stable and using it as a self-help tool. It can track habits and routines, produce boundary scripts and rehearse difficult conversations. So, a perfect support for therapy sessions. If AI is seen as a “workbook” that speaks, it can help, but it cannot replace the relational process that is therapy.

If you are using AI and you are becoming more agitated, more paranoid, more ashamed and more alone, that is a red flag. If you are using it to “diagnose” yourself or others, that is a red flag. If you are using it to talk yourself out of human contact, another red flag. If you are in crisis, in distress, living in an unsafe environment or at risk of harm, AI should NEVER be the first point of contact. In these cases, first contact should always be with a human: a therapist, GP, emergency services or a crisis centre.

The bottom line is this. AI is extremely helpful. But it isn’t therapy and won’t be for the foreseeable future. The aim is not to reject technology (I have increased my use of AI on my blog), but to use it for what it is truly good for and put it in a place where it can be helpful. Use it to complement real therapeutic work, not to avoid it. If you want genuine change, especially in long-standing patterns like codependency, trauma responses, or chronic self-doubt, you need a relationship where you can be known, challenged, supported, and kept safe. That still requires a human being doing the job properly.

 

Your Healing Journey Starts Here: Join Dr. Jenner’s Community!

Subscribe for weekly in-depth mental health insights, early access to Q&A sessions, and an exclusive discount on Dr. Jenner’s Codependency Recovery Program.


Dr. Nicholas Jenner

Dr. Nicholas Jenner, a therapist, coach, and speaker, has over 20 years of experience in the field of therapy and coaching. His specialty lies in treating codependency, a condition that is often characterized by a compulsive dependence on a partner, friend, or family member for emotional or psychological sustenance. Dr. Jenner's approach to treating codependency involves using Internal Family Systems (IFS) therapy, a treatment method that has gained widespread popularity in recent years. He identifies the underlying causes of codependent behavior by exploring his patients' internal "parts," or their different emotional states, to develop strategies to break free from it. Dr. Jenner has authored numerous works on the topic and offers online therapy services to assist individuals in developing healthy relationships and achieving emotional independence.