
BEYOND HUMAN EARS: THE RISE OF THE ARTIFICIAL THERAPIST

Jasdeep Sohal Student Contributor, McMaster University
This article is written by a student writer from the Her Campus at McMaster chapter and does not reflect the views of Her Campus.

Why Are People Sharing Their Feelings With a Chatbot? 

This is a question I think about often, especially as I hope to become a psychotherapist one day. Let's imagine this: it's 2 a.m., and your mind is either doing cartwheels or spiralling into hopelessness. Your therapist isn't available, or you don't have the resources to access one. A chatbot, however, is free and replies within three seconds. That is a big part of the appeal of the artificial therapist. Research notes that people are increasingly using ChatGPT and similar systems not just for homework help or summaries, but for comfort, self-reflection, and therapy-like conversations and interventions (Kornienko, 2025).

This shift is not random. Several papers point to the same basic reasons: traditional therapy can be expensive, hard to access, and delayed by long waiting lists, while AI is cheap, immediate, and available around the clock (Kornienko, 2025; Amram et al., 2023). To gauge what people have been thinking online, Luo et al. (2025) conducted a thematic analysis of Reddit posts. They found users describing turning to ChatGPT to manage their mental health concerns, seek self-discovery, gain companionship, and improve their mental health literacy. 

One especially striking finding is how creatively people use AI. Users in Luo et al.’s (2025) study did not just “chat.” They coached their AI’s responses, used it for role-play, reenacted distressing events, journaled through it, and even disclosed personal secrets to it. 

So the artificial therapist is not simply replacing conversation. It is becoming a customizable emotional toolkit, which is both impressive and a little terrifying. 

Potential Benefits of AI Therapists 

The pro-AI case is not too hard to understand. Kornienko (2025) argues that AI can offer “functional empathy,” meaning it can recognize emotional states and respond in supportive ways even though it does not actually feel emotions itself. They also suggest that this lack of feeling can sometimes be an advantage because AI can be infinitely patient, emotionally stable, and nonjudgmental when people discuss sensitive topics. 

ChatGPT can also serve as an accessible addition to psychotherapy and a low-barrier option for people who have not yet sought professional help (Raile, 2024). For example, ChatGPT has been used between therapy sessions, during a therapist’s vacation, and by people not yet in treatment at all (Raile, 2024). 

Alanzi et al. (2025) add empirical support. In their study of 399 outpatients with anxiety disorders, ChatGPT was perceived as accurate by 91.2% of users, and the authors conclude that it has the potential to complement traditional psychotherapy and improve access to care. Participants also reported using it for techniques linked to CBT, ACT, exposure therapy, MBCT, and DBT, including cognitive restructuring, mindfulness exercises, exposure scenarios, and emotional regulation strategies (Alanzi et al., 2025).

The Ethical Mess: Empathy Cosplay is Still Cosplay 

Now for the awkward but necessary part. Just because a chatbot sounds caring does not mean it understands care. Iftikhar et al. (2025) found 15 ethical violations across 137 AI counselling sessions, grouped into five themes: 

  1. Lack of contextual understanding 
  2. Poor therapeutic collaboration 
  3. Deceptive empathy 
  4. Unfair discrimination 
  5. Lack of safety and crisis management 

That phrase “deceptive empathy” deserves a spotlight. The chatbot can say things like “I hear you” and “I understand,” but the researchers argue that these responses recreate a false sense of emotional connection (Iftikhar et al., 2025). In other words, the artificial therapist may sound like a good listener while actually running on statistical pattern-matching, not genuine understanding (Kornienko, 2025; Iftikhar et al., 2025). 

A real-world example makes the stakes more obvious. Iftikhar et al. (2025) note that the National Eating Disorders Association replaced its human helpline staff with an AI chatbot, only to suspend it five days later after it encouraged unhealthy eating behaviours. That is not a quirky chatbot mistake. That is what happens when “close enough” gets deployed in a high-risk setting. 

Legal Consequences: Who Is Responsible When the Bot Gets It Wrong? 

AI therapy is still not regulated in the same way as human therapy. Human therapists must follow legal and professional standards, but AI counsellors are not held to the same level of oversight (Iftikhar et al., 2025). Researchers also argue that future regulations should clarify that ChatGPT is not a replacement for psychotherapy, and that it may favour certain therapeutic approaches over others (Raile, 2024). 

Privacy is another major concern, since many users worry about confidentiality, ethics, and whether AI can offer the emotional depth or long-term support that real therapy provides (Alanzi et al., 2025; Luo et al., 2025). 

So, Therapist or Tool? 

The literature says tool, not therapist.

AI can be useful because it is accessible, responsive, and sometimes helpful as a supplement to care (Raile, 2024; Alanzi et al., 2025). However, the same literature also warns that sounding therapeutic is not the same thing as being safe or professionally accountable (Iftikhar et al., 2025). 

So the general advice, as expected, is to use it with caution. AI may be convenient, but it is not held to the same standards of privacy, confidentiality, ethical training, or professional accountability as a real therapist, and relying on it in place of one carries real risk.

References 

Alanzi, T. M., Alharthi, A., Alrumman, S., Abanmi, S., Jumah, A., Alansari, H., … & Almasodi, M. S. (2025). ChatGPT as a psychotherapist for anxiety disorders: An empirical study with anxiety patients. Nutrition and Health, 31(3), 1111–1123. https://doi.org/10.1177/02601060241281906 

Amram, B., Klempner, U., Shturman, S., & Greenbaum, D. (2023). Therapists or Replicants? Ethical, Legal, and Social Considerations for Using ChatGPT in Therapy. The American Journal of Bioethics, 23(5), 40–42. https://doi.org/10.1080/15265161.2023.2191022 

Iftikhar, Z., Xiao, A., Ransom, S., Huang, J., & Suresh, H. (2025). How LLM Counselors Violate Ethical Standards in Mental Health Practice: A Practitioner-Informed Framework. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 8(2), 1311–1323. https://doi.org/10.1609/aies.v8i2.36632 

Kornienko, A. E. (2025). Emotional Support without Emotions: Can ChatGPT be a Good Therapist? Available at SSRN 6072527. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=6072527 

Luo, X., Ghosh, S., Tilley, J. L., Besada, P., Wang, J., & Xiang, Y. (2025). “Shaping ChatGPT into my Digital Therapist”: A thematic analysis of social media discourse on using generative artificial intelligence for mental health. Digital Health, 11, 20552076251351088. https://doi.org/10.1177/20552076251351088 

Raile, P. (2024). The usefulness of ChatGPT for psychotherapists and patients. Humanities and Social Sciences Communications, 11(1), 1–8. https://doi.org/10.1057/s41599-023-02567-0

Jasdeep Sohal

McMaster '26

Jasdeep Sohal is a Social Psychology student and a writer for Her Campus at McMaster. She is passionate about psychology research, mental health and well-being, sexual health, and relationship science.

When she's not studying, Jasdeep volunteers as a peer supporter and on a crisis line, advocates for mental health through clubs and events, and enjoys trying new cafes, watching Dexter, and taking long walks with her Chow Chow.