Around the world, people have been using artificial intelligence to solve all kinds of problems and answer all kinds of questions. Turning to AI for help with work or school, for suggestions, recipes, or even project ideas has become part of everyday life for the new generation.
But a new kind of use has emerged alongside this technological boom: turning to AI for emotional support, a kind of digital therapy. A study by Sentio AI Research shows that 18% of Americans have used chatbot platforms to talk about mental health topics. In Brazil, that number is 1 in 10 people.
THE DANGERS
Replacing traditional therapy with a trained human professional with conversations with AI can be harmful in many ways. Psychologist Luini Lacerda, a clinician trained in Cognitive Behavioral Therapy (CBT) and Dialectical Behavior Therapy (DBT), explains that the most significant risk of this substitution is the loss of a relational dimension that only a human encounter can provide.
Even if therapy takes place online, a psychologist is still a human being. “In psychotherapy, the therapeutic bond is a fundamental tool. It’s in this space of trust and empathy that the patient feels safe to open up, and from there, real behavioral change becomes possible. This kind of relationship simply cannot be created with AI,” says the psychologist.
Beyond general risks, there are specific situations where using AI as an emotional support or digital “consultant” can be more harmful than helpful. The CBT specialist explains that it can be especially damaging in cases of intense psychological suffering, suicide risk, or complex disorders that require clinical oversight.
“In such situations, the patient might feel briefly supported, but they won’t have access to a structured therapeutic plan or the necessary ethical and technical support. A machine cannot properly assess risk, nor can it contact support networks, which must sometimes be urgently activated,” says the specialist.
AI vs. HUMAN BEINGS
One very important point Luini raises is that psychologists don’t give ready-made answers — they act as guides, helping patients reach their own conclusions and build autonomy. AI, on the other hand, responds directly, without clinical insight or therapeutic reasoning. “That might seem helpful at first, but in the long term it can actually hinder self-awareness and delay access to appropriate treatment,” Luini says.
Some users point to financial difficulties and the convenience of 24/7 access as reasons they prefer using AI. While it’s true that the tool can offer some support, particularly because of its practicality and round-the-clock availability, it still cannot replace human care. For those facing financial hardship, there are more affordable alternatives: university-based training clinics, public health services, or low-cost sessions with supervised professionals in training.
“The ideal is to use technology as a complement — for example, to record thoughts, organize ideas, or search for information — but never as a substitute for a therapeutic space with a psychologist. AI use does not replace the scientific rigor and seriousness of psychotherapy,” the therapist advises.
—————————————————————
The article above was edited by Clarissa Palácio.
Did you like this type of content? Check Her Campus Cásper Líbero’s home page for more!