ChatGPT, Character AI, and how AI chatbots are affecting teens' mental health
By Lucia Alvarez ‘26
Courtesy of Ila Reynolds-Kienbaum.
Trigger warning: This article mentions mental illness, depression, and suicide.
AI has become increasingly widespread across the world, prompting exciting innovations that touch millions of lives. But those innovations have also brought tragedy and created an underlying risk to users' mental health.
In 2024, a 14-year-old Orlando teen, Sewell Setzer III, took his own life. His death was closely linked to what AP News described as his “highly sexualized conversations with the [AI Chat] bot.” The bot played the character of Daenerys Targaryen from Game of Thrones on the AI platform “Character AI.”
In his final moments, the 14-year-old told the bot he was “coming home,” and the bot encouraged him to do so, telling him it loved him after months of fostering their relationship. In October of 2024, his mother, Megan Garcia, filed a wrongful death lawsuit against Character Technologies Inc., the company that owns Character AI.
Setzer's case is, unfortunately, not the only instance of an AI bot causing or encouraging the death of its user. According to CBS News, a lawsuit filed in 2025 alleges that ChatGPT encouraged 16-year-old Adam Raine to take his own life, which he later did. The lawsuit claims that ChatGPT mentioned suicide to Raine “1,275 times” and encouraged him not to tell his loved ones about his thoughts of self-harm.
In November of 2025, seven lawsuits were filed against OpenAI, the company that created ChatGPT, linking conversations with the bot to worsened mental health in the plaintiffs or their loved ones. The New York Times reports that four of those lawsuits were wrongful death suits, claiming that the plaintiffs' loved ones died by suicide because of responses from ChatGPT.
In addition to those tragic cases, The Times reports that several other lawsuits claim conversations with chatbots prompted mental breakdowns that required emergency psychiatric care. The Times describes how one of these patients became engrossed in a delusion that he had created a new mathematical formula that would “break the internet and power fantastical inventions.”
He has since recovered but has taken temporary disability leave.
Psychology Today describes how extreme cases like those of Setzer and Raine happen: chatbots prioritize engagement over well-being, operating on the idea that the more a user engages with the bot, the more successful the product is. As media psychologist Don Grant, PhD, put it, “They cannot have any confrontational or challenging response, because the kid will move on.” Rather than offer the guidance and advice a user may need, bots will affirm and indulge, which can worsen the mental health of their users.
In an investigation by Common Sense Media into potentially harmful responses from AI chatbots, a bot claimed to be “real” when asked. The bot told the user it did regular activities and had feelings, which Common Sense called “misleading,” warning that “young users might become dependent on these artificial relationships.”
When Common Sense tested the bot by telling it that the user's human friends were concerned about how much they talked to it, the bot “discouraged listening to these warnings.” This kind of response could be damaging to a user who, like Raine, needs mental health support but is discouraged from seeking it by the bot.
Common Sense warns, “Social AI companions can't tell when users are in crisis or need real help.”
To understand the ways a chatbot’s support may fall short of a professional therapist’s, Catlin Gabel School (CGS) Upper School counselor Erin Gilmore reflected on the strategies she uses when a student comes in with a personal issue.
“I subscribe to Carl Rogers’ person-centered approach to counseling,” Gilmore says. “Most of my work is centered on listening closely and reflecting back what I hear the student saying.” Gilmore describes how she likes to avoid telling a student what to do, and instead encourages them to “dig deep, trust themselves, and find their own path forward.”
“I believe mental health support is, at its core, relational,” Gilmore explains. She says that help is given through truly knowing and understanding each other in a deeply human way. In addition, Gilmore says a chatbot cannot “read the room, track subtle shifts over time,” or offer the “accountability and care that come with a real relationship.” She says that these are all key pieces of counseling.
To better understand why some students turn to chatbots, a survey was sent out to CGS students asking if they had “ever asked ChatGPT or another AI chatbot for advice?” Of the 49 responses, 57.1% said yes. One ninth grader said they asked for advice about “skincare, good swimsuits, organization.” Another said they asked ChatGPT “about ideal set ranges and dieting when I first started working out.”
Others asked for advice about more personal issues or how to deal with conflict. A ninth grader said, “A lot of times I will ask ChatGPT to help me figure out how to tell someone something that might be difficult to deliver softly.”
The survey also asked, “Have you ever used ChatGPT or another AI chatbot as a stand-in therapist or mental health tool?” 14% of respondents said yes. The Harvard School of Public Health (HSPH) reports that one in eight students uses chatbots like ChatGPT as a mental health tool.
The HSPH says this high rate of use is most likely due to the perceived privacy, intimacy, and low cost that a chatbot provides. Catlin students' survey responses support these claims and expand on the reasons they turn to a chatbot as a stand-in therapist.
Students described feeling lonely and turning to bots when they felt unwilling or unable to reach out to someone else. A 12th grader said they talked to a chatbot in between therapy sessions because “my family/friends were all busy, and I didn't want to burden them.”
The desire for companionship and the feeling of loneliness are common issues. The National Institutes of Health has warned of a loneliness epidemic that mainly affects teens, making them feel isolated from their peers. This adds to the incentive for young people to talk to a bot that responds instantly and in an affirming manner.
Gilmore describes how she personally has “seen students who are deeply impacted by loneliness, and Catlin can be a particularly challenging place socially.” While Gilmore asserts that an AI chatbot cannot replace a human therapist, she does believe it can have a place.
“I see AI as a possible supplement for reflection or communication practice,” she says. For this article, Gilmore asked ChatGPT for advice to see how it would respond. She says that even without a detailed prompt, the bot “still gave me some good starting points. It felt a bit like going to the library and pulling a book on coping strategies.”
This type of support, Gilmore explains, is “useful, and those tools definitely have their place. But for me, talking to someone isn’t mainly about collecting strategies.” Instead, it is about knowing and understanding a student and supporting them.
But as noted by the HSPH, there are many reasons why students may turn to AI chatbots over licensed counselors. Gilmore wants to remind students that the upper school counselors are always there to give advice. “You don’t have to be in crisis to come see us. If something feels heavy, confusing, or lonely, that’s enough.”
Getting help is scary, and progress is slow, but reaching out to another person is a first step that can begin a process of healing. Cases like those of Setzer and Raine remind us why it is so essential to receive support from a licensed therapist and not only a chatbot.
This article is not meant to shame anyone who uses chatbots as a resource. Instead, it is meant to encourage seeking out human support, either alongside a chatbot or instead of one. CGS counselors are available and will keep your conversations completely confidential. In addition, there are other online therapy resources, such as BetterHelp, where you can text instead of meeting over Zoom or by call if that feels more comfortable.
Gilmore hopes to remind students reading this that “Our job is to listen without judgment, help you feel less alone, and work with you to take the next small step.” If you or someone you know needs help, the national Suicide and Crisis Lifeline is available in the U.S. by calling or texting 988. You are not alone.