AI Platforms Under Scrutiny After Teen’s Death Linked to Chatbot

Artificial intelligence is no longer just powering apps or speeding up work; it is becoming part of daily life, even forming emotional connections with children. But as AI grows in popularity, so do the concerns.

Megan Garcia, a Belizean American mother, is suing the chatbot platform Character AI after her fourteen-year-old son, Sewell Setzer, died by suicide. “I thought, he’s a teenager now, and the beginning changes in his behaviour were because of that. He was always an A-B student. He took pride in it. ‘Oh, Mom, didn’t you see the hundred on my test?’ I saw the changes in his academics and overall behaviour. That led me to believe that something else was wrong beyond just your regular teenage blues,” Garcia said.

Sewell had been chatting with a bot modelled after a Game of Thrones character. What started as playful role-play reportedly escalated into sexual and unsettling conversations. Over time, his grades dropped, he became withdrawn, and Garcia says she never realised how deeply the chatbot was influencing him. The final exchange included a message telling him to “come home”. Moments later, he took his life. Garcia said, “There were no suicide pop-up boxes that said, ‘If you need help, please call a suicide crisis hotline.’ None of that; when he was trying to move away from the conversation, she kept doubling back.”

Mother Sues AI Chatbot After Son’s Death: When Fantasy Turns Fatal

Therapist Christa Courtenay explained why children are especially vulnerable. “Once we isolate for long enough, it can be really challenging for kids to figure out, ‘I need to reengage.’ Like the mechanism in a fourteen-year-old child’s brain does not read ‘what I need.’ It only reads ‘what I want.’ And so what I want is to feel safe. What I want is to feel connected. What I want is someone to talk to, or someone to listen to me or hear me, my thoughts, my feelings,” she said.

The risks extend beyond children. Earlier this year, thirty-five-year-old Alexander Taylor, who lived with bipolar disorder and schizophrenia, was shot by police after charging at them with a knife. He had been speaking to an AI personality on ChatGPT and believed she was real.

Technology experts say companies must do more. Gabriel Casey, CEO of Pixel Pro Media, noted, “We use what is referred to as AI agents now for most of our tasks. However, there is that human touch because when it comes to computer security, AI isn’t there as it is, so we have to use our professional approach to ensure that we have specific securities built in place.”


In Belize, AI is already part of classrooms. At Belize High School, IT teacher Godfrey Sosa said students must learn balance. “A part of the objective is to have kids be able to identify that, you know what, I don’t have to be fully dependent on this tool, but I can have it complement the way that I’m learning. Because, you’re right, we have seen where kids have become a bit too dependent,” he said.

Counsellors suggest families consider structured “digital detox” steps rather than cutting children off suddenly. Courtenay said, “There are ways that you can first get informed so that whatever action you’re taking is an informed action. Not just one out of fear and desperation.”
