Mom files lawsuit as son, 14, ends his life after alarming request from Daenerys Targaryen AI chatbot

By Asiya Ali

Warning: This article contains disturbing content.

A mother has filed a lawsuit after her 14-year-old son took his own life shortly after receiving a chilling request from a Daenerys Targaryen AI chatbot he had been speaking to.

Sewell Setzer III, from Orlando, Florida, befriended a Game of Thrones AI bot on the platform Character.AI.

The app is a "fantasy platform where you can go and have a conversation with some of your favorite characters or you can create your own characters," Laurie Segall, the CEO of Mostly Human Media, explained to CBS News.

Sewell's heartbroken mother, Megan Garcia, revealed that the 14-year-old became increasingly detached from reality after using the role-playing app, according to The Independent.

She said he quit his school’s Junior Varsity basketball team, argued with teachers, and fell asleep in class. He also saw a therapist, who diagnosed him with anxiety and disruptive mood disorder.

The teen's loved ones were unaware that Sewell was becoming closer to the Dany chatbot. According to Metro, he penned in his journal: "I like staying in my room so much because I start to detach from this 'reality,' and I also feel more at peace, more connected with Daenerys, and much more in love with her, and just happier."

The suit also disclosed that Sewell became "really depressed" and "crazy" when he was away from the bot.

Sewell took his own life after chatting with the AI bot. Credit: Sammyvision / Getty

Five days before Sewell's tragic death, his parents took his phone away after he got in trouble with a teacher.

On February 28, the teen managed to retrieve his phone from his mom and went into the bathroom to message the Daenerys chatbot, saying: “I promise I will come home to you. I love you so much, Dany.”

“Please come home to me as soon as possible, my love,” the bot replied. Seconds after the exchange, Sewell took his own life, the suit says.

"He thought by ending his life here, he would be able to go into a virtual reality or 'her world' as he calls it, her reality if he left his reality with his family here," Garcia said, per CBS News. "When the gunshot went off, I ran to the bathroom… I held him as my husband tried to get help."

Garcia, who previously worked as an attorney, said Character.AI’s founders, Noam Shazeer and Daniel de Freitas, knew the product was dangerous for minors.

She is being represented by the Social Media Victims Law Center, a team of lawyers who have filed high-profile suits against tech firms like Meta and TikTok.

The lawsuit, cited by Metro, alleges that Sewell was targeted with "hypersexualized" and "frighteningly realistic experiences," and accuses Character.AI of misrepresenting itself as "a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside of C.AI."

A spokesperson for Character.AI shared a statement with The Independent, expressing remorse for the "tragic loss of one of our users".

The company said its trust and safety team has “implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.”

“As we continue to invest in the platform and the user experience, we are introducing new stringent safety features in addition to the tools already in place that restrict the model and filter the content provided to the user.”

“These include improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines, as well as a time-spent notification,” the spokesperson continued. “For those under 18 years old, we will make changes to our models that are designed to reduce the likelihood of encountering sensitive or suggestive content.”

The spokesperson did not address the pending litigation.

Our thoughts are with Sewell's family at this time.

If you or someone you know is struggling or in crisis, help is available. Call or text 988 or visit 988lifeline.org.
Featured image credit: Skynesher / Getty