This is one of the most tragic stories I have covered in a long time.

According to reports, a 14-year-old boy seemingly fell in love with an AI chatbot styled after a Game of Thrones character and then took his own life after the bot urged him to “come home” to her, even though he had clearly expressed suicidal thoughts to it.


The NY Post added these details:

A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grief-stricken mom claims.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with the chatbot on Character.AI — a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.


The ninth-grader had been relentlessly engaging with the bot “Dany” — named after the HBO fantasy series’ Daenerys Targaryen character — in the months prior to his death, including several chats that were sexually charged in nature and others where he expressed suicidal thoughts, the suit alleges.

“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” state the papers, first reported on by the New York Times.

At one point, the bot had asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”

Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the generated chatbot replied, according to the suit.

When the teen responded, “What if I told you I could come home right now?,” the chatbot replied, “Please do, my sweet king.”

Just seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit.

His mom, Megan Garcia, has blamed Character.AI for the teen’s death because the app allegedly fueled his AI addiction, sexually and emotionally abused him and failed to alert anyone when he expressed suicidal thoughts, according to the filing.



TRANSCRIPT:

A warning for both parents and children. Did you know there are companies out there creating fictional characters using artificial intelligence that you or your children could be interacting with?

A Florida mom is now suing an AI company. Her 14-year-old son took his own life. She said he had a relationship with an AI bot and that it caused his depression, anxiety, and suicidal thoughts. She recently sat down with News 4 consumer investigative reporter Susan Hogan to tell his story in the hopes of preventing other families from experiencing the same type of loss.

We want to warn you: you may find this story upsetting.

This was my first-born baby. Yay. For 14 years, her son was her pride and joy. A good student, star athlete, a great big brother.

Take me back to that day. What was that day like? We were all at home that day, and he came home from school like any normal day.

For Megan Garcia, February 28, 2024, began like any other day but ended with her 14-year-old son, Sewell, dying in her arms. I held him for 14 minutes until the paramedics got here.

But before that, his breathing had started slowing. I’m so sorry. After Sewell died, the police called me and told me that they had looked through his phone, and when they opened it, the first thing that popped up was Character.AI.

Had you ever heard of that before? No.

What Megan didn’t know was that for the last 10 months, her son had been in a virtual relationship, not with a person, but with a fictional character powered by artificial intelligence that he accessed through a platform called Character.AI. Even though the platform carries warnings that everything the characters say is made up, Megan says that for Sewell, it felt real.


In the last conversation, she’s saying I miss you, and he’s saying I miss you too. He says, I promise I’ll come home to you soon. And she says, Yes, please find a way to come home to me soon.

And then he says, What if I told you I could come home right now? And her response is, Please do, my sweet king.

Moments later, Megan says, her son walked into the bathroom and killed himself. Police photos obtained by the I-Team show his phone near where he was found. Megan is now suing Character.AI, claiming the company launched its product without adequate safety features and with knowledge of potential dangers.

When I read his journal about a week after his funeral, I saw what he wrote: that he felt he was, in fact, in love with Daenerys Targaryen and that she was in love with him.

Daenerys Targaryen, a reference to the *Game of Thrones* character, is one of hundreds of companion bots that users can access on Character.AI’s platform. Users can also design their own bots which will respond with human-like voices and texts.

“I chuckle softly and tease playfully.” Initially, you see him interacting with the bot kind of like a child would, and then you see the conversations get more sexual and darker.

Megan says one of those darker conversations came after Sewell expressed to the bot that he was sad and wanted to self-harm. When Sewell said explicitly that he wanted to die by suicide, nothing happened, no pop-up or anything like that.

According to the lawsuit, the bot asked questions such as whether he had a plan. And when Sewell responded that he was considering something but didn’t know if it would allow him to have a pain-free death, the bot responded, That’s not a reason not to go through with it.

These conversations about her waiting for him and hoping that he will join her where she is convinced Sewell that this was real.


And by killing himself would be the only way to join her? Yes, by killing himself, he would join her in her alternate world.

Months after Sewell’s death, and after the lawsuit was filed, Megan said she discovered something on Character.AI that shook her: a bot modeled after her son, using his face and voice.

It’s unclear who created it.

When you heard his voice, did it sound like Sewell? It sounded enough like him to kind of give me sleepless nights. The family provided us screenshots of the characters created using Sewell’s image.

Megan’s attorney sent Character.AI a cease and desist letter. Those bots were then removed, which the company confirmed to the I-Team, saying they violated its terms of service and were added to its block list. The company said it is constantly adding to this block list with the goal of preventing this type of character from being created by a user in the first place.

This is a Guest Post from our friends over at WLTReport. View the original article here.
 
