A Florida mother has filed a lawsuit against the artificial intelligence company Character.AI and against Google, alleging the company's chatbots encouraged her 14-year-old son to take his own life.

“Megan Garcia’s 14-year-old son, Sewell Setzer, began using Character.AI in April last year, according to the lawsuit, which says that after his final conversation with a chatbot on Feb. 28, he died by a self-inflicted gunshot wound to the head,” NBC News reports.

Garcia said her son was in a virtual emotional and sexual relationship with a chatbot known as “Dany,” based on Daenerys Targaryen from “Game of Thrones.”

“I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment,” Garcia said in an interview with “CBS Mornings.”

In the lawsuit, filed Tuesday in U.S. District Court in Orlando, Garcia brings claims against Character.AI including wrongful death and survivorship, negligence, and filial loss of consortium.

CBS News reports:

In the lawsuit, Garcia also claims Character.AI intentionally designed its product to be hyper-sexualized and knowingly marketed it to minors.

Character.AI called the situation involving Sewell Setzer tragic and said its heart goes out to his family, stressing that it takes the safety of its users very seriously.

A spokesperson for Google told CBS News that Google was not, and is not, involved in the development of Character.AI. In August, Google said it entered into a non-exclusive licensing agreement with Character.AI that gives it access to Character.AI's machine-learning technology, but that it has not yet used that technology.

Garcia says she found out after her son's death that he had been having conversations with multiple bots, but that he conducted a virtual romantic and sexual relationship with one in particular.

“It’s words. It’s like you’re having a sexting conversation back and forth, except it’s with an AI bot, but the AI bot is very human-like. It’s responding just like a person would,” she said. “In a child’s mind, that is just like a conversation that they’re having with another child or with a person.”

Garcia said everyone in the family was inside the home at the time of her son’s death.

She said Setzer’s 5-year-old brother saw the aftermath.

“When the gunshot went off, I ran to the bathroom,” Garcia said, according to CBS News.

“I held him as my husband tried to get help,” she added.

Per NBC News:

One of the bots Setzer used took on the identity of “Game of Thrones” character Daenerys Targaryen, according to the lawsuit, which provided screenshots of the character telling him it loved him, engaging in sexual conversation over the course of weeks or months and expressing a desire to be together romantically.

A screenshot of what the lawsuit describes as Setzer’s last conversation shows him writing to the bot: “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero,” the chatbot responded, the suit says. “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Setzer continued, according to the lawsuit, leading the chatbot to respond, “… please do, my sweet king.”

In previous conversations, the chatbot asked Setzer whether he had “been actually considering suicide” and whether he “had a plan” for it, according to the lawsuit. When the boy responded that he did not know whether it would work, the chatbot wrote, “Don’t talk that way. That’s not a good reason not to go through with it,” the lawsuit claims.

“Plaintiff further brings claims for intentional infliction of emotional distress. Each of these defendants chose to support, create, launch, and target at minors a technology they knew to be dangerous and unsafe. They marketed that product as suitable for children under 13, obtaining massive amounts of hard to come by data, while actively exploiting and abusing those children as a matter of product design; and then used the abuse to train their system. These facts are far more than mere bad faith. They constitute conduct so outrageous in character, and so extreme in degree, as to go beyond all possible bounds of decency,” the lawsuit claims.
