
Florida mother sues artificial-intelligence 'chatbot' company over teen son's suicide

FLORIDA RECORD

Sunday, December 22, 2024



Character.AI CEO Noam Shazeer is among the defendants in the federal lawsuit filed by Megan Garcia. | Facebook

A Florida woman whose 14-year-old son committed suicide after allegedly becoming obsessed with an artificial intelligence-generated online “chatbot” is suing the AI developers for wrongful death, negligence, product liability and unfair trade practices.

Plaintiff Megan Garcia of Orlando filed the federal lawsuit in the Middle District of Florida on Oct. 22, alleging that the company Character.AI is responsible for the death of her son, Sewell Setzer III, who died by a self-inflicted gunshot wound.

Garcia is represented by the Social Media Victims Law Center and the Tech Justice Law Project. The lawsuit argues that Character.AI was reckless and malicious in marketing to teens unrestricted access to “defectively designed,” lifelike AI companions while harvesting their user data.

“Defendants intentionally designed and programmed C.AI to operate as a deceptive and hypersexualized product and knowingly marketed it to children like Sewell,” the lawsuit states. “Defendants knew, or in the exercise of reasonable care should have known, that minor customers such as Sewell would be targeted with sexually explicit material, abused and groomed into sexually compromising situations.”

Sewell’s death followed his mother’s decision to take away his cell phone as punishment for causing problems with his instructors at school, according to the complaint. While without his phone, Sewell attempted to reconnect with the chatbot on other devices, the lawsuit says.

“In fact, in one prior undated journal entry he wrote that he could not go a single day without being with the C.AI character with which he felt like he had fallen in love; that when they were away from each other they (both he and the bot) ‘get really depressed and go crazy,’ further evidence of the impact of the product’s anthropomorphic design,” the complaint states.

Some of the characters that Character.AI customers can interact with bill themselves as mental health professionals and tutors, according to plaintiffs’ attorneys.

Google, which once employed the founders of Character.AI, including CEO Noam Shazeer, is also a defendant in the lawsuit. 

A Character.AI spokesperson told the Florida Record in an email that the company does not comment on the specifics of litigation but did express sympathy for the family.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” the spokesperson said. “As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.”

Character.AI is also introducing new safety features to its app, including “detection, response and intervention” measures prompted by user inputs that violate the company’s Terms or Community Guidelines.

“For those under 18 years old, we will make changes to our models designed to reduce the likelihood of encountering sensitive or suggestive content,” the spokesperson said.

The lawsuit seeks damages for physical and mental pain and suffering, loss of enjoyment of life, past medical care expenses for Sewell’s injuries, punitive damages, reimbursement for attorney fees and legal costs, and an order to stop the defendants from engaging in harmful conduct.
