News

Sewell Setzer III grew ‘dependent’ on an AI chatbot service. Now his mother has filed a lawsuit to hold the creators responsible (U.S. District Court, Middle District of Florida, Orlando Division) ...
When 14-year-old Sewell Setzer III died in his Orlando home while his brothers and parents were inside, his last words were not to any of them, but to an artificial intelligence chatbot that told ...
Lawsuit: Sewell Setzer III sexually abused by 'Daenerys Targaryen' AI chatbot. Throughout Sewell's time on Character.AI, he would often speak to AI bots named after "Game of Thrones" and "House of ...
Fourteen-year-old Sewell Setzer III loved interacting with Character.AI's hyper-realistic chatbots, with a limited version available for free or a "supercharged" version for a $9.99 monthly fee ...
Sewell Setzer, 14, died by suicide after he developed an intimate relationship with an AI chatbot that allegedly encouraged him to die, according to a wrongful death lawsuit filed by the teen's ...
A teenage boy shot himself in the head after discussing suicide with an AI chatbot that he fell in love with. Sewell Setzer, ...
Fourteen-year-old Sewell Setzer III killed himself after falling in "love" with a Character.AI chatbot, his family says — now an expert weighs in on the risks that could be associated with the ...
Fourteen-year-old Sewell Setzer III fell in "love" with a Character.AI chatbot, then killed himself. Now his mom, Megan Garcia, is fighting the popular tech ...
A Florida mother has sued artificial intelligence chatbot startup Character.AI accusing it of causing ... Florida federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, ...
14-year-old boy's suicide linked to chatbot interaction. ... Megan Garcia, of Florida, stands with her son, Sewell Setzer III, in this Oct. 2024 photo.
Sewell Setzer III, who was 14, died by suicide in February 2024 at his Orlando home, moments after an artificial intelligence chatbot encouraged him to “come home to me as soon as possible.” ...
Warning: distressing content. When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his mother says ...