News

California lawmakers introduced SB 243 after a teen’s suicide, aiming to regulate AI chatbots and prevent future harm to vulnerable young users.
Across Australia, kids are forming relationships with artificial intelligence companion bots that are much more dangerous than traditional social media.
Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive ...
Proposals to install ChatGPT into a range of toys including Barbie dolls have sparked alarm from experts who branded it a ...
The Heritage Foundation — the group behind the infamous Project 2025, the conservative policy plan that outlined ____ — is suddenly really, really down with AI regulation. Who knew! The conservative ...
In Sewell Setzer’s case, the chatbot ultimately seemed to encourage him to kill himself. Other reports have also surfaced of bots seeming to suggest or support suicide.
Character.AI allows users to interact with lifelike AI "characters," including fictional and celebrity personas that mimic human traits such as stuttering.
In a lawsuit filed in Florida, Megan Garcia, the mother of Sewell Setzer III, claims Character.ai targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences."
In October, Megan Garcia sued Character.ai and Google, claiming they were responsible for the suicide of her son, Sewell Setzer III.
Sewell Setzer III, who was 14, died by suicide in February 2024 at his Orlando home, moments after an artificial intelligence chatbot encouraged him to “come home to me as soon as possible.” ...
A lawsuit claims Sewell Setzer III, a 14-year-old Orlando high school freshman, shot himself in the head in February 2024 after becoming obsessed with an AI chatbot named after and reminiscent of ...