News
California lawmakers introduced SB 243 after a teen’s suicide, aiming to regulate AI chatbots and prevent future harm to vulnerable young users.
AI relationships may be useful and even enjoyable. But only a fellow human can offer the depth of understanding that real ...
Across Australia, kids are forming relationships with artificial intelligence companion bots much more dangerous than traditional social media.
Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive ...
Proposals to install ChatGPT into a range of toys including Barbie dolls have sparked alarm from experts who branded it a ...
The Heritage Foundation — the group behind the infamous Project 2025, the conservative policy plan that outlined ____ — is ...
Just because AI is becoming mainstream doesn't mean it's safe, especially when it's used by children, for whom there are few guidelines to ...
Last year a 14-year-old US boy called Sewell Setzer III killed himself after exchanging multiple messages a day with a personalised chatbot from Character.ai over several months. Before he died, ...
In Sewell Setzer’s case, the chatbot ultimately seemed to encourage him to kill himself. Other reports have also surfaced of bots seeming to suggest or support suicide.
Character.AI allows users to interact with life-like AI “characters”, including fictional and celebrity personas that mimic human traits like stuttering.
In a lawsuit filed in Florida, Megan Garcia, the mother of Sewell Setzer III, claims Character.ai targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences".
In October, Megan Garcia sued Character.ai and Google, claiming they were responsible for the suicide of her son, Sewell Setzer III.