News

An expansion of Thorn's CSAM detection tool, Safer, the AI feature uses "advanced machine learning (ML) classification models" to "detect new or previously unreported CSAM," generating a "risk ...
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. Despite its imperfections, and Apple's silence about it since, the technology appears inevitable.
Late last year, California passed a law against the possession or distribution of child sexual abuse material (CSAM) that has been generated by AI. The law went into effect on January 1, and ...
Lawmakers questioned Big Tech CEOs about how they are preventing online child sexual exploitation. AI is complicating the problem.
Twitter has failed to remove images of child sexual abuse in recent months, even though the images were flagged as such, a new report will allege this week.
GROVETOWN, Ga. (WJBF) – A Grovetown man has been taken into custody after being indicted on crimes that involved the exploitation of minors. According to authorities, Grovetown Police Department ...
In August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM on iCloud.