Apple has been sued by the state of West Virginia over what the state alleges is a failure to prevent the spread of child sexual abuse material (CSAM).
WASHINGTON, Dec. 18, 2025 /PRNewswire/ -- DNSFilter, a global leader in protective DNS and content filtering, today reported a record level of blocked child sexual abuse material (CSAM) across ...
The lawsuit accuses Apple of prioritizing its privacy branding and business interests over child safety.
After a massive outcry from privacy advocates, child safety groups, and governments, Apple dropped its plan to scan iCloud photos against the CSAM database. Instead, it now relies on its ...
"We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," an ...
Thorn, a nonprofit that provides detection and moderation software related to child safety, said it canceled its contract with X after the platform stopped paying it. When Elon Musk took over Twitter ...
Brand safety isn’t always cut and dried. An alcohol brand, for instance, might accept content that other brands would instinctively steer clear of. But some media leaves no room for nuance. On ...
Content warning: This article contains information about alleged child sexual abuse material. Reader discretion is advised. Report CSAM to law enforcement by contacting the ICAC Tip Line at (801) ...
AI-generated child sexual abuse material (CSAM) has been flooding the internet, according to a report by The New York Times. Researchers at organizations like the Internet Watch Foundation and the ...
Can a communications provider be held liable when it reports to the National Center for Missing & Exploited Children (NCMEC) an image the provider believes to be child sexual abuse material based on ...
STE. GENEVIEVE COUNTY, Mo. — A Farmington kindergarten teacher is facing 10 charges in connection with child sexual abuse material shared on a social messaging app. On Wednesday, 36-year-old Erika ...
The National Center for Missing & Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that content was ...