Thousands of victims have sued Apple over its alleged failure to detect and report child sexual abuse material (CSAM). The proposed class action seeks more than $1.2 billion in damages over the company's decision to abandon a previously announced plan to scan photos stored in iCloud for CSAM.

Announced in 2021, the plan would have used on-device technology to scan images stored in iCloud for child abuse material; Apple dropped it in 2022. The suit claims that, after Apple showed off its planned child safety tools, the company "failed to implement those designs," and it accuses the tech giant of hiding behind cybersecurity concerns.

Separately, a victim of childhood sexual abuse is suing Apple over the 2022 decision to drop the plan. That second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it is revictimizing the victims.