Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials. In addition to facing more than $1.2B in penalties, the company could ...
Apple doesn’t make many mistakes when it comes to the security and privacy of its billion-plus iPhone owners. But when it does, the mistakes hit extra hard. So it is with the furor building this week as ...
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).
Apple last year deployed a mechanism for identifying landmarks and places of interest in images stored in the Photos ...
A class-action lawsuit filed in a Northern California district court alleges Apple's iCloud service has been used to spread child sexual-abuse materials, or CSAM. It also alleges that Apple's ...
Thousands of CSAM (child sexual abuse material) victims are now taking the fight to Apple after the company ultimately decided to skip adding tools that would help detect such material on their ...
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent ...
A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material. Apple originally introduced ...
Apple's artificial intelligence-led photo analyzer is raising privacy concerns months after the company appears to have ...
which "enables Apple to accurately identify and report iCloud users who store [CSAM] in their iCloud Photos accounts" and flag those accounts for the NCMEC's review, in August 2021. The company ...
A second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it's revictimizing the subjects of those materials.