Apple sued for not implementing CSAM detection for iCloud
Apple is being sued for $1.2 billion by a 27-year-old woman over its decision not to implement a system that could scan iCloud photos for child sexual abuse material (CSAM).
Back in 2021, Apple developed a program called NeuralHash.
The moment someone downloads photos of child abuse and syncs them to their iCloud account, the software would compute the image's hash, a kind of digital fingerprint, and compare it against hashes of known abuse material. Apple does not generate those reference hashes itself, but rather obtains them from organizations such as the National Center for Missing and Exploited Children (NCMEC).
The scanning software was also designed to recognize variations of the known material, such as black-and-white or mirrored versions of a photo.
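NeuralHash itself is proprietary, so the snippet below is not Apple's implementation. It is only a minimal sketch of the general idea of perceptual hashing, using a toy "average hash"; the function names, placeholder hash values and distance threshold are all illustrative assumptions. It shows why a fingerprint of this kind can still match after an image is converted to black and white or lightly recompressed, unlike a cryptographic hash, which changes completely if a single pixel changes.

```python
# Illustrative sketch only: NOT Apple's NeuralHash. A toy "average hash"
# that survives small edits (grayscale conversion, mild recompression)
# better than a cryptographic hash would.
from PIL import Image  # assumes the Pillow library is installed


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image, grayscale it, and set one bit per pixel
    depending on whether it is brighter than the average."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")


# Hypothetical database of fingerprints supplied by an outside party
# (in Apple's design, hashes of known CSAM provided by NCMEC).
KNOWN_HASHES = {0x8F3C_55AA_0F0F_33CC}  # placeholder value


def matches_known_material(path: str, max_distance: int = 5) -> bool:
    """True if the image's fingerprint is close to any known fingerprint."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```

Recognizing mirrored copies, as Apple's system reportedly could, takes a more robust fingerprint than this toy example; that is where the neural network behind NeuralHash came in.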
To prevent the software from issuing false positives, Apple built two safeguards into the process. First, a manual check by an employee would take place whenever there was a match. In addition, alarm bells would only go off at a minimum of thirty hits. The latter mechanism was called Threshold Secret Sharing.
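Apple's published design reportedly enforced that threshold cryptographically, so the server could not even inspect individual matches until an account crossed the thirty-hit mark. The sketch below does not reproduce that cryptography; it is only a minimal illustration, with assumed names such as MatchTracker, of the escalation logic described above: count matches per account and hand nothing to a human reviewer until the threshold is reached.

```python
# Minimal sketch of the escalation logic: count matches per account and
# surface an account for human review only once the threshold of thirty
# is reached. It does NOT reproduce the threshold secret sharing scheme
# Apple used to enforce this cryptographically.
from collections import defaultdict
from typing import Dict, List

MATCH_THRESHOLD = 30  # the "minimum of thirty hits" mentioned above


class MatchTracker:
    def __init__(self) -> None:
        # account id -> list of matched image identifiers
        self._matches: Dict[str, List[str]] = defaultdict(list)

    def record_match(self, account_id: str, image_id: str) -> None:
        self._matches[account_id].append(image_id)

    def accounts_for_human_review(self) -> Dict[str, List[str]]:
        """Accounts below the threshold stay invisible; only those at or
        above it are queued for manual review by an employee."""
        return {
            account: images
            for account, images in self._matches.items()
            if len(images) >= MATCH_THRESHOLD
        }
```

A plain counter like this would still let the provider see every individual match, which is exactly what the secret sharing construction was meant to avoid; the code is only intended to make the thirty-hit rule concrete.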
As soon as Apple introduced NeuralHash to the public, the company faced fierce criticism. Digital rights groups like the Electronic Frontier Foundation (EFF) feared that the software was a first step toward building a surveillance system. Apple employees were concerned that the system could be misused by authoritarian regimes to monitor dissidents and activists, or to impose government censorship.
That’s why Apple decided to withdraw the plan to implement NeuralHash.
A 27-year-old victim of CSAM has now filed a lawsuit against Apple over that decision. According to the lawsuit, Apple announced “a widely touted improved design aimed at protecting children”, but then failed to “implement those designs or take any measures to detect and limit” images of child sexual abuse.
According to The New York Times, which first reported on the case, the lawsuit argues that by not doing more to prevent the spread of CSAM, Apple forces victims of child abuse to relive their trauma and causes them immense mental pain. The plaintiff says a relative molested her when she was an infant and that those pictures were shared online. All these years later, she still receives law enforcement notices almost daily that someone has been charged with possessing those images.
Attorney James Marsh, who represents the plaintiff, says there is a potential group of 2,680 victims who could be entitled to compensation. That is why he is suing for $1.2 billion.
A company spokesperson told The New York Times that Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users”.