# News

Apple Gets Sued For Not Adding CSAM Detection To iCloud

Date: December 09, 2024

Apple recently decided not to move forward with a system that automatically scans iCloud for child sexual abuse material (CSAM) and is now facing a lawsuit over that decision.

Efforts to curb the spread of CSAM online are gaining momentum worldwide. While tech giants like Google and Meta have deployed robust systems that automatically detect such material and act on it, Apple has lagged behind. The company is now being sued for abandoning its 2021 plan to use digital signatures from the National Center for Missing and Exploited Children and other agencies to detect CSAM on iCloud.

The lawsuit does not claim that Apple lacks any safeguards; it argues that by failing to do more to stop the spread of CSAM, the company is forcing victims to relive their trauma. According to the complaint, Apple announced a 'widely touted improved design aimed at protecting children' and then failed to 'implement those designs or take any measures to detect and limit' such material.

Apple appears to have walked back its three-year-old promise over concerns that scanning iCloud photo libraries could open a backdoor for government surveillance. Security and privacy advocates raised those concerns, warning that such a backdoor could put nearly every iCloud user in the world at risk.

The lawsuit was filed by a 27-year-old woman under a pseudonym so that her name does not become public and fuel the further spread of the material. She was molested as an infant by her uncle, who also shared images of the abuse online. She says the trauma is reopened almost daily, as notifications from law enforcement keep arriving to inform her that someone has been charged with possessing that content.

James Marsh, the attorney leading the lawsuit, said that a win could entitle a potential group of 2,680 victims to compensation. Responding to a news outlet's request for comment, Apple said it is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."

Lawsuits of this kind are not new to Apple. Back in August, the guardian of a 9-year-old girl sued the company, accusing it of failing to address the spread of CSAM on iCloud.

By Arpit Dubey
