Apple Faces Lawsuit for Neglecting Child Sexual Abuse Content on iCloud

A recent lawsuit accuses Apple of not doing enough to prevent the spread of child sexual abuse material (CSAM) on its iCloud and iMessage platforms. The complaint, filed in the U.S. District Court for the Northern District of California, claims that Apple was aware of the serious CSAM problem but chose not to address it.

The lawsuit was brought by a 9-year-old minor through her guardian. Between December 2023 and January 2024, the plaintiff received friend requests from two unknown Snapchat users, who then asked for her iCloud ID. They subsequently sent her, via iMessage, five CSAM videos depicting young children engaged in sexual acts and requested explicit videos from the minor.

The suit states that as a result of this interaction, the plaintiff has suffered severe mental and physical harm and is currently seeking psychotherapy and mental health care.

Apple has not yet responded to requests for comment on this matter.

The proposed class-action lawsuit accuses Apple of using privacy protection as an excuse while allowing CSAM to proliferate on iCloud. It points out that Apple abandoned its NeuralHash CSAM-scanning tool over concerns about unintended consequences for users' privacy.

In an email cited by the lawsuit, Apple warned that scanning initiatives carry risks such as bulk surveillance or screening for political or religious viewpoints, which could impact free speech.

According to the complaint, Apple has consistently underreported CSAM cases relative to other tech companies in its filings with agencies such as the National Center for Missing & Exploited Children (NCMEC). Last year alone, leading tech companies submitted over 35 million reports, while Apple submitted only 267.

The complaint calls on Apple to invest in comprehensive measures that both enhance user privacy and guarantee child safety. It also references a report by the nonprofit Heat Initiative, which identified numerous cases involving CSAM on Apple products and services.

Furthermore, messages produced in Epic Games' ongoing lawsuit against Apple suggest that one Apple employee claimed the company's strict focus on privacy made its platform ideal for distributing child pornography.

The legal action also questions whether user privacy is truly a priority for Apple, citing instances such as the transfer of Chinese users' iCloud data operations to a Chinese firm that is obligated under China's Cybersecurity Law to hand over personal data on request to authorities known for persecuting dissidents.

To combat the online proliferation of CSAM, Senator Dick Durbin introduced the STOP CSAM Act in May of last year. The bill aims to hold tech firms accountable by giving victims of child exploitation civil recourse against companies that host or make such material available, and by providing for restitution as well as penalties tied to removal requests.
