$610,500 Penalty Imposed for Failure to Address Child Sexual Abuse Material

Australia's eSafety Commissioner has taken legal action against social media giant X, formerly known as Twitter, for failing to comply with government requirements regarding child sexual exploitation material. In February, the Commissioner issued a transparency notice to several social media companies, requiring information on how they were addressing child sexual exploitation and abuse material and activity on their platforms. X did not comply with the notice, failing to prepare a report in the required manner and form. The company also did not respond truthfully and accurately to some questions in the notice, including how long it takes to respond to reports of child sexual exploitation and what measures it has in place to detect child sexual exploitation in live streams.

The eSafety Commissioner also found that X inadequately disclosed how many safety and public policy staff remained after tech billionaire Elon Musk acquired the platform in October 2022 and carried out several rounds of job cuts. While other social media platforms also fell short in their responses to the transparency notice, eSafety judged X's non-compliance to be the most serious. It therefore issued X an infringement notice with a fine of $610,500, giving the company 28 days to request withdrawal of the notice or pay the penalty. X neither paid the fine nor requested withdrawal, opting instead to seek judicial review of eSafety's transparency and infringement notices.

In light of these developments, eSafety is seeking to have the judicial review heard while simultaneously commencing civil penalty proceedings against X. The eSafety Commissioner, Julie Inman Grant, stressed that other social media platforms also performed poorly in tackling child sexual exploitation material in Australia.

Among those platforms, Discord took no measures to detect child sexual exploitation in live streams, citing “prohibitively expensive” costs. The company also used no language analysis technology to detect child sexual abuse activity, such as sexual extortion, across its services. Google used such technology only on YouTube, not on Chat, Gmail, Meet, or Messages. Furthermore, neither Google nor Discord blocked links to known child sexual exploitation material or used technology to detect grooming across some or all of their services.

Overall, the eSafety Commissioner has taken a firm stance against social media companies that fail to comply with government requirements regarding child sexual exploitation material. The legal action against X sends a clear message to all social media platforms that meaningful, proactive action on this issue is expected. The Commissioner remains committed to protecting children from predatory adults through stringent regulatory measures and penalties for non-compliance.
