Apple slammed for not doing enough to prevent CSAM distribution

iMessage can warn minors about nudity in photos

Apple and Microsoft have provided details of their methods for detecting or preventing the distribution of child sexual abuse material (CSAM), and an Australian regulator has found their efforts lacking.

The Australian e-Safety Commissioner demanded that major tech firms, including Apple, Facebook, Snapchat, and Microsoft, detail their methods for preventing child abuse and exploitation on their platforms. The demand was made on August 30, and the companies had 29 days to comply or face fines.

Apple and Microsoft are the first companies to face scrutiny under this review, and according to Reuters, the regulator found their efforts insufficient. Neither company proactively scans user files on iCloud or OneDrive for CSAM, nor do they use algorithmic detection in FaceTime or Skype.

Instead of scanning for known content in storage or in transit, Apple has decided to take a different approach that will evolve over time. Currently, parents can configure devices used by children to alert the child if nudity is detected in photos sent or received over iMessage.

Apple plans to expand this capability to detect nudity in videos, then move the detection and warning system into FaceTime and other Apple apps. Eventually, the company hopes to offer an API that would let developers use the detection system in their apps as well.
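No such developer API has shipped, so its shape is unknown. Purely as a hypothetical sketch of how a third-party messaging app might consume an on-device detection framework of this kind, the Swift below is illustrative only; the NudityDetecting protocol, DetectionResult type, and confidence threshold are all invented names, not Apple API.

```swift
import Foundation

// Hypothetical sketch: these types are placeholders, not a published Apple API.
struct DetectionResult {
    let containsNudity: Bool
    let confidence: Double
}

protocol NudityDetecting {
    // Analysis would presumably run entirely on device,
    // so the image never leaves the phone.
    func analyze(imageAt url: URL) async throws -> DetectionResult
}

// Example call site in a messaging app: decide whether to show a
// warning before an outgoing photo is sent. The 0.8 threshold is
// an arbitrary illustrative value.
func shouldWarnBeforeSending(_ imageURL: URL,
                             using detector: NudityDetecting) async -> Bool {
    guard let result = try? await detector.analyze(imageAt: imageURL) else {
        return false // fail open: no warning if analysis is unavailable
    }
    return result.containsNudity && result.confidence > 0.8
}
```

The key design point such an API would need to preserve is the one Apple emphasizes for the existing iMessage feature: detection happens on the device itself, rather than by scanning content server-side.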

Apple's abandonment of CSAM detection in iCloud has been celebrated by privacy advocates. At the same time, it has been condemned by child safety groups, law enforcement, and government officials.

Author: Subham
