23 August 2021
Apple has announced new child safety features to fight child sexual abuse material (CSAM). It will start scanning messages sent from iPhone, iPad, and Mac for sexually explicit content that could be harmful to children, and it will scan images for known CSAM before they are stored in iCloud. Apple says the scanning is powered by "on-device machine learning" and that it will not be able to read any other content.
Starting with Messages, Apple says it will warn child users before they view or send sexually explicit photos. Detected images received in Messages will be blurred, and the app will offer resources to help children make sense of what is going on. If a child still chooses to view the photo, they can do so, but their parents will be notified. Similarly, child users will be alerted before sending a sensitive photo, and if they proceed anyway, their parents will receive an alert. These features are expected to arrive on iPhone, iPad, and Mac by the end of this year.
Apple will also keep an eye on iCloud Photos, but before those photos are uploaded. It says, "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
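The core idea of on-device matching can be sketched in a few lines. Note this is a heavily simplified illustration under stated assumptions: Apple's actual system uses a perceptual "NeuralHash" and a blinded, unreadable database; here an ordinary SHA-256 digest and a plain in-memory set stand in for both, and the image bytes and hash values are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real perceptual hash survives
    resizing and re-encoding; SHA-256 is used here only for illustration."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes, as it might be shipped
# to the device by child safety organizations.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def should_flag(image_bytes: bytes) -> bool:
    """Before upload, match the image's hash against the on-device set."""
    return image_hash(image_bytes) in known_hashes
```

The point of this design is that only a match or non-match result leaves the device; the photo itself is never inspected in the cloud, and the database on the device is stored in a form users cannot read back.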
Apple also touts the accuracy of its image detection technology, claiming there is "less than a one in one trillion chance per year of incorrectly flagging a given account". If an account is flagged by mistake, its owner will be able to file an appeal and have it reinstated.
Finally, Apple is expanding guidance in Siri and Search. Both will now point parents and child users to relevant resources when they search for how to report CSAM. If someone searches for child sexual abuse material, Siri will intervene to warn them about the harm of that content and the situation they could find themselves in, and will suggest partner resources they can consult to get help.