
Apple to scan devices to detect child sexual abuse materials

06 August 2021

Apple has announced new child safety features to fight child sexual abuse material (CSAM). It will start scanning messages sent using iPhone, iPad, and Mac to look for sensitive content that could be harmful to children, and it will also scan images before they are uploaded to iCloud with the same goal. Apple says the scanning is powered by "on-device machine learning" and that it will not be able to read any other content.

Starting with Messages, Apple says it will warn child users before they view or send sexually explicit photos. Received images detected as explicit will be blurred, and the app will offer resources to help children make sense of what is going on. If a child still chooses to view the photo, they can, but their parents will be notified. Similarly, child users will be warned before sending a sensitive photo, and if they proceed anyway, parents will receive an alert. These features are expected to arrive on iPhone, iPad, and Mac by the end of this year.
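
Apple has not published implementation details, but the described flow for received images amounts to a simple decision path. The Swift below is purely illustrative; the classifier, the photo type, and the notification behavior are assumptions, not Apple's actual API.

```swift
import Foundation

// Illustrative sketch of the Messages safety flow for a child account.
// `looksSexuallyExplicit` stands in for Apple's on-device classifier;
// everything here is hypothetical.
struct IncomingPhoto { let bytes: Data }

func looksSexuallyExplicit(_ photo: IncomingPhoto) -> Bool {
    false // placeholder for the on-device machine-learning model
}

func handleReceived(_ photo: IncomingPhoto, childChoosesToView: Bool) {
    guard looksSexuallyExplicit(photo) else { return }
    print("Photo blurred; warning and help resources shown")
    if childChoosesToView {
        print("Photo revealed; parents notified")
    }
}
```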


Apple will also keep an eye on iCloud Photos, but before the photos are uploaded. It says, "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
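
In other words, each photo is fingerprinted on the device and the fingerprint is compared against the stored database before upload. Here is a minimal sketch of that matching step, assuming a `perceptualHash` stand-in for Apple's real hashing function and a plain (rather than blinded) hash set:

```swift
import Foundation

// Hypothetical on-device matching sketch. A real perceptual hash is
// derived from image content so that resized or lightly edited copies
// still collide; hashing raw bytes here is only a stand-in.
func perceptualHash(of imageBytes: Data) -> String {
    String(imageBytes.hashValue, radix: 16)
}

// Stand-in for the "unreadable set of hashes" stored on the device
// (Apple transforms the real database so users cannot inspect it).
let knownCSAMHashes: Set<String> = ["0badf00d", "deadbeef"] // illustrative

// Count how many pending uploads match the database; an account would
// only be escalated for human review once a match threshold is crossed.
func matchCount(in pendingUploads: [Data]) -> Int {
    pendingUploads.filter { knownCSAMHashes.contains(perceptualHash(of: $0)) }.count
}
```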

Apple is also touting the accuracy of its image detection technology, claiming "less than a one in one trillion chance per year of incorrectly flagging a given account". It adds that if an account is flagged by mistake, the owner will be able to appeal and have it reinstated.
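
Apple has not published the math behind that figure, but threshold-based systems typically reason about it as a binomial tail: if each photo has an independent false-match probability p, the chance that an account with n photos accumulates t or more accidental matches is P(X ≥ t). A back-of-the-envelope sketch with entirely made-up numbers:

```swift
import Foundation

// Back-of-the-envelope false-flag probability for a threshold system.
// The values of n, p, and t are invented for illustration, not Apple's.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

// Upper binomial tail P(X >= t) for X ~ Binomial(n, p).
func falseFlagProbability(photos n: Int, perImageRate p: Double, threshold t: Int) -> Double {
    (t...n).reduce(0.0) { sum, k in
        sum + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
    }
}

// e.g. 20,000 photos, one-in-a-million per-image false match, threshold of 10
print(falseFlagProbability(photos: 20_000, perImageRate: 1e-6, threshold: 10))
```

Even with these invented inputs, requiring multiple independent matches before flagging drives the per-account error rate many orders of magnitude below the per-image one, which is the intuition behind a "one in one trillion" style claim.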

Apple is also improving guidance in Siri and Search. Both will now point parents and child users to relevant resources when they search for how to report CSAM. And if someone searches for child pornographic material, Siri will intervene to warn them about the dangerous situation they could find themselves in and suggest partner resources they can consult to get help.


