Apple Child Safety photo scanning: what you need to know

Apple announced that it would be enacting a new protocol: automatically scanning iPhones and iPads to check user photos for child sexual abuse material (CSAM). The company is doing this to limit the spread of CSAM, but it is also adding other features ‘to protect children from predators who use communication tools to recruit and exploit them,’ Apple explained in a blog post. For now, the features will only be available in the US.

Apple will institute a new feature in iOS 15 and iPadOS 15 (both expected to launch in the next couple of months) that will automatically scan images on a user’s device to see if they match previously identified CSAM content. Known CSAM is identified by unique hashes: sets of numbers that stay consistent across duplicate copies of an image, like a digital fingerprint.
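To make the hash-matching idea concrete, here is a minimal Swift sketch. It is an illustration only, not Apple's implementation: Apple's system is understood to use a perceptual hash that tolerates resizing and re-encoding, while this sketch substitutes a plain SHA-256 digest and a hypothetical, hard-coded set of known hashes to show how a fingerprint lookup against a known list works in principle.

    import Foundation
    import CryptoKit

    // Simplified illustration of hash-based matching. A plain SHA-256
    // digest stands in for Apple's perceptual hash, and the set below
    // is a hypothetical placeholder, not real data.

    /// Compute a hex-encoded fingerprint for raw image bytes.
    func fingerprint(of imageData: Data) -> String {
        let digest = SHA256.hash(data: imageData)
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    /// Placeholder set of previously identified hashes.
    let knownHashes: Set<String> = [
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
    ]

    /// Returns true if the image's fingerprint appears in the known set.
    func matchesKnownContent(_ imageData: Data) -> Bool {
        knownHashes.contains(fingerprint(of: imageData))
    }

The limitation of this toy version is exactly what motivates a perceptual hash in the real system: a cryptographic digest changes completely if an image is re-encoded or resized, whereas a perceptual hash is designed to stay stable across such near-duplicates.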


