Spark Global Limited reports:
Are you used to putting photos in iCloud? Later this year, Apple will start scanning files uploaded to iCloud to check for child sexual abuse material (CSAM) and other violations of its terms of use. The new technology will be integrated into iOS 15 and iPadOS 15, both due later this year, and will draw on the database maintained by the National Center for Missing and Exploited Children (NCMEC). Let’s take a look!
To address privacy concerns, Apple stressed that the technology would not inspect the content of users’ photos directly on iCloud; instead, it would compare uploads against NCMEC’s database to determine whether they match known illegal material. To prevent parents from being wrongly flagged by the system for storing large numbers of photos of their own children, Apple has also put a human review step in place that is triggered only once an account accumulates repeated matches. Apple says the system ensures that the odds of incorrectly flagging a given account in any year are less than one in a trillion.
According to Apple, files are scanned at the moment users upload them to iCloud. Every file that passes the check automatically receives a safety certificate, which prevents Apple from decoding it and so protects the user’s personal privacy.
In other words, Apple will only be able to access content that was flagged against the NCMEC database. Apple said that if the internal review confirms a match, it will suspend the user’s iCloud account and refer the content to NCMEC or law enforcement for further investigation.
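The flow described above — compare each upload against a known-hash database, count matches, and escalate to human review only past a threshold — can be sketched in simplified form. This is purely illustrative: Apple’s real system uses its NeuralHash perceptual hashing and cryptographic techniques rather than the plain hash set, function names, and threshold value assumed here.

```python
# Illustrative sketch of the match-and-threshold flow described in the
# article. The hash values, the set membership test, and the threshold
# are all hypothetical stand-ins; Apple's actual system uses NeuralHash
# and private set intersection, not a plain lookup like this.

KNOWN_HASHES = {"hash_a", "hash_b"}  # stand-in for the NCMEC database
REVIEW_THRESHOLD = 3  # hypothetical number of matches before human review


def count_matches(upload_hashes):
    """Count how many uploaded photo hashes match the known database."""
    return sum(1 for h in upload_hashes if h in KNOWN_HASHES)


def needs_human_review(match_count):
    """An account is escalated only once matches exceed the threshold."""
    return match_count >= REVIEW_THRESHOLD
```

Under this sketch, a single accidental match never reaches a reviewer; only an account whose match count crosses the threshold is escalated, which mirrors the article’s point about avoiding false alarms for ordinary family photos.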
In addition, Apple is working with child safety experts to give parents more effective and comprehensive tools for keeping children safe online, including automatic warnings when children receive or send inappropriate content, and system alerts when children try to search for inappropriate content through Siri or the search features.
Reprints must indicate the source: Spark Global Limited