Apple will soon roll out a mandatory software update to U.S. iPhones that includes a program to scan for images of child sexual abuse. The detection system, known as “neuralMatch,” will use a cryptographic algorithm to continuously scan photos stored on a physical device or uploaded to a user’s iCloud account. The program reportedly will flag and report images that match a database maintained by the National Center for Missing and Exploited Children (NCMEC). All “matches” will be reviewed by a human, and if child pornography is confirmed, the user’s account will be disabled and law enforcement will be notified. Apple has indicated that the detection process requires multiple layers of automated review before a final, manual determination is made.
While the announcement has concerned privacy advocates, it is worth noting that most cloud-based services, such as Dropbox, Google, and Microsoft, already use similar software. However, this development is significant for Apple, a company that has historically prided itself on user encryption and privacy, even previously refusing to unlock an iPhone belonging to an individual behind the San Bernardino terror attack. In response to those concerns, Apple has indicated that parents taking innocent photos of their children, in a bathtub for example, need not worry. This is because neuralMatch does not “see” the images but relies on associated hash values provided by NCMEC. Apple claims a “one in one trillion chance” of a false positive and says an appeals process will be established for alleged mistakes.
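For readers curious about the mechanics, the sketch below illustrates the general idea of hash-based matching described above: the scanner compares a fingerprint (hash) of each photo against a list of known hashes supplied by NCMEC, and only an account that exceeds a match threshold is queued for human review. This is a simplified, hypothetical illustration, not Apple’s actual implementation; the real system reportedly relies on perceptual hashing and cryptographic matching protocols, whereas the SHA-256 digest and the threshold value used here are stand-ins chosen for clarity.

```python
# Simplified, hypothetical sketch of hash-based image matching.
# NOT Apple's implementation: the real system reportedly uses perceptual
# hashing and cryptographic matching; SHA-256 here is only a stand-in.
import hashlib
from pathlib import Path

# Hashes of known abuse images, as would be supplied by NCMEC.
# (Placeholder value for illustration only.)
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Assumed number of matches required before escalation to human review.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Fingerprint a photo. The scanner only ever sees this hash value,
    never the image content itself."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_escalate(photos: list[Path]) -> bool:
    """Return True if the account should be flagged for manual review."""
    matches = sum(1 for photo in photos if image_hash(photo) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

Because only hash values are compared, an ordinary family photo that does not appear in the NCMEC database produces no match at all, which is the basis for Apple’s reassurance to parents.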
* * *