Apple has stressed that it will not allow any government to conduct surveillance using its new tool aimed at detecting child sexual abuse material (CSAM).
What Happened: Apple said its CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at the National Center for Missing and Exploited Children (NCMEC) and other child safety groups.
“Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” Apple said in a document posted to its website.
The tech giant added that the feature does not scan the private photo library stored on the iPhone itself; it applies only to photos uploaded to iCloud Photos.
Why It Matters: Apple announced the launch of the new features last week, but the move triggered a controversy over whether the system reduces the privacy of Apple users.
Security experts are worried that the technology could eventually be expanded to scan phones for other prohibited content, or be used by authoritarian governments to spy on dissidents and protesters.
Will Cathcart, the head of Facebook's WhatsApp instant messaging app, criticized Apple’s plan to launch the new features, calling it a “setback for people’s privacy all over the world.”
Price Action: Apple shares closed less than 0.1% lower in Monday’s trading at $146.09.