Apple Says It Will 'Refuse' Any Government Demands To Surveil Content Beyond Child Sexual Abuse

Benzinga2021-08-10

Apple has stressed that it will not allow any government to conduct surveillance using its new tool aimed at detecting child sexual abuse material (CSAM).

What Happened: Apple said its CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at the National Center for Missing and Exploited Children (NCMEC) and other child safety groups.

“Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” Apple said in a document posted to its website.

The tech giant added that the feature does not work on the private iPhone photo library on the device.

Why It Matters: Apple announced the launch of the new features last week, but it triggered a controversy over whether the system reduces Apple user privacy.

Security experts are worried that the technology could eventually be expanded to scan phones for other prohibited content. It could also be used by authoritarian governments to spy on dissidents and protesters.

Will Cathcart, the head of Facebook's WhatsApp instant messaging app, criticized Apple’s plan to launch the new features, calling it a “setback for people’s privacy all over the world.”

Price Action: Apple shares closed less than 0.1% lower in Monday’s trading at $146.09.
