zrz
2021-09-06
Like for like heheheh
Apple walks back plans for new child safety tools after privacy backlash
{"i18n":{"language":"zh_CN"},"detailType":1,"isChannel":false,"data":{"magic":2,"id":817348424,"tweetId":"817348424","gmtCreate":1630913116268,"gmtModify":1632905171797,"author":{"id":3577482237164631,"idStr":"3577482237164631","authorId":3577482237164631,"authorIdStr":"3577482237164631","name":"zrz","avatar":"https://static.laohu8.com/default-avatar.jpg","vip":1,"userType":1,"introduction":"","boolIsFan":false,"boolIsHead":false,"crmLevel":2,"crmLevelSwitch":0,"individualDisplayBadges":[],"fanSize":7,"starInvestorFlag":false},"themes":[],"images":[],"coverImages":[],"extraTitle":"","html":"<html><head></head><body><p>Like for like heheheh</p></body></html>","htmlText":"<html><head></head><body><p>Like for like heheheh</p></body></html>","text":"Like for like heheheh","highlighted":1,"essential":1,"paper":1,"likeSize":3,"commentSize":1,"repostSize":0,"favoriteSize":0,"link":"https://laohu8.com/post/817348424","repostId":1158471190,"repostType":4,"repost":{"id":"1158471190","kind":"news","pubTimestamp":1630911387,"share":"https://www.laohu8.com/m/news/1158471190?lang=&edition=full","pubTime":"2021-09-06 14:56","market":"us","language":"en","title":"Apple walks back plans for new child safety tools after privacy backlash","url":"https://stock-news.laohu8.com/highlight/detail?id=1158471190","media":"cnn","summary":"New York (CNN Business) Apple made headlines — and not the good kind — last month when it announced ","content":"<p>New York (CNN Business) Apple made headlines — and not the good kind — last month when it announced a test of a new tool aimed at combating child exploitation. Critics quickly decried the feature's potential privacy implications, and now Apple is taking a long pit stop before moving forward with its plans.</p>\n<p>On Friday, the company said it will pause testing the tool in order to gather more feedback and make improvements.</p>\n<p>The plan centers on a new system that will, if it is eventually launched, check iOS devices and iCloud photos for child abuse imagery. It includes a new opt-in feature that would warn minors and their parents of sexually explicit incoming or sent image attachments in iMessage and blur them.</p>\n<p>Apple's announcement last month that it would begin testing the tool fit with a recent increased focus on protecting children among tech companies — but it was light on specific details and was swiftly met with outraged tweets, critical headlines and calls for more information.</p>\n<p>So on Friday, Apple (AAPL) said it would put the brakes on implementing the features.</p>\n<p>\"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,\" the company said. \"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.\"</p>\n<p>In a series of press calls aiming to explain the planned tool last month, Apple stressed that consumers' privacy would be protected because the tool would turn photos on iPhones and iPads into unreadable hashes, or complex numbers, stored on user devices. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple's iCloud storage service. 
(Apple later said other organizations would be involved in addition to NCMEC.)</p>\n<p>Only after a certain number of hashes matched the NCMEC's photos, Apple's review team would be alerted so that it could decrypt the information, disable the user's account and alert NCMEC, which could inform law enforcement about the existence of potentially abusive images.</p>\n<p>Many child safety and security experts praised the intent of the plan, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the efforts presented potential privacy concerns.</p>\n<p>\"When people hear that Apple is 'searching' for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and '1984,'\" Ryan O'Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. \"This is a very nuanced issue and one that on its face can seem quite scary or intrusive.\"</p>\n<p>Critics of the plan applauded Apple's decision to pause the test.</p>\n<p>Digital rights group Fight for the Future called the tool a threat to \"privacy, security, democracy, and freedom,\" and called on Apple to shelve it permanently.</p>\n<p>\"Apple's plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history,\" Fight for the Future Director Evan Greer said in a statement. \"Technologically, this is the equivalent of installing malware on millions of people's devices — malware that can be easily abused to do enormous harm.\"</p>","collect":0,"html":"<!DOCTYPE html>\n<html>\n<head>\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\" />\n<meta name=\"viewport\" content=\"width=device-width,initial-scale=1.0,minimum-scale=1.0,maximum-scale=1.0,user-scalable=no\"/>\n<meta name=\"format-detection\" content=\"telephone=no,email=no,address=no\" />\n<title>Apple walks back plans for new child safety tools after privacy backlash</title>\n<style type=\"text/css\">\na,abbr,acronym,address,applet,article,aside,audio,b,big,blockquote,body,canvas,caption,center,cite,code,dd,del,details,dfn,div,dl,dt,\nem,embed,fieldset,figcaption,figure,footer,form,h1,h2,h3,h4,h5,h6,header,hgroup,html,i,iframe,img,ins,kbd,label,legend,li,mark,menu,nav,\nobject,ol,output,p,pre,q,ruby,s,samp,section,small,span,strike,strong,sub,summary,sup,table,tbody,td,tfoot,th,thead,time,tr,tt,u,ul,var,video{ font:inherit;margin:0;padding:0;vertical-align:baseline;border:0 }\nbody{ font-size:16px; line-height:1.5; color:#999; background:transparent; }\n.wrapper{ overflow:hidden;word-break:break-all;padding:10px; }\nh1,h2{ font-weight:normal; line-height:1.35; margin-bottom:.6em; }\nh3,h4,h5,h6{ line-height:1.35; margin-bottom:1em; }\nh1{ font-size:24px; }\nh2{ font-size:20px; }\nh3{ font-size:18px; }\nh4{ font-size:16px; }\nh5{ font-size:14px; }\nh6{ font-size:12px; }\np,ul,ol,blockquote,dl,table{ margin:1.2em 0; }\nul,ol{ margin-left:2em; }\nul{ list-style:disc; }\nol{ list-style:decimal; }\nli,li p{ margin:10px 0;}\nimg{ max-width:100%;display:block;margin:0 auto 1em; }\nblockquote{ color:#B5B2B1; border-left:3px solid #aaa; padding:1em; }\nstrong,b{font-weight:bold;}\nem,i{font-style:italic;}\ntable{ width:100%;border-collapse:collapse;border-spacing:1px;margin:1em 0;font-size:.9em; }\nth,td{ padding:5px;text-align:left;border:1px solid #aaa; }\nth{ font-weight:bold;background:#5d5d5d; }\n.symbol-link{font-weight:bold;}\n/* header{ 
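(Illustrative aside: the matching flow described above, in which each photo is reduced to an opaque hash, compared against a database of known hashes, and escalated to human review only after a threshold number of matches, could be sketched roughly as below. This is not Apple's actual NeuralHash or threshold secret-sharing implementation; every type, name, and value here is invented for illustration.)

```swift
import Foundation

// Hypothetical sketch only, not Apple's real protocol. It models the article's
// description: hash each uploaded photo, compare against known hashes, and flag
// the account for human review only once a match threshold is reached.

struct PhotoHash: Hashable {
    let value: String            // stand-in for an opaque perceptual hash
}

struct MatchingPolicy {
    let knownHashes: Set<PhotoHash>   // e.g. hashes supplied by NCMEC and other organizations
    let reviewThreshold: Int          // matches required before human review is triggered

    // True only when enough uploaded photos match the database; a single match
    // is never enough, mirroring the threshold the article describes.
    func shouldEscalateForReview(_ uploadedHashes: [PhotoHash]) -> Bool {
        let matchCount = uploadedHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= reviewThreshold
    }
}

// Usage with made-up data: two of the three uploads match, and the threshold is two.
let policy = MatchingPolicy(
    knownHashes: [PhotoHash(value: "a1f3"), PhotoHash(value: "9c77")],
    reviewThreshold: 2
)
let uploads = [PhotoHash(value: "a1f3"), PhotoHash(value: "ffff"), PhotoHash(value: "9c77")]
print(policy.shouldEscalateForReview(uploads))   // prints "true"
```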
Many child safety and security experts praised the intent of the plan, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the efforts presented potential privacy concerns.

"When people hear that Apple is 'searching' for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and '1984,'" Ryan O'Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. "This is a very nuanced issue and one that on its face can seem quite scary or intrusive."

Critics of the plan applauded Apple's decision to pause the test.

Digital rights group Fight for the Future called the tool a threat to "privacy, security, democracy, and freedom," and called on Apple to shelve it permanently.

"Apple's plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history," Fight for the Future Director Evan Greer said in a statement. "Technologically, this is the equivalent of installing malware on millions of people's devices — malware that can be easily abused to do enormous harm."