
Apple delays plan to check iPhones for child abuse images

The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.

Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."


Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
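The matching step described above can be sketched in miniature. This is an illustrative sketch only: Apple's actual system used a proprietary perceptual hash ("NeuralHash") that matches visually similar images, whereas the cryptographic SHA-256 hash used here for simplicity only matches byte-identical files. All names and data below are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 of the raw bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(photos: dict, known_fingerprints: set) -> list:
    """Flag photos whose fingerprint matches the known-image database.

    In the announced design, a match would route the content to a human
    reviewer rather than triggering an automatic report.
    """
    flagged = []
    for name, data in photos.items():
        if fingerprint(data) in known_fingerprints:
            flagged.append(name)
    return flagged

# Hypothetical example data
known = {fingerprint(b"known-bad-image")}
library = {"a.jpg": b"known-bad-image", "b.jpg": b"vacation-photo"}
print(scan_before_upload(library, known))  # -> ['a.jpg']
```

The key design point critics seized on is that this check runs on the device itself, before any upload, rather than server-side on already-uploaded photos.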


The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the capability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.

"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.

Experts who had criticized the move were generally pleased with the decision to do more research.

Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.

While other companies, like Google, scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this product, with privacy concerns justifiably overshadowing the intended purpose. Better luck next time, folks.

Topics: Apple, Cybersecurity, iPhone, Privacy
