
Apple stops developing CSAM detection system for iPhone users


Last year, Apple announced a system that would scan iCloud Photos uploads for Child Sexual Abuse Material (CSAM) by matching users' photos against a database of known CSAM image hashes. Although the matching would happen on-device and Apple itself would not see the photos, the plan drew heavy criticism from privacy and security researchers.

Now, after announcing Advanced Data Protection for iCloud, Apple software engineering chief Craig Federighi has confirmed that the company has stopped developing the CSAM detection system and will not roll it out to iPhone users.




Apple stops developing CSAM detection system for iPhone users originally appeared on BGR.com on Wed, 7 Dec 2022 at 13:56:36 EDT.



