Ahead of a child safety group's campaign, Apple details why it's dropping iCloud CSAM scanning, saying it could be a “slippery slope of unintended consequences” (Lily Hay Newman/Wired) 01-09-2023

Lily Hay Newman / Wired:
Ahead of a child safety group’s campaign, Apple details why it’s dropping iCloud CSAM scanning, saying it could be a “slippery slope of unintended consequences”  —  Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting.


Read more on Techmeme