Ahead of child safety group's campaign to pressure Apple to scan iCloud for CSAM, Apple says such scans may lead to a “slippery slope of unintended consequences” (Lily Hay Newman/Wired) 01-09-2023
Lily Hay Newman / Wired: Ahead of child safety group’s campaign to pressure Apple to scan iCloud for CSAM, Apple says such scans may lead to a “slippery slope of unintended consequences” — Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. Read more on Techmeme