
Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.

The biggest concern raised when Apple said it would scan iPhones for child sexual abuse material (CSAM) was scope creep: governments insisting the company scan for other types of images as well. There now seems to be good evidence for this concern …

Background

Apple insisted that it had solid safeguards in place to protect privacy and prevent misuse: it would only match images against known CSAM databases; it would check at least two databases and require an image to appear in both; action would only be triggered by a threshold of 30 matching images; and there would be a manual review before law enforcement was alerted...
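As a rough illustration of how those safeguards were meant to combine, here is a minimal sketch of the decision logic. It is an assumption-laden toy, not Apple's implementation: the real system matched perceptual NeuralHash values on-device using private set intersection, whereas this sketch uses plain string sets, and all names (`flag_for_manual_review`, `MATCH_THRESHOLD`, the hash values) are hypothetical.

```python
# Toy sketch of the reported safeguard logic -- NOT Apple's code.
# Assumptions: image identifiers are plain strings; databases are
# simple sets. The real design used NeuralHash perceptual hashes
# and private set intersection rather than cleartext comparison.

MATCH_THRESHOLD = 30  # per the reported design, 30 matches trigger review


def flag_for_manual_review(image_hashes: set[str],
                           db_a: set[str],
                           db_b: set[str]) -> bool:
    """Return True only if enough of the user's images appear in BOTH databases.

    Requiring membership in two independently sourced CSAM databases is
    meant to stop any single jurisdiction from unilaterally inserting
    non-CSAM target images; the threshold reduces false positives from
    perceptual-hash collisions.
    """
    known_csam = db_a & db_b                 # image must be in both databases
    matches = image_hashes & known_csam      # user's images that match
    return len(matches) >= MATCH_THRESHOLD


if __name__ == "__main__":
    db_a = {f"hash{i}" for i in range(100)}
    db_b = {f"hash{i}" for i in range(50, 150)}   # overlaps db_a on hash50..hash99
    photos = {f"hash{i}" for i in range(50, 79)}  # 29 matches: below threshold
    print(flag_for_manual_review(photos, db_a, db_b))  # False
```

The point of the sketch is that the two-database intersection, not the scanning itself, is the safeguard against government misuse; the scope-creep worry is precisely that this check could be weakened or the databases coordinated.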

To read more, visit 9to5Mac.
