
> Apple is already saying "We'll take this database of illegal image hashes provided by the government and use it to scan your phone."

This is incorrect.

Apple has been saying, since 2019 [0], that they can scan for any potentially illegal content: not just images, not just CSAM, and not necessarily even illegal content.

That's what should be opposed. CSAM is a red herring.

[0] https://www.macobserver.com/analysis/apple-scans-uploaded-co...



The difference is that that's scanning in the cloud, on Apple's servers, of content people choose to upload. It's not scanning on the user's private device, and currently Apple has no way to scan what's on a private device.
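To make the distinction concrete, this is roughly what "scanning uploads against a hash database" amounts to: a minimal sketch, assuming a hypothetical set of known-bad digests. Real systems (PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive resizing and re-encoding; plain SHA-256 is used here only to illustrate the matching step, not the actual algorithms.

```python
import hashlib

# Hypothetical database of known-bad digests (hex strings).
# In deployed systems these would be perceptual hashes supplied
# by a clearinghouse, not cryptographic hashes of exact bytes.
KNOWN_HASHES = {
    # SHA-256 of an empty file, as a stand-in entry
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_digest(data: bytes) -> str:
    """Cryptographic digest of the file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """True if the upload's digest appears in the database."""
    return file_digest(data) in KNOWN_HASHES
```

The policy question in the thread is not this matching step but where it runs (server vs. device) and who controls the contents of the database.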


The phrase in the ToS is "pre-screening" of uploaded content, which is what is happening. I'm pretty sure that change was made to enable this CSAM feature.



