Well, what Apple built with CSAM checking is a tool that checks images uploaded to their servers, and only those images, against a digital fingerprint.
There is no novel technology in this. It’s very basic stuff doing a specific thing. It’s in no way a meaningful stepping stone to doing anything else.
If you’re against Apple scanning images uploaded to iCloud for CSAM that’s fine, say that, but it’s no stepping stone. If you’re not against Apple scanning CSAM uploaded to their servers, as they are actually required to do by law, then it’s a non-issue.
> Well, what Apple built with CSAM checking is a tool that checks images uploaded to their servers, and only those images, against a digital fingerprint.
No, that's not how any of it works. Apple scans and matches your photos, on your device, against a list of hashes that represent government-provided images that Apple has no knowledge of. The stipulation that images are only scanned when uploaded to Apple servers is a courtesy, not a technical limitation. The exact courtesy that would be rescinded by pressure from the Chinese government.
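For the mechanics, here's a heavily simplified sketch of the on-device matching step. Everything in it is hypothetical: a real perceptual hash (NeuralHash) is nothing like a cryptographic hash, the shipped database entries are blinded so the device can't inspect them, and in Apple's published protocol even the yes/no match result is hidden from the device until a server-side threshold is crossed.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the perceptual hasher. A cryptographic hash
// is NOT a perceptual hash (it won't match near-duplicate images); it
// just keeps this sketch runnable.
func neuralHash(of imageData: Data) -> Data {
    Data(SHA256.hash(data: imageData))
}

// Opaque, provider-supplied hash entries. The device can match against
// them but has no way to recover the images they were derived from.
func loadProvidedHashDatabase() -> Set<Data> {
    Set()  // placeholder; shipped as an opaque blob in the real system
}

let knownHashes = loadProvidedHashDatabase()

// Naive version of the match. Apple's actual protocol blinds both sides
// (a private set intersection), so even this boolean isn't visible
// on-device.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(neuralHash(of: imageData))
}
```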
> If you’re not against Apple scanning CSAM uploaded to their servers, as they are actually required to do by law, then it’s a non-issue.
The law explicitly stipulates that Apple is _not_ required to scan photos for CSAM. The text of the law [1] explicitly protects people's privacy in the section titled "Protection of Privacy".
>The stipulation that images are only scanned when uploaded to Apple servers is a courtesy, not a technical limitation. The exact courtesy that would be rescinded by pressure from the Chinese government.
The scan is performed in the code that does the uploading, so it is a hard limitation of the implementation. They can't push a button to make it scan other photos; they would have to make significant code changes. Not difficult changes, since none of this is hard or a major technology achievement, but the current implementation does one specific thing and that thing only.
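To make the shape of that limitation concrete, a minimal sketch in Swift, with every name hypothetical: the check lives inside the upload path, so nothing outside that path ever reaches it.

```swift
import Foundation

// All names are hypothetical; this shows where the check sits, not
// Apple's actual code.
struct PhotoAsset { let data: Data }
struct SafetyVoucher { let payload: Data }

// Placeholder: in the real design this carries the blinded match result.
func generateSafetyVoucher(for asset: PhotoAsset) -> SafetyVoucher {
    SafetyVoucher(payload: Data())
}

// Network call elided.
func send(_ asset: PhotoAsset, with voucher: SafetyVoucher) {}

// The scan is invoked from inside the upload routine and nowhere else.
// Photos that never pass through this function are never scanned;
// changing that means adding a new call site, i.e. shipping new code,
// not flipping a remote switch.
func uploadToICloudPhotos(_ asset: PhotoAsset) {
    let voucher = generateSafetyVoucher(for: asset)
    send(asset, with: voucher)
}
```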
On the legal requirement, huh, it looks like you're quite right. They are choosing to scan for CSAM uploaded to their service. Good for them.
There's nothing to stop the Chinese government from requiring Apple to make whatever code changes it wants anyway.
It only ran when upload to iCloud was enabled, which is functionally equivalent to a check on the server (and enables a future rollout of encrypted content on the server).
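On that parenthetical: Apple's published design wraps each photo's match result in a "safety voucher" protected by threshold secret sharing, so the server learns nothing until an account crosses a threshold of matches, which is what makes encrypted server-side storage compatible with the check. Here's a minimal sketch of the threshold idea using textbook Shamir sharing over a small prime field; this is purely illustrative, not Apple's construction.

```swift
import Foundation

// Purely illustrative: share a single byte-sized secret so that any
// t shares reconstruct it and fewer reveal essentially nothing.
let p = 251  // small prime field, for illustration only

// Horner evaluation of a polynomial (coefficients low-to-high) mod p.
func evalPoly(_ coeffs: [Int], at x: Int) -> Int {
    var result = 0
    for c in coeffs.reversed() { result = (result * x + c) % p }
    return result
}

// Split `secret` (0..<p) into n shares; any t of them reconstruct it.
func split(secret: Int, n: Int, t: Int) -> [(x: Int, y: Int)] {
    var coeffs = [secret]
    for _ in 1..<t { coeffs.append(Int.random(in: 0..<p)) }
    return (1...n).map { x in (x: x, y: evalPoly(coeffs, at: x)) }
}

// Modular inverse via Fermat's little theorem (p is prime).
func inv(_ a: Int) -> Int {
    var result = 1, base = a % p, e = p - 2
    while e > 0 {
        if e & 1 == 1 { result = result * base % p }
        base = base * base % p
        e >>= 1
    }
    return result
}

// Lagrange interpolation at x = 0 recovers the secret from t shares.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * (p - sj.x) % p                  // (0 - x_j) mod p
            den = den * ((si.x - sj.x + p) % p) % p
        }
        secret = (secret + si.y * num % p * inv(den)) % p
    }
    return secret
}

let shares = split(secret: 42, n: 5, t: 3)
print(reconstruct(Array(shares.prefix(3))))  // 42
print(reconstruct(Array(shares.prefix(2))))  // below threshold: almost certainly not 42
```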
The CSAM issue was mostly misunderstood by HN imo.
As long as we are clear about the circumstances, and we're not misrepresenting the situation to make it appear the system can scan photos that aren't being uploaded to iCloud, sure.
The intention of the implementation is that if a user asks Apple to upload CSAM to iCloud, Apple has the ability to check it and stop that CSAM from landing on their servers. They don't want it there, and feel they have a right to check for it in advance so their servers stay clear of CSAM and it stays on the user's phone. Frankly I think that's a reasonable attitude to take to CSAM.
> and feel they have a right to check for it in advance
OK, and that's the problem. Because the scan is on the user's device. No, I don't think they should have the right to use someone else's device, a device the individual owns, like that.