I don’t like that system, but according to their summary document [1], the CSAM detection system (as described today) only processes images that are uploaded to iCloud.
Please look more carefully at that document. It says, on page 5: "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations."
And look at the diagram. The matching of image hashes with user photos occurs "on device."
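To unpack what that on-device matching amounts to, here's a minimal sketch in Swift. This is not Apple's implementation (the real system uses NeuralHash plus a private set intersection protocol, so the device never learns the match result on its own); every name below is hypothetical.

    import Foundation

    // Stand-in for a perceptual hash: a short fixed-size digest that stays
    // stable under minor edits (resizing, recompression) to the image.
    typealias ImageHash = Data

    // Blocklist of known CSAM hashes provided by NCMEC and other child
    // safety organizations; in the real design it ships in blinded form.
    struct KnownHashDatabase {
        private let hashes: Set<ImageHash>
        init(_ hashes: Set<ImageHash>) { self.hashes = hashes }

        // True if the photo's hash appears in the known-hash set.
        func matches(_ hash: ImageHash) -> Bool {
            return hashes.contains(hash)
        }
    }

    // Usage: hash each photo queued for iCloud upload and test it locally.
    let db = KnownHashDatabase([Data([0x01, 0x02])])  // toy hash values
    let uploadCandidate: ImageHash = Data([0x01, 0x02])
    print(db.matches(uploadCandidate))                // true

The point is simply that the comparison itself runs locally, as an ordinary set-membership test over hashes shipped to the device.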
I just think there are better arguments against it that don't require exaggeration. So I wanted to be specific about which images on your device are scanned, according to that paper.
Obviously I should have phrased it better, like “about to be uploaded to iCloud”…
On device != scanning your device. It scans your iCloud uploads and, if configured, iMessages (which you didn't mean, because that feature doesn't work with hashes).
The general worry is that, because the matching happens on the device, it could conceivably be extended to scan your whole device in the future, even if software checks currently limit it to things being synced to iCloud.
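To make that concrete: the restriction is just an ordinary runtime condition, something like this hypothetical gate (not Apple's code):

    // Hypothetical policy gate. The "iCloud uploads only" scope lives in a
    // plain software check; nothing architectural enforces it.
    struct PhotoAsset {
        let isQueuedForICloudUpload: Bool
    }

    func shouldScan(_ photo: PhotoAsset) -> Bool {
        // Today's stated policy: scan only photos queued for iCloud Photos.
        return photo.isQueuedForICloudUpload
        // A future update could simply return true for every photo.
    }

Changing that one condition would widen the scan to the whole photo library; it's a policy choice, not a hardware constraint.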
I'll need to see their source before I believe that. Apple ("privacy is a human right") lost their presumption of good faith when they announced that they would automatically notify law enforcement if their algorithms and processes suspected you were doing something naughty on your device.
[1] https://www.apple.com/child-safety/pdf/Expanded_Protections_...