Hacker News

But that’s not the proposed design. The browser cache isn’t being scanned; it’s photos uploaded to iCloud.

I’m not asking whether people do bad things. I’m saying that this keeps getting coverage on HN, and people keep pointing to this hypothetical issue, yet the same hypothetical has been possible for years on many more devices.



As far as I have read, it’s scanning any images and messages on the device, as well as text entered into Siri.

So accidentally stick a w in your Siri teen porn search and you'll be seeing https://www.apple.com/v/child-safety/a/images/guidance-img__...

If the photos were taken on the device, there would be no existing hash for them to match against.


https://www.apple.com/child-safety/pdf/Expanded_Protections_...

"Does this mean Apple is going to scan all the photos stored on my iPhone?

No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."
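The mechanism that FAQ describes (and the point upthread about newly taken photos having no existing hash to match) boils down to a lookup against a fixed set of known hashes. A toy Python sketch of that idea, with the caveat that Apple's actual system uses a perceptual NeuralHash plus a private set intersection protocol rather than a plain cryptographic hash, and every name below is illustrative:

```python
import hashlib

# Stand-in for the database of known-CSAM hashes. In Apple's design
# this set is blinded and the comparison happens via private set
# intersection; here a plain SHA-256 digest illustrates matching only.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True only if the image's hash appears in the known set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# A freshly taken photo has no corresponding database entry, so it
# cannot match, regardless of its content:
print(matches_known_database(b"freshly-taken-photo"))  # False
print(matches_known_database(b"known-image-bytes"))    # True
```

This is why the system, as described, can only flag copies of images already in the database, not novel photos.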


Totally tangential, but I didn't realize there were well-advertised "anonymous helplines for at-risk thoughts". I'm kind of curious what "help" looks like for that pathology, but I'm uneasy about even getting that in my search history.



