Hacker News

That is to say, face scanning is just as insidious as the new feature?


The technical risk to user privacy - if your threat model is a coerced Apple building surveillance features for nation-state actors - is exactly the same between CSAM detection and Photos intelligence, which syncs results through iCloud. In fact, the latter is more generalizable and has no threshold protections, so it is likely worse.
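The "threshold protections" mentioned above refer to the fact that, in Apple's published design, the server learns nothing about any individual match until a threshold number of matches accumulate, via a form of threshold secret sharing. As a rough illustration of that cryptographic primitive only - not Apple's actual protocol - here is a minimal Shamir secret-sharing sketch in Python, where a secret becomes recoverable only once `threshold` shares are available:

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; the field for our toy shares

def make_shares(secret, threshold, n):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```

With a threshold of 3, any two shares reveal nothing (every value of the secret is equally consistent with them), while any three recover it exactly - which is the property that keeps sub-threshold matches opaque to the server.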


It's the legal risk that is the biggest problem here. Now that every politician out there knows that this can be done for child porn, there'll be plenty demanding the same for other stuff. And this puts Apple in a rather difficult position, since, with every such demand, they have to either accede, or explain why it's not "important enough" - which then is easily weaponized to bash them.

And not just Apple. Once technical feasibility is proven, I can easily see governments mandating this scheme for all devices sold. At that point, it can get even more ugly, since e.g. custom ROMs and such could be seen as a loophole, and cracked down upon.


This hypothetical lacks an explanation for why every politician has not already demanded this scope creep from Apple (or, say, Google) for photos stored in the cloud, where the technical feasibility and legal precedent have already been established by existing CSAM-scanning solutions deployed at scale.


I have to note that one of those solutions deployed at scale is Google's. But the big difference is that when those were originally rolled out, they didn't make quite that big of a splash, especially outside of tech circles.

I will also note that, while it may be a hypothetical in this particular instance as yet, the EU already went from passing a law that allows companies to do something similar voluntarily (previously, they'd be running afoul of privacy regulations) to a proposed bill making it mandatory - in less than a year. I don't see why the US would be any different in that regard.


Ok, but now you've said that the precedent established by Google and others has already moved legislation requiring terrible invasions of privacy far along. You started by saying Apple's technology (and, in particular, its framing of the technology) has brought new legal risk. What I'm hearing instead is that the risk would be present in a counterfactual world where nothing was announced last week.

At this point in the discussion, people usually pivot to scope creep: the on-device scanning could scan all your device data, instead of just the data you put in the cloud. This claim assumes that legislators are too dumb to realize that if their phone can search for dogs with "on-device processing," it could also search for contraband. I doubt it. And even if they are, the national security apparatus will surely discover this argument for them, aided by the Andurils and NSOs of the world.

As I have repeatedly said: the reaction to this announcement sounds more like a collective reckoning with where we are as humans than any particular new risk introduced by Apple. In the Apple vs. FBI letter, Tim urged us to have a discussion about encryption - when we want it, why we want it, and to what extent we should protect it. Instead, we elected Trump.


The precedent established by Google et al is that it's okay to scan things that are physically in their data centers. It's far from ideal, but at least it's somewhat common sense in that if you give your data to strangers, they can do unsavory things with it.

The precedent now established by Apple is that it's okay to scan things that are physically in the user's possession. Furthermore, they claim that they can do it without actually violating privacy (which is false, given that there's a manual verification step).


The precedent established by Apple, narrowly read, is that it's ok to scan data that the user is choosing to store in your data center. As you pointed out, this is at least partly a legal matter, and I'm sure their lawyers - the same ones who wrote their response in Apple vs. FBI, I'd imagine - enumerated the scope or lack thereof.

Apple’s claim, further, is that this approach is more privacy-preserving than one which requires your cloud provider to run undisclosed algorithms on your plaintext photo library. They don’t say this is not “violating privacy,” nor would that be a well-defined claim without a lot of additional nuance.


Nonsense. Building an entire system, as opposed to adding a single image to a database, is a substantially different level of effort. In the US, at least, this was used successfully as a defense: the government cannot coerce companies to build new things on its behalf, because that would effectively create "forced speech," which is forbidden by the US Constitution. However, companies can be coerced when the effort is minimal, like adding a single hash to a database.
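The asymmetry described here can be made concrete: once the matching machinery exists, expanding what it targets is a one-line change to the database. A toy sketch in Python (an ordinary cryptographic hash stands in for a perceptual hash like NeuralHash, and the image bytes and database contents are hypothetical):

```python
import hashlib

def image_hash(image_bytes: bytes) -> bytes:
    # Stand-in for a perceptual hash; a real system matches
    # near-duplicate images, not just byte-identical ones.
    return hashlib.sha256(image_bytes).digest()

# Hypothetical database of known-bad hashes shipped to the client.
hash_db = {image_hash(b"known-image-a"), image_hash(b"known-image-b")}

def is_flagged(image_bytes: bytes) -> bool:
    # The fixed machinery: a simple membership test.
    return image_hash(image_bytes) in hash_db

# Expanding the system's scope, by contrast, is a single line:
hash_db.add(image_hash(b"newly-targeted-image"))
```

The matching code never changes; only the set of targets does - which is exactly why the legal distinction between "build a new system" and "add a hash" matters.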


Photos intelligence already exists, and if people are really going to cite the legal arguments in Apple vs. FBI, then it’s important to remember the “forced speech” Apple argued it could not be compelled to make was changing a rate limit constant on passcode retries.


Exactly this. The whole thing is a red herring. If Apple wanted to go evil, they could easily do so, and this very complex CSAM mechanism is the last thing that would help them.


I’ve read your comments, and they are a glass of cold water in the hell of this discourse. This announcement should force people to think about how they are governed - to the extent they can influence it - and double down on Free Software alternatives to the vendor locked reality we live in.

Instead, a forum of presumably technically savvy people is reduced to hysterics over implausible futures and a letter asking Apple to roll back a change that is barely different from, and arguably better than, the status quo.


Thanks. I couldn’t agree more.

We need both: to develop free software alternatives (which means to stop pretending the alternatives are good enough), and to get real about supporting legal and governance principles that would protect against abuses.

If people want to do something about this, these are the only protections.



