Solving all of the corner cases is a difficult technical problem for Apple. The article shows a screenshot of what appear to be 100 unique FindMy devices around this guy's personal residence... there may be some characterization work that could help with that, so an iPhone user would get the alert message. But Apple will continue to promote it and dismiss or downplay these security concerns.
> The problem is easy to solve, just store copies of all public keys of each air tag you send out.
that kills the privacy aspect of it, because it also means apple knows about the exact whereabouts of each tag. airtags are specifically designed/marketed so apple can't do that.
Software (and I guess hardware too) is about tradeoffs. The tradeoff here is that, in not being able to validate whether a device is a genuine AirTag, Apple has created a massive, completely uncontrollable surveillance network. The fact that anyone can interrogate the network to track devices that aren't even guaranteed to be running the official firmware or have the official hardware is insane. Apple not being able to know the location of the tags is pretty much irrelevant in the face of this downside.
It basically operates like dead drops. AirTags broadcast their location using a public key that constantly rotates. Apple maintains a mapping of public key to location. Anyone can look up the location of a public key, but the search space is so big that it's not worth bruteforcing. Even if you did, all you'd end up with is a heatmap[1] of AirTags, which isn't very helpful. However, if you know the corresponding secret, you can predict what the public key will be and learn the exact whereabouts of a particular device.
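To make the dead-drop idea concrete, here's a heavily simplified Python sketch. A hash chain stands in for the real EC key rotation, nothing is encrypted, and a plain dict stands in for Apple's server; all names are made up:

```python
import hashlib

def rotated_id(seed: bytes, epoch: int) -> bytes:
    # Stand-in for the real EC key-rotation scheme: each epoch the tag
    # broadcasts a fresh identifier derivable only from the owner's seed.
    return hashlib.sha256(seed + epoch.to_bytes(8, "big")).digest()

# "Apple's" server: just a dumb key/value store of (broadcast id -> reports).
# It never sees the seed, so it can't link two epochs to the same tag.
server: dict = {}

seed = b"owner-secret-seed"  # hypothetical; stays on the owner's device
# A passing iPhone hears the broadcast id and uploads a location report
# (encrypted in the real system) under that id.
for epoch, loc in [(1, "51.5,-0.1"), (2, "51.6,-0.2")]:
    server.setdefault(rotated_id(seed, epoch), []).append(loc)

# Only the owner, knowing the seed, can re-derive the ids and look them up.
found = [server[rotated_id(seed, e)] for e in (1, 2)]
```

The point of the sketch: each epoch's identifier looks random to the server, but is trivially predictable to anyone holding the seed.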
A core selling point of Airtags is that other people's iPhones help you find your AirTag. That's also what makes them effective trackers. It's a bit of an unsolvable problem.
Yes, and Apple 100% has the ability to filter out "fake" AirTags on their back end. All they need to do is set up a manufacturing process that captures the public keys.
So the phones will still relay the beacons to Apple, who can then simply reject messages from these fake tags.
(I worked for a medical device company that set all of this up within our supply chain.)
If they haven't been doing this so far, it seems like it will be a tough job to record the keys after the fact. Perhaps they could interrogate each device and require it to be re-adopted, then record the data at that point, but it seems like an arms race they won't win.
Yes, you could do attestation schemes for hardware, such as a single manufacturing-time private key shared by large batches (say 1M+ AirTags), or something like Direct Anonymous Attestation.
Apple would likely go toward batch keys: in addition to being simpler crypto, it doesn't give them a mechanism they could use to correlate location reports.
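A toy sketch of the batch-key idea, with HMAC standing in for what would really be an asymmetric or group-signature scheme, and all names made up. The server learns only "this ping came from a genuine tag in batch 0042", never which tag:

```python
import hmac, hashlib

# Hypothetical: one secret per manufacturing batch of ~1M tags.
BATCH_KEYS = {"batch-0042": b"factory-secret-0042"}

def attest(batch_id: str, payload: bytes) -> bytes:
    # Run on the tag: authenticate the broadcast with the shared batch secret.
    return hmac.new(BATCH_KEYS[batch_id], payload, hashlib.sha256).digest()

def server_accepts(batch_id: str, payload: bytes, tag: bytes) -> bool:
    # Run on the back end: check batch membership, nothing more.
    key = BATCH_KEYS.get(batch_id)
    return key is not None and hmac.compare_digest(
        hmac.new(key, payload, hashlib.sha256).digest(), tag)

ping = b"rotating-public-key-bytes"
ok = server_accepts("batch-0042", ping, attest("batch-0042", ping))  # genuine
spoofed = server_accepts("batch-0042", ping, b"\x00" * 32)           # imposter
```

Note the obvious weakness this inherits from any shared secret: extract the batch key from one tag and you can forge pings for the whole batch.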
That said, AirTags work solely within BLE advertisements, which are limited to a 31-byte payload. Apple is currently using 30 of those bytes.
Since the AirTag emits the message, that message would have to contain either:
- a static signature, which could then be copied and mimicked by imposters (a replay attack), or
- a signature over a continuously changing nonce, such as the current datetime, produced by a private key stored on the AirTag. But that means the private key could be extracted from one device and used to sign messages on an imposter device, so unless every device had a separate private key, this method would be compromised almost immediately as well.
So why doesn't Apple have a unique private key for each device? It appears it actually does, and the keys constantly rotate. But there appear to be purposely implemented anonymity features designed to prevent Apple's servers from ever decoding the contents of a ping, and thus from associating your account/device with the emitted location.
If you built validation into the write path of the ping database ("here's a ping with signature ABC; look up whether it's valid; it is, so it must be from the AirTag Bob bought last month with private key XYZ; declare the ping valid"), then Apple would be only a log-file write away from being able to perfectly stalk everyone who has purchased a device. So instead, the tradeoff they made is not to keep track of the keys of devices in circulation, purposely blinding themselves so they can truthfully say they can't track you. That leaves open the ability of imposter devices to transmit information through the network by creating their own keys, whose pings look indistinguishable from authentic device pings.
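To spell out why validation and privacy pull against each other, a toy sketch of the hypothetical validating write path (the registry and names are invented):

```python
# Hypothetical registry Apple would need in order to validate pings.
REGISTERED = {b"pk-of-bobs-tag": "bob@example.com"}

log = []  # the dangerous part: one log write away from a stalking database

def ingest(ping_key: bytes, location: str) -> bool:
    owner = REGISTERED.get(ping_key)
    if owner is None:
        return False  # imposter keys get rejected...
    # ...but the validated path necessarily knows whose tag this is,
    # and pairing that with the location is trivial:
    log.append((owner, location))
    return True

accepted = ingest(b"pk-of-bobs-tag", "51.5,-0.1")     # genuine, but now logged
rejected = ingest(b"attacker-made-up-key", "52.0,0.0")  # imposter filtered out
```

The imposter filtering and the owner-to-location linkage come from the same lookup, which is exactly the tradeoff described above.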