[cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone
noloader at gmail.com
Thu Mar 28 21:59:47 EDT 2013
On Thu, Mar 28, 2013 at 7:27 PM, Jon Callas <jon at callas.org> wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
> [Not replied-to cryptopolitics as I'm not on that list -- jdcc]
> On Mar 28, 2013, at 3:23 PM, Jeffrey Goldberg <jeffrey at goldmark.org> wrote:
>>> Do hardware manufacturers and OS vendors have alternate methods? For
>>> example, what if LE wanted/needed iOS 4's hardware key?
>> You seem to be talking about a single iOS 4 hardware key. But each device
>> has its own. We don't know if Apple actually has retained copies of that.
> I've been involved in these sorts of questions in various companies that I've worked.
Somewhat related: are you bound to some sort of non-disclosure with
Apple? Can you discuss all aspects of the security architecture, or is
it [loosely] limited to Apple's public positions?
> If you make a bunch of devices with keys burned in them, if you *wanted* to retain the keys, you'd have to keep them in some database, protect them, create access controls and procedures so that only the good guys (to your definition) got them, and so on. It's expensive.
> You're also setting yourself up for a target of blackmail....
> Eventually, so many people know about the keys that it's not a secret. Your company loses its reputation.
> On the other hand, if you don't retain the keys it doesn't cost you any money and you get to brag about how secure your device is, selling it to customers in and out of governments the world over.
I regard these as the positive talking points. There's no sleight of
hand in your arguments, and I believe they are truthful. I expect them
to appear in the marketing literature.
>>> I suspect Apple has the methods/processes to provide it.
>> I have no more evidence than you do, but my guess is that they don't, for
>> the simple reason that if they did that fact would leak out. ...
> And that's just what I described above. I just wanted to put a sharper point on it.
> I don't worry about it because truth will out. ...
A corporate mantra appears to be 'catch me if you can', 'deny, deny,
deny', and then 'turn it over to marketing for a spin'.
We've seen it in the past with, for example, Apple and location data,
carriers and location data, and Google and Wi-Fi spying. No one was
doing it until they got caught.
Please forgive my naivete or my ignorance if I'm seeing things in a
different light (or shadow).
>>> I think there's much more to it than a simple brute force.
>> We know that those brute force techniques exist (there are several vendors
>> of "forensic" recovery tools), ....
> The unlocking feature on iOS uses the hardware to spin crypto operations on your passcode...
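The iterated crypto Jon describes is what makes each passcode guess expensive. A minimal sketch of the idea, assuming an illustrative per-device secret, iteration count, and per-guess timing (none of these are Apple's actual parameters; PBKDF2 here stands in for the hardware AES loop):

```python
# Illustrative only: the passcode is entangled with a per-device secret
# and iterated, so every guess costs a fixed amount of on-device work.
import hashlib

DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")  # hypothetical per-device key
ITERATIONS = 50_000  # tuned so one derivation takes tens of milliseconds

def derive_key(passcode: str) -> bytes:
    # PBKDF2-HMAC-SHA256 standing in for the hardware-bound derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, ITERATIONS)

# Brute-force cost for a 4-digit passcode at an assumed ~80 ms per guess:
guesses = 10_000
seconds = guesses * 0.080
print(f"~{seconds / 60:.0f} minutes to exhaust 4 digits")  # ~13 minutes
```

The point of the construction is that the work factor, not secrecy of the algorithm, bounds the attacker: exhausting a short numeric passcode is minutes, but each guess must happen with the device's key in the loop.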
Apple designed the hardware and holds the platform keys. So that I'm
clear, and not letting my imagination run too far ahead:
Apple does not have or use, for example, custom boot loaders signed
with the platform keys for diagnostics, data extraction, etc.
There are no means to recover a secret from the hardware, such as a
JTAG interface or a datapath tap. Just because I can't do it does not
mean Apple, a university with an EE program, Harris Corporation,
Cryptography Research, the NSA, GCHQ, et al. cannot do it.
A naturally random event is used to select the hardware keys, not a
deterministic event such as hashing a serial number and date of
manufacture.
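The distinction matters because a deterministic scheme lets anyone who knows the inputs regenerate the key, while a naturally random one leaves nothing to recompute. A sketch, with illustrative names and values (not Apple's):

```python
# Contrast between deterministic and random key selection.
import hashlib
import os

def deterministic_key(serial: str, mfg_date: str) -> bytes:
    # Anyone who knows serial + date can re-derive this -- the weak approach.
    return hashlib.sha256((serial + mfg_date).encode()).digest()

def random_key() -> bytes:
    # Drawn from the OS entropy pool; cannot be re-derived later.
    return os.urandom(32)

# The deterministic key is reproducible from public-ish inputs:
assert deterministic_key("F17K3ABC", "2013-03-28") == deterministic_key("F17K3ABC", "2013-03-28")
# Two independent random draws will (with overwhelming probability) differ:
assert random_key() != random_key()
```

A manufacturer that wanted a covert escrow capability could simply use the deterministic flavor and keep the recipe secret; that is exactly the kind of goodie I am asking about.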
These are some of the goodies I would expect a manufacturer to provide
to select customers, such as LE and GOV. I would expect that the
information would be held close to the corporate chest, so folks could
not discuss it even if they wanted to.