[cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

ianG iang at iang.org
Fri Mar 29 05:16:38 EDT 2013


On 29/03/13 08:27 AM, Jeffrey Goldberg wrote:
...
> The scare story spread quickly, with the more hyperbolic accounts getting the most attention. The corrective analysis probably didn't penetrate as widely.


The issue I see is that because Apple runs a secret shop, they are 
more vulnerable to PR-style disasters.

I actually agree with Apple doing it.  I think they win more by closing 
off the information.  But doing this exposes them to the risk of the 
media pushing bad conspiracy stories and having them go unchallenged.

Also, secrecy means that one's own people do not know what is going on. 
So, stupidity is easier because there are fewer eyes on the topic, and 
when it does happen, fewer eyes are there to warn about it as the event 
rolls on towards its own disaster.

How to deal with this?  Google has (or had, by repute) a policy of 
secrecy to the outside but openness inside.  Google people can talk 
about anything with other insiders.  This sounds like a good solution to 
me, or at least one way of doing it, conceptually.

(Perhaps Google people can confirm/deny more about this meta-secret if 
they so desire.  Actually, forget the confirm/deny; just tell us whether 
it works and/or what is better :)

Also, I think we have to be a little bit humble and realise that all 
security is risk-based.  S**t happens.  Sometimes the statistics roll 
around, it's our turn, and mistakes become disasters.

Take it on the chin, and get back to work.  Don't overreact by promising 
ludicrous things like "it'll never happen again" or lopping off heads 
just to satiate the media.

...
> The trick is how to communicate this to people, most of whom do not wish to be overwhelmed with information.


Stick to your business.  People's loyalty to the brand will overcome 
these sorts of disasters, as long as you stick to your own values and 
your values are aligned with those of your customers.

It's when you don't stick to your own values, or when your values are 
confused and contradictory, that the real failure happens.  Arguably, 
this happened to Microsoft in the 2000s -- Bill Gates' famous memo -- 
when they couldn't articulate their values on security *and* had a major 
security PR issue to contend with.  Hence their efforts to improve 
their security, in a world no longer as benign as the 1990s, came to 
little effect, even though the efforts themselves were strong.

Apple are very good at this.  Microsoft are lousy.  Google sort of 
muddle along in between.

...
> What's the line? Never attribute to malice what can be explained by incompetence.


Yeah.  But in a lot of cases, what looks like incompetence after the 
fact was just innocence of the future, before the fact, and a simple 
lack of understanding of the wider world.  As Jon described, if you 
think about the location & wifi issues, how likely is it that you 
yourself couldn't be caught up in something like that?

There but for the grace of our own personal deity, we go.

Personally, when I think of what Google were doing with their street 
vans, listening in on the wifi and all, to have properly scrubbed that 
data would be to ascribe deity status to the geeks involved.  They are 
employed and paid to do cool things with tech, not to understand PR 
disasters, trawl through the arcana & complexity of privacy 
regulation, or read pop management mags like HBR.

Even the fact that heads rolled over that disturbs me.  I hope that was 
not done without a lot of thought.

...
> At the same time we are in the business of designing systems that will protect people and their data under the assumption that the world is full of hostile agents. As I like to put it, I lock my car not because I think everyone is a crook, but because I know that car thieves do exist.


:)  Security is risk-based.  Locks raise the bar a little, which is 
generally enough to move the dishonest thief to an easier target, and to 
make the honest thief think twice.  We should think more like that in 
our field; it would help us a lot.  Conceptually, this is to say we need 
algorithms that fail 0.1% of the time, not 0.000001%.
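
To put a number on that trade-off, here's a minimal back-of-the-envelope 
sketch in Python.  Every figure in it (attempts per year, loss per 
failure, the cost of the stronger control) is invented purely for 
illustration; the point is the shape of the comparison, not the numbers.

# Hypothetical expected-loss comparison: is it worth paying to push
# the failure rate from 0.1% down to 0.000001%?  All numbers made up.

attempts_per_year = 10_000   # assumed attacks faced per year
loss_per_failure = 50_000    # assumed cost (dollars) of one failure

# assumed yearly cost of buying the 0.000001% option
extra_cost_of_stronger_control = 2_000_000

def expected_annual_loss(failure_rate):
    """Expected yearly loss if each attempt succeeds with
    probability failure_rate."""
    return attempts_per_year * failure_rate * loss_per_failure

good_enough = expected_annual_loss(1e-3)  # the 0.1% "car lock" control
gold_plated = expected_annual_loss(1e-8)  # the 0.000001% control

print(f"expected loss at 0.1% failure:        ${good_enough:,.0f}")
print(f"expected loss at 0.000001% failure:   ${gold_plated:,.2f}")
print(f"loss avoided by the stronger control: "
      f"${good_enough - gold_plated:,.0f}")
print(f"extra cost of the stronger control:   "
      f"${extra_cost_of_stronger_control:,.0f}")

# If the loss avoided is smaller than the extra cost, the cheaper,
# riskier control is the rational choice.

With those made-up numbers the stronger control avoids about $500,000 of 
expected loss but costs $2,000,000 a year -- that is the sense in which 
driving failures to zero can mean spending too much, per the quote below.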

However, a consequence of risk is that some people get hit.  We need to 
learn to love our occasional failures.  As Dan Geer put it, "If the 
number of failures is zero, there is no way to disambiguate good luck 
from spending too much."

http://financialcryptography.com/mt/archives/001255.html

That's two good reasons to defend, not attack, the locationgates.  We 
are all vulnerable in a risk-oriented world, and each failure is a 
learning opportunity.



iang

