[cryptography] Let's go back to the beginning on this

Kevin W. Wall kevin.w.wall at gmail.com
Thu Sep 15 01:40:08 EDT 2011


[Note to moderator: May be slightly OT. Unfortunately, the Gmail web interface
won't allow me to alter the Subject: to mention it there.]

On Wed, Sep 14, 2011 at 5:52 PM, Seth David Schoen <schoen at eff.org> wrote:

> More fundamentally, as Peter Biddle points out, trust isn't
> transitive.  Suppose we think that a particular CA is super-awesome
> at verifying that someone owns a domain and issuing hard-to-forge
> certificates attesting to this fact, while resisting compromises
> and coercion.  That doesn't necessarily mean that it's also a good
> judge of whether another organization is also a good CA.

Not so fast. I agree with your conclusion, but not with your premise.

Others and I really think that Peter Biddle is wrong about trust
not being transitive. If you read carefully through Peter Biddle's blog
(http://peternbiddle.wordpress.com/2008/03/26/trust-isnt-transitive-or-someone-fired-a-gun-in-an-airplane-cockpit-and-it-was-probably-the-pilot/)
on this topic, you will see (as Keith Irwin pointed out in a reply) that
Peter is mixing contexts here. In a nutshell, in his blog he is making
the argument that trust in two completely different contexts equates
to trust in general (i.e., any context), and that therefore trust is not
transitive.

However, trust clearly is context-dependent, and when considering
whether or not trust is transitive, we need to hold the context fixed.

Specifically, if C1 and C2 are two different contexts, it does NOT
logically follow that:

    There exists a context C1 such that 'Alice trusts<C1> Bob'
    There exists a context C2, where C1 != C2, such that 'Bob trusts<C2> Carol'
    Therefore,
             Alice trusts<C> Carol for all contexts, C.

    where trusts<C> means "trusts in context C".

That seems to be the way that Peter Biddle is arguing that trust is
not transitive. Well, if that's the way he's defining it, then of course
it's not transitive.

If it is just that... well, that's the WRONG way to reason about transitivity
in general, and about trust being transitive in particular.

Transitivity is a mathematical property of a relationship R. If x, y,
and z are members belonging to some well-defined set, then we call the
relationship R 'transitive' if:

     ( x R y )  AND  ( y R z ) IMPLIES ( x R z )

(See http://en.wikipedia.org/wiki/Transitive_relation for a more thorough,
but very comprehensible, treatment of this.)
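
To make that definition concrete, here's a tiny sketch (Python; the helper
name and the little example relation are just mine for illustration) that
checks whether a relation, represented as a set of ordered pairs, is
transitive:

    # A relation R over a set is just a set of ordered pairs (x, y).
    # R is transitive iff (x R y) AND (y R z) implies (x R z).
    def is_transitive(relation):
        return all((x, z) in relation
                   for (x, y1) in relation
                   for (y2, z) in relation
                   if y1 == y2)

    # 'Divides evenly into' is transitive...
    print(is_transitive({(2, 4), (4, 8), (2, 8)}))   # True
    # ...but drop the (2, 8) pair and the property fails.
    print(is_transitive({(2, 4), (4, 8)}))           # False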

However, in Biddle's blog where he gives his examples, in every example
he mentions he is talking about 2 different contexts (e.g., flying planes
and handling firearms, or working on cars and taking care of kids).

That is, Biddle is really discussing 2 *different* relationships:

     trust<flying planes>
and
     trust<handling firearms>

and what he is then trying to conclude is that

    ( x trust<flying planes> y ) AND  ( y trust<handling firearms> z )
            IMPLIES ( x trust<C> z )

for any context C.  Well, duh! If you make a fallacious straw man argument
about trust being transitive in this manner, of course your conclusion is
going to be that "trust is NOT transitive". But you would also, IMHO, be
wrong. If we stick to a specific context / attribute, however, then I think
you will find the logic concludes that trust is transitive. (But, as I'll
show later, it's not really quite that simple.)
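
Just to make that same-context vs. cross-context point concrete in code
(Python again; Alice/Bob/Carol and the helper names are just mine for
illustration): if we keep one trust relation *per* context, a chain whose
two links live in different contexts never completes, while a chain that
stays inside a single context does:

    # One set of (truster, trustee) pairs per context -- i.e., a separate
    # trust relation for each context.
    trust = {
        "flying planes":     {("Alice", "Bob")},
        "handling firearms": {("Bob", "Carol")},
    }

    def chain(context, a, c):
        """Is there a two-step trust chain from a to c within ONE context?"""
        edges = trust.get(context, set())
        people = {p for pair in edges for p in pair}
        return any((a, b) in edges and (b, c) in edges for b in people)

    # The two cross-context links above never combine, in either context:
    print(chain("flying planes", "Alice", "Carol"))       # False
    print(chain("handling firearms", "Alice", "Carol"))   # False

    # But put both links in the SAME context and the chain goes through:
    trust["flying planes"].add(("Bob", "Carol"))
    print(chain("flying planes", "Alice", "Carol"))       # True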

Here's a really nutty case restricted to a specific context. Let's suppose that:

    Passengers trust<flying planes> Pilots
and
    Pilots trust<flying planes> Chimpanzees

are both true.  So, a pilot brings his trusted chimpanzee into the
cockpit and shortly after takeoff, he decides to take a little nap
so he hands the controls over to his chimp pal. And all this occurs
unbeknownst to the passengers. So what do we conclude? Well,
logic dictates that based on the premises, we may conclude:

    Passengers trust<flying planes> Chimpanzees

But wait! That's absurd, you say. Well, perhaps. But then
again, whether the passengers know it or not, the Chimp
who is supposedly flying the plane is pretty much holding
the lives of the passengers in his hands (or is that paws?).

On one hand, these passengers are literally (unbeknownst to them)
trusting that chimp to safely fly that plane. (Of course, on the
other hand, if there were a dozen parachutes on the plane, there
would be a bloodbath to see who would get them. ;-)

Now let's make a little change to the premise. Let's
substitute 'Auto Pilot System' for 'Chimpanzees'. The
conclusion is now:

    Passengers trust<flying planes> Auto Pilot System

All I've done is exchange one symbol (Chimpanzee) for
another (Auto Pilot System), but all of a sudden most of
us feel a whole lot better.

So what does that tell us about 'trust'? Well, for one,
the *human* concept of trust is much more complex than
the simplistic, quantifiable mathematical property we have
been trying to model it as thus far.  And herein lies a big problem
in security. Why? Because the software systems that
we construct can in no way approach the complexity of
all these nuances. (Not that it matters a whole lot.
History has shown that we can't even get the simpler
model correct, but I digress.)

> Even giving the PKIX status quo the benefit of the doubt, the root
> CA decisions are supposed to be made by neutral parties following a
> careful process that includes input from professional auditors.  When
> CAs get in the habit of delegating their power, that process is at
> risk of being bypassed and in any case starts to happen much less
> transparently.  There are plenty of cases in the real world where
> someone is trusted with the power to take an action, but not
> automatically trusted with the power to delegate that power to others
> without external oversight.  And that makes sense, because trust isn't
> transitive.

It makes sense, but NOT because 'trust isn't transitive'. It makes
sense because of another aspect of trust that I have not yet discussed.
Specifically,

  Trust is not binary.

Trust is not black or white; it is shades of gray. As humans, for
a given context, we "assign" more trust to some and less to others.
This "level of trust" is largely based on our _perception_ of
experience and reputation, the latter which we sometimes try
to model in reputation-based systems.

An example...unfortunately, you need brain surgery. You have two
surgeons to choose from:

    Surgeon 1: 10 years of experience and over 300 operations.
    Surgeon 2: 1 year of experience and 6 operations.

All other things being equal, who you gonna choose? Surgeon 1, right?
(Well, unless in those 300 operations, s/he has had 250 malpractice
results. ;-) And at least by comparison, you probably do NOT trust
Surgeon 2.

So, let's get back to the transitivity part:

    You trust<brain surgery> Surgeon 1
    Surgeon 1 trust<brain surgery> Surgeon 2

so, obviously,

    You trust<brain surgery> Surgeon 2.

Whoa! Wait a minute. Didn't we just say that we did NOT trust
Surgeon 2? Yep!

So what went wrong here? Well, what went wrong is that we are
assuming that trust behaves as a binary relationship... I either
have complete trust or zero trust. But trust is not binary.
It is shades of gray.  That means that to more accurately
model trust in the real world, we need some property for that
relationship that indicates a _level_ of trust, rather than
trust just being T/F. So we need that IN ADDITION TO a context.

So now we see we need (at least) something like:

          trust<level, context>

to model trust. Whereas before we were (implicitly) just
using something like

          trust<{T,F}, context>

(which allowed us to model only complete trust or no trust), we
find we now need something more like:

          trust<[0,1], context>

That is, we model level as a real number in the range 0 to 1,
inclusive.
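
Here's a rough sketch of what that might look like (Python again; the
specific numbers, and the rule that trust attenuates multiplicatively along
a chain, are just my illustrative assumptions, not anything Peter or I have
formally argued):

    # trust<level, context>: level is a real number in [0, 1].
    trust = {
        ("You", "Surgeon 1", "brain surgery"): 0.9,        # lots of experience
        ("Surgeon 1", "Surgeon 2", "brain surgery"): 0.5,  # newbie colleague
    }

    def derived_trust(a, b, c, context):
        """Trust a places in c *via* b, in one context.  The multiplication
        rule is an assumption: derived trust never exceeds either link."""
        return (trust.get((a, b, context), 0.0) *
                trust.get((b, c, context), 0.0))

    level = derived_trust("You", "Surgeon 1", "Surgeon 2", "brain surgery")
    print(level)          # 0.45 -- well below the 0.9 you give Surgeon 1
    print(level >= 0.8)   # False: below a (hypothetical) comfort threshold

The specific rule doesn't matter much; the point is that once trust carries
a level, whatever "propagates" along a chain ends up weaker than either
link, which matches the intuition that you don't automatically trust
Surgeon 2 just because Surgeon 1 does.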

So... we're done now, right? Well, not so fast, Sparky. We still
haven't taken into account this property of trust:

    Trust is not constant over time

Question: Do any of you trust the Comodo or DigiNotar CAs less
now than you did a year ago? Thought so.

So, we need to account for a time factor in trust...maybe
model it using

    trust<time, level, context>
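
Here's one totally made-up way to sketch that time factor: record the level
when it was last assessed, let it decay with age, and slash it after an
incident like a CA compromise. The one-year half-life and the 10x
post-incident penalty are arbitrary numbers I picked purely for
illustration:

    def current_trust(level_when_assessed, age_in_days,
                      half_life_days=365.0, incident=False):
        """trust<time, level, context>: the level is not constant over time.
        Exponential decay plus an incident penalty -- both assumed rules."""
        decayed = level_when_assessed * 0.5 ** (age_in_days / half_life_days)
        return decayed * 0.1 if incident else decayed

    print(current_trust(0.8, 0))                    # 0.8  -- freshly assessed
    print(current_trust(0.8, 365))                  # 0.4  -- a year later
    print(current_trust(0.8, 365, incident=True))   # ~0.04 -- after a breach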

Of course, this is getting way too complicated and I'm starting to
ramble. Maybe not as much as Peter Biddle was rambling on about gun
control, which, IMO, is what the whole point of his blog
was really about, but nevertheless, still rambling. (Note
that I'm all in favor of gun control in that I want people
who have guns to know how to control them. I state that not only
do I believe in the right to bear arms, but also the right
to bare arms, and even the right to arm bears. ;-)

Anyway, enough rambling on my part. If you want to read more,
I've blogged about this 'trust' topic in the past. If you're
really bored and are suffering from severe insomnia, you can read
about it here:
<http://off-the-wall-security.blogspot.com/2011/07/understanding-trust.html>

Thanks for listening,
-kevin
P.S.- You have no idea how many times I almost accidentally made a typo
and spelled 'trust' as 'tryst'. If I did, my wife might never trust me
again. :)
--
Blog: http://off-the-wall-security.blogspot.com/
"The most likely way for the world to be destroyed, most experts agree,
is by accident. That's where we come in; we're computer professionals.
We *cause* accidents."        -- Nathaniel Borenstein


