[cryptography] PKI "fixes" that don't fix PKI (part II)

Lucky Green shamrock at cypherpunks.to
Wed Sep 7 18:52:46 EDT 2011

[Adding a cc: to observatory. I am not a big fan of cross posting, but
there are two virtually identical discussions taking place on the
Cryptography and SSL Observatory mailing lists].

After writing my "Diginotar Lessons Learned (long)" post yesterday to
the Cryptography mailing list, I browsed through the archives of various
forums where Diginotar and potential PKI component fixes intended to
prevent a recurrence were discussed.

It rapidly became evident to me that many participants, including
participants that implement certificate-consuming applications, are not
fully aware of the requirements that PKI was intended to address or are
insufficiently familiar with existing PKI aspects and protocols to
adequately analyze the suitability-to-task of suggested improvements.

Moreover, I noticed that some posts list one or more desirable
properties and requirements together with a proposed solution. What is
often absent is an analysis of whether the proposed solution meets the
stated or otherwise known requirements. Not surprisingly, the proposed
solutions therefore often fail to meet that burden.

To help educate newer participants in the PKI debate while raising the
level of discussion, I decided to turn my "Diginotar Lessons Learned"
post into a series of posts, of which yesterday's post should be
considered the first installment. If you have not read it, you may wish
to do so now.


In this series of posts I will select a few proposed designs and analyze
whether and how they meet the known requirements or stated design goals.
Often, I will pick a proposed fix based on someone's post mentioning
that fix. Please note that my quoting a poster mentioning a proposed fix
should not be taken as an indication that I necessarily believe that
poster is an advocate for that fix.

In case you wonder, I am not bringing a comprehensive solution to all
the ills of PKI to the table. I am fine with that. Nor is it necessary
to know the perfect solution to analyze whether a particular proposed
solution meets its design goals.

With that, let's jump into the analysis.

o Changes to OCSP
Peter Gutmann proposes changes to the OCSP protocol as detailed in

When OCSP was first implemented in a CA software product, long before it
was called OCSP or submitted to the IETF standards process, OCSP behaved
roughly the way Peter suggests it should. The CA would maintain a
database of all issued certificates together with metadata indicating
the status of the cert (not revoked, revoked, suspended, or yikes, we
don't know about it and somebody must have gained access to our signing
key).
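A minimal sketch of such a whitelist-based responder, assuming a CA that
keeps a complete database of every certificate it has ever issued. The
serial numbers, statuses, and function names below are illustrative
assumptions, not taken from any real OCSP implementation:

```python
# Hypothetical whitelist-based status responder: the CA records every
# certificate at issuance time, so an unknown serial is itself an alarm.
ISSUED_CERTS = {
    # serial number -> current status, maintained by the CA
    0x1001: "good",
    0x1002: "revoked",
    0x1003: "suspended",
}

def cert_status(serial: int) -> str:
    """Return the real-time status of a certificate serial.

    A serial absent from the database means the CA never issued it,
    which may indicate the signing key was used outside its records.
    """
    return ISSUED_CERTS.get(serial, "unknown")
```

The key property is the fourth possible answer: "unknown" for a serial
the CA has no record of, rather than silently vouching for it.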

The benefits of real-time status checking over downloading potentially
huge CRLs were readily understood by PKI customers and CA vendors. The
problem was that the top three CA vendors at the time, RSA Security,
VeriSign, and Netscape didn't have a comprehensive database of
certificates issued by their software and were only able to generate
blacklist-based CRLs. During the IETF process, OCSP was therefore
redesigned as a bug-compatible front end to be fed by those CRLs.
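The difference between the two designs can be sketched in a few lines.
This is a hypothetical contrast example, not real responder code: a
CRL-fed responder has only a blacklist, so any serial absent from the
CRL comes back "good", including a serial the CA never issued, and the
answer can never be fresher than the CRL that feeds it:

```python
# Hypothetical CRL-fed responder: the CA knows only what it revoked,
# not what it issued, so "not on the blacklist" is reported as "good".
CRL = {0x1002}  # serials the CA has explicitly revoked

def crl_backed_status(serial: int) -> str:
    """Blacklist lookup: can never answer 'unknown' for a forged serial."""
    return "revoked" if serial in CRL else "good"

# A serial the CA never issued still gets a "good" answer:
crl_backed_status(0x9999)  # "good"
```

This is the "bug-compatible front end" behavior: the responder faithfully
reproduces the blind spots of the CRL it is fed.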

Quoting myself here from those days: "learning in 80 ms that the
certificate was good as of a week ago and to not hope for fresher
information for another week seems of limited, if any, utility to us or
our customers".

But that's the best the majority of CA vendor products architecturally
could provide at the time, which caused the IETF process to arrive at
the "rough consensus" that became known as OCSP. The consequences of
that decision are haunting us to this day. OCSP needs a redesign.

o Static lists of trusted CAs
Many certificate-consuming applications employ static lists of trusted
CAs hardcoded into the binaries. Auto-updaters can help greatly in
rapidly pulling compromised roots from deployed systems. Google Chrome's
aggressive auto-updater does an especially good job at that. Still, even
the more aggressive PC software updaters leave vulnerability windows on
the order of days and in some cases weeks. Ample time for the attacker
to harvest webmail and Twitter passwords given average webmail and
Twitter usage patterns.
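The static-trust-list pattern described above amounts to the following
sketch. The fingerprints are made up for illustration; real applications
embed the actual root certificates (or their hashes) in the shipped
binary, which is exactly why pulling a compromised root requires a
software update:

```python
# Hypothetical hardcoded trust anchors, frozen into the binary at build
# time. Removing a compromised root means shipping a new binary.
TRUSTED_ROOT_FINGERPRINTS = frozenset({
    "aa:bb:cc:dd",  # illustrative root CA fingerprint
    "11:22:33:44",  # illustrative root later found to be compromised
})

def chain_root_trusted(root_fingerprint: str) -> bool:
    """Accept a chain only if its root is in the baked-in list."""
    return root_fingerprint in TRUSTED_ROOT_FINGERPRINTS
```

Until the update lands, every client keeps answering True for the
compromised root; the vulnerability window is the update latency.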

Realize that a browser on a Windows or GUI-based Linux distro represents
the best-case and most rapid update scenario.

MacOS users have higher security expectations of their vendor yet place
lower security demands on their vendor, which leads to longer fix cycles
by Apple.

Completely out in the cold are most of the hundreds of millions of
smartphone users, including well over 100 million Android users, whose
software can only be upgraded by the carrier, which rarely happens
during the lifetime of the devices. The vulnerability window for your
average Android user is equal to the length of the hardware replacement
cycle, or somewhere around 18 months depending on which study you
believe. More
than enough time for the attacker to perform whatever attack the
counterfeit SSL server cert is intended to facilitate and have plenty of
time left over.

Even worse off are other embedded and industrial systems, which also
remain vulnerable for the duration of the device's replacement cycle. A
cycle that can span years or even a decade.

Dynamic root CA lists address many of those issues, but see my post from
yesterday about the potential traffic analysis risk associated with this
approach. Likely implementation approaches carry a high risk of leaking
the identity of every visited web site using an in-house CA to the
browser vendor, if not the site cert itself.

o Global CA
Also known as meta-CA, CA-CA, single trusted root, and "the turtle on
which all other turtles stand", limiting the certificate-consuming
applications to a singular root has clear benefits.

The browser vendors could get out of the business of verifying public CA
credentials, saving time and money. Liability could easily be shifted to
that meta-CA. Dynamic root CA lists become unnecessary, unless the
meta-CA is compromised. Could there be another turtle underneath that one?

Downsides include the creation of an even juicier target for attack,
loss of genetic diversity, and a whole host of political challenges that
I believe would rapidly sink such an effort. See the "global
authority issuing credentials" section of my post from yesterday for
details.
Until the next episode of "PKI "fixes" that don't fix PKI",
--Lucky Green
