[cryptography] preventing protocol failings

Nico Williams nico at cryptonector.com
Mon Jul 4 22:54:51 EDT 2011


On Mon, Jul 4, 2011 at 6:28 PM, Sampo Syreeni <decoy at iki.fi> wrote:
> So why don't we make our crypto protocols and encodings *very* simple, so as
> to resist protocol attacks? X.509 is a total mess already, as Peter Gutmann
> has already elaborated in the far past. Yet OpenPGP's packet format fares
> not much better; it might not have many cracks as of yet, but it still has a
> very convoluted packet structure, which makes it amenable to protocol
> attacks. Why not fix it into the simplest, upgradeable structure: a tag and
> a binary blob following it?

Why even have a tag??  The ASN.1 Packed Encoding Rules (PER; think
ONC XDR with 1-byte alignment instead of 4-byte alignment) don't use
tags at all.

Part of the problem with encodings is *redundancy* in the encoding.
Another part is any kind of bags-of-bags-of-items data structure.
Yet another is optionality.  Of these, I'm certain the only one we
can avoid or minimize is encoding redundancy.

In BER/DER/CER/XML you get a lot of redundancy: tag-length-value,
sometimes tag-length-tag-length-value (e.g., when explicit tagging is
used).  XML doesn't do tag-length-value, but still, it's quite
redundant.

In XDR and PER there's much less redundancy -- none if all elements
are fixed-sized, non-extensible, and non-optional in the schema.
Optional elements' presence or absence is denoted by a boolean value
(one bit in PER, 4 bytes in XDR[!]), for example.
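To make that cost concrete, here's a rough Python sketch of the XDR
way of encoding an optional string: a 4-byte boolean presence flag,
then the length-prefixed, 4-byte-padded value.  (The helper name is
mine; this is an illustration, not a conformant XDR codec.)

```python
import struct

def xdr_optional_string(value):
    """Encode an optional string XDR-style: a 4-byte boolean
    presence flag, then (if present) a 4-byte length and the
    data padded out to 4-byte alignment."""
    if value is None:
        return struct.pack(">I", 0)   # absent: 4 whole bytes just to say "no"
    data = value.encode("utf-8")
    pad = (-len(data)) % 4            # XDR pads opaque data to 4-byte alignment
    return struct.pack(">II", 1, len(data)) + data + b"\x00" * pad

# Absence alone costs 4 bytes; PER would spend a single bit.
assert xdr_optional_string(None) == b"\x00\x00\x00\x00"
assert len(xdr_optional_string("hi")) == 12   # 4 (flag) + 4 (length) + 2 + 2 pad
```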

A data structure like:

foo ::= SEQUENCE {
  f1 INTEGER (0..65535),
  f2 UTF8String OPTIONAL,
  f3 OCTET STRING OPTIONAL,
  ...
}

would be encoded quite trivially in PER.  The first element is a
two-byte binary-encoded integer; then comes a single byte holding
the presence bits for f2 and f3 and the extension bit (for the
ellipsis), followed by the length of f2 (if present), then f2, then
the length of f3 (if present), then f3, and finally the length of
any extensions (if present) and those extensions.
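The layout just described can be sketched in Python.  This is a
simplified illustration of the shape only, not a conformant PER
encoder; the function name and the exact bit positions in the
preamble byte are my assumptions.

```python
import struct

def encode_foo_perish(f1, f2=None, f3=None):
    """PER-like layout for the foo SEQUENCE above: one preamble byte
    (extension bit plus presence bits for f2/f3), a fixed 2-byte
    constrained integer, then length-prefixed optional fields.
    NOT a conformant PER codec -- just an illustration."""
    assert 0 <= f1 <= 65535
    preamble = 0                      # bit 7: extension, bit 6: f2, bit 5: f3 (assumed)
    if f2 is not None:
        preamble |= 0x40
    if f3 is not None:
        preamble |= 0x20
    out = bytearray([preamble])
    out += struct.pack(">H", f1)      # exactly 2 bytes: constrained to 0..65535
    if f2 is not None:
        data = f2.encode("utf-8")
        out.append(len(data))         # single-byte length determinant (assumes < 128)
        out += data
    if f3 is not None:
        out.append(len(f3))
        out += f3
    return bytes(out)

# With both optionals absent, the whole thing is 3 bytes: no tags anywhere.
assert encode_foo_perish(258) == b"\x00\x01\x02"
```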

A structure like so:

foo2 ::= SEQUENCE {
   f1 INTEGER (0..65535),
   f2 UTF8String (SIZE (32)),
   f3 OCTET STRING (SIZE (32))
}

has a fixed size, taking exactly 66 bytes in PER (68 in XDR, IIRC),
with exactly the layout that you'd expect.
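Assuming the fields arrive as raw bytes, that fixed 66-byte layout
can be produced with a single struct format string (again an
illustrative sketch, not a real PER codec):

```python
import struct

def encode_foo2(f1, f2, f3):
    """Fixed layout for foo2: every field is fixed-size and mandatory,
    so there are no tags, no lengths, and no preamble -- just
    2 + 32 + 32 = 66 bytes."""
    assert 0 <= f1 <= 65535 and len(f2) == 32 and len(f3) == 32
    return struct.pack(">H32s32s", f1, f2, f3)

assert len(encode_foo2(7, b"a" * 32, b"b" * 32)) == 66
```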

But in BER the size of foo2 is not fixed.  In CER and DER it would
be fixed, though there would be tags and lengths embedded --
elements that must be validated, or else.  foo2 would not be pretty
in XER/XML, either.
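One reason BER/DER sizes vary: an INTEGER's content octets are
minimal-length, so the same constrained field that PER packs into a
flat 2 bytes takes 3 to 5 bytes in DER depending on the value.  A
small sketch (the function name is mine):

```python
def der_integer(n):
    """Minimal DER encoding of a non-negative INTEGER: tag 0x02, a
    length octet, then the shortest big-endian content octets (with a
    leading zero byte when the top bit would otherwise read as a sign)."""
    assert n >= 0
    body = n.to_bytes((n.bit_length() + 8) // 8 or 1, "big")
    return bytes([0x02, len(body)]) + body

assert der_integer(0) == b"\x02\x01\x00"
assert der_integer(127) == b"\x02\x01\x7f"
assert der_integer(128) == b"\x02\x02\x00\x80"      # leading zero keeps it positive
assert der_integer(65535) == b"\x02\x03\x00\xff\xff"
```

So a (0..65535) field ranges over three different encoded sizes in
DER, and every tag and length octet is something a decoder has to
validate.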

If you want to prevent new bugs in these areas, let's start by
putting the venerable BER/DER/CER to rest in the trash bin.  Legacy
will make that a difficult proposition.

Another option is to use ASN.1 compilers wherever one must use ASN.1
at all.  There are several open source ASN.1 compilers nowadays,
incidentally.  Heimdal (h5l.org) has a really neat ASN.1 compiler and
run-time.  (In practice it's very difficult to retrofit ASN.1
compilers into existing code that uses hand-rolled ASN.1
encoders/decoders.)

> Not to mention those interactive protocols, which are even more difficult to
> model, analyze, attack, and then formally verify. In Len's and his spouse's
> formalistic vein, I'd very much like to simplify them into a level which is
> amenable to formal verification. Could we perhaps do it? I mean, that would
> not only lead to more easily attacked protocols, it would also lead to more
> security...and a eulogy to one of the new cypherpunks I most revered.

I think we can certainly produce a decent set of rules of thumb for
cryptographic protocol design -- we have accumulated an enormous
list of "don'ts", and even a few "dos".

Formal verification has been attempted.  A few years ago there was a
presentation at an IETF meeting of a formal analysis of Kerberos.
That analysis showed that there was an interesting defect in Kerberos
(known to some, IIRC), though no one could come up with an attack
based on that defect.  The defect in question related to insufficient
binding or protection of data (which I suspect was probably an
optimization).

If formal analysis tools become generally available, then we should
demand that all new protocols (and old ones, and new extensions to old
ones) be put through them.

Nico
--


