[cryptography] Why anon-DH is less damaging than current browser PKI (a rant in five paragraphs)
guido at wtmnd.nl
Mon Jan 7 12:32:26 EST 2013
I've lost track of who said this:
> >> "Fixing PKI" isn't the problem, PKI itself is the problem. It
> >> doesn't work, and as long as browser vendors keep distracting
> >> themselves by fiddling with even more PKI, they'll never get
> >> around to addressing the actual problem.
Imho, the problem with PKI is not the technology. I believe the
mathematics of public-key cryptography to be sound.
What I consider the source of the problem with the current deployment of
PKI is that I'm _FORCED_ to _TRUST_ the global CAs and all their derived
certificates, whether or not that trust is justified. If I don't place
trust in these global CAs, I can't communicate securely with a website,
nor can I send or receive encrypted email.
Trust cannot be forced; it must be earned. And there is no way for me as
an end-user to express my trust levels for the global CAs. I can't reject
Verisign and decide to use only GoDaddy to prove the 'correctness' of a
server certificate. Not without breaking the web: I get scary error
messages whenever I visit such a site.
I would like to hijack this discussion towards some solutions.
-- Certificate pinning
Pinning helps the big guys and leaves the little guys out in the cold.
Google can include the current certificates of most of the world's banks
to protect against phishing of those banks.
But the browser vendors won't include the certificate of my humble blog.
My readers won't have any protection against MITM attacks such as Phorm,
nor against the Great Firewall of [Country].
Google/Microsoft had better have a watertight procedure for my bank to
submit its certificates, or I'm shafted again.
As a browser is just a form of operating system, how would pinning help
me protect my whole Debian operating system? How do we get the banks'
certificates into Ubuntu, or CentOS? Plan 9? Reliably?
-- Public Certificate repositories: Certificate Transparency /
Perspectives / Convergence
With publicly accessible certificate repositories, I can get the trust
agility that I want: I instruct my browser to validate certificates
against a sufficiently large subset of repositories, and I expect these
to agree for each site I visit.
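That agreement rule could be sketched as a simple quorum check. This is
a toy model, not the actual Perspectives or Convergence protocol; the
notary names and fingerprints are made up for illustration:

```python
from collections import Counter

def quorum_agrees(observations, quorum):
    """observations: dict mapping notary name -> certificate
    fingerprint that notary observed for the site (hex string).
    Returns the agreed fingerprint if at least `quorum` notaries
    report the same one, else None."""
    if not observations:
        return None
    fingerprint, count = Counter(observations.values()).most_common(1)[0]
    return fingerprint if count >= quorum else None

# Three notaries agree; one reports a different certificate (a MITM
# on that notary's network path, or simply a stale observation).
seen = {
    "notary-a": "ab12...",
    "notary-b": "ab12...",
    "notary-c": "ab12...",
    "notary-d": "ff00...",
}
print(quorum_agrees(seen, quorum=3))  # prints ab12...
```

The interesting policy question is choosing the quorum: too low and one
compromised notary can vouch for an attacker, too high and ordinary
disagreement (CDNs, certificate rollovers) breaks the web again.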
Shortly after I installed the Perspectives plugin, I deleted all the
CA certificates in my Firefox and _relied_ on Perspectives to validate
sites, especially the certificate of my bank. Certificate Patrol took
care of remembering the CAs that had signed each site.
A funny thing: when I tried Convergence, it made all certificates look
as if they were signed by Convergence, which broke Certificate Patrol's
validation. So I deleted Convergence. There I exercised my trust
agility, just what Marlinspike promotes. :-)
What I read on the certificate-transparency.org website is that it
intends to limit itself to global CA certificates. I would urge Mr.
Laurie and Google to include all certificates, including self-signed
ones. It would increase the value of CT for me, especially in
combination with DNSSEC/DANE.
-- DNSSEC with DANE/TLSA
I have high hopes for widespread use of DNSSEC with DANE/TLSA. It allows
me to create a self-signed certificate for my humble blog and publish
that certificate in DNS.
With DNSSEC, my readers, who have to trust DNS anyway to reach my site,
can use that same trust to validate its certificate. Readers perform a
DNSSEC tree walk from my domain up to the DNSSEC root trust anchor.
That root is pinned in the resolver library of my browser.
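Publishing such a certificate in DNS comes down to one TLSA record. A
minimal sketch of building its RDATA, assuming the common "3 0 1" form
(DANE-EE trust anchor, full certificate, SHA-256); the record name and
file name are made up:

```python
import hashlib

def tlsa_rdata(der_cert: bytes) -> str:
    """Build the RDATA for a '3 0 1' TLSA record:
    usage 3 (DANE-EE: the certificate itself is the trust anchor,
    no CA involved), selector 0 (full certificate),
    matching type 1 (SHA-256 of the DER encoding)."""
    return "3 0 1 " + hashlib.sha256(der_cert).hexdigest()

# For a real certificate, feed in its DER bytes, e.g.:
#   der = ssl.PEM_cert_to_DER_cert(open("blog.pem").read())
der = b"dummy DER bytes for illustration"
print("_443._tcp.blog.example. IN TLSA " + tlsa_rdata(der))
```

With usage 3, the browser skips chain building entirely: if the hash of
the certificate presented by the server matches the DNSSEC-signed TLSA
record, the connection is authentic.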
To protect against rogue registrars, public certificate repositories can
be used to monitor changes. It would not prevent my government from
pressuring the registrar into silencing my site, but it would make it
visible that this happened.
Even if the government not only takes away my domain name but also
pressures my hosting provider to take down the host, not everything is
lost. I can rent new hosting space and republish under a new domain
name, or even an .onion address.
Although hostname validation would fail, the certificate is the same.
As it is a self-signed certificate, no CA can publish an OCSP response
invalidating it. That allows people to _validate_ that my new site is
indeed the old one that was taken down.
The certificate is the *identity* of the site, not its domain name.
Of course, browsers must remember my site's certificate in a way that is
convenient to the user. We could use a petname system for that.
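A petname store keyed on certificate fingerprints rather than domain
names could look like this (a TOFU-style sketch; names and fingerprints
are invented for illustration):

```python
class PetnameStore:
    """Remember sites by certificate fingerprint, not domain name.
    The petname the user assigned survives a move to a new domain,
    because the store is keyed on the certificate."""
    def __init__(self):
        self._names = {}   # fingerprint -> user-chosen petname

    def assign(self, fingerprint, petname):
        self._names[fingerprint] = petname

    def lookup(self, fingerprint):
        return self._names.get(fingerprint)

store = PetnameStore()
store.assign("ab12...", "Guido's blog")
# The site reappears under a new domain (even a .onion) with the
# same self-signed certificate: the petname still matches.
print(store.lookup("ab12..."))  # prints Guido's blog
```

An unknown fingerprint returns nothing, which is exactly the point: a
MITM presenting a fresh certificate gets no familiar petname shown.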
With DNSSEC/DANE we can migrate from DNS-based identity to
certificate-based identity. And isn't a certificate an identity in
itself?
There is one risk left: the government can take hold of my server's
private key when they impound the server. To mitigate this, I create my
own CA key and certificate and use that to sign the certificate for my
server's key. I publish my local CA certificate in DNSSEC/DANE records
and specify that this CA certificate is the site's identity. Now when
the government seizes the server's private key, I can publish an OCSP
response invalidating it and roll a new certificate for my .onion
address.
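The trust logic of that scheme can be modelled in a few lines. This is
a toy: fingerprints stand in for real certificates and signatures, and
all names are invented:

```python
class LocalCA:
    """Toy model of the scheme above: the CA fingerprint (published
    via DANE) is the site identity; per-server leaf certificates can
    be revoked and replaced without changing that identity."""
    def __init__(self, ca_fingerprint):
        self.identity = ca_fingerprint   # what DANE/TLSA pins
        self._issued = set()
        self._revoked = set()

    def issue(self, leaf_fingerprint):
        self._issued.add(leaf_fingerprint)

    def revoke(self, leaf_fingerprint):
        self._revoked.add(leaf_fingerprint)

    def validates(self, leaf_fingerprint):
        return (leaf_fingerprint in self._issued
                and leaf_fingerprint not in self._revoked)

ca = LocalCA("ca-fp")
ca.issue("server-1")
ca.revoke("server-1")   # the server (and its key) was impounded
ca.issue("server-2")    # new certificate for the .onion mirror
print(ca.validates("server-1"), ca.validates("server-2"))
```

The point the model makes: revocation and reissue happen below the
pinned identity, so readers who pinned the CA fingerprint follow the
site to its new home automatically.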
With this private CA I can do more. I can create extra sites for CDN
usage and sign these with my private local CA key. Then I link (static)
content from my main site to the CDN sites, and the browser can validate
that they are signed by the same CA. The browser can assign different
trust levels to sites signed by different CAs.
This combination of technologies forms the basis of what I call
Eccentric Authentication. But Ecca goes further: it also allows visitors
to sign up with nothing more than a username and a public key,
eliminating the need for passwords.
This second link is my test site. It is https- and IPv6-only, with a
self-signed certificate. You can validate it against DNSSEC/DANE with
the Firefox plugin "Extended DNSSEC Validator 0.8", if you get it
working.
You need a DNSSEC-validating resolver specified in your
/etc/resolv.conf. Installing unbound and specifying 127.0.0.1 does the
trick.
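On a Debian-flavoured system that setup is roughly the following; this
is a config sketch (package and file names assumed for Debian/Ubuntu,
and overwriting resolv.conf may be undone by your network manager):

```shell
# Install unbound; the Debian package sets up the DNSSEC root
# trust anchor and starts the daemon listening on 127.0.0.1.
sudo apt-get install unbound

# Point the system resolver at the local validating resolver.
echo "nameserver 127.0.0.1" | sudo tee /etc/resolv.conf

# Check validation: a signed zone should come back with the
# 'ad' (authenticated data) flag set in the dig header.
dig +dnssec example.com | grep flags
```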
I hope to publish my validating/client certificate handling web proxy soon.
With regards, Guido Witmond.