[cryptography] How are expired code-signing certs revoked?

Steven Bellovin smb at cs.columbia.edu
Fri Dec 9 17:55:05 EST 2011

On Dec 9, 2011, at 5:41:04 PM, Randall Webmail wrote:

> From: "Nico Williams" <nico at cryptonector.com>
>> What should matter is that malware should not be able to gain control
>> of the device or other user/app data on that device, and, perhaps,
>> that the user not even get a chance to install said malware, not
>> because the malware's signatures don't chain up to a trusted CA but
>> because the "app store" doesn't publish it and the user uses only
>> trusted app stores.  Neither of the last two is easy to ensure, though.
> And yet we see things like someone (apparently) sneakernetting a thumb-drive from an infected Internet Cafe to the SIPR network: <http://www.washingtonpost.com/national/national-security/cyber-intruder-sparks-response-debate/2011/12/06/gIQAxLuFgO_story.html>
> If the USG can't even keep thumb drives off of SIPR, isn't the whole game doomed to failure?   (What genius thought it would be a good idea to put USB ports on SIPR-connected boxes, anyway?)

How do you import new intelligence data to it?  New software?  Updates?
New anti-virus definitions?  Patches for security holes?  Your external
backup drive?  Your wireless mouse for classified PowerPoint
presentations (based on
http://www.nytimes.com/2010/04/27/world/27powerpoint.html I suspect that
such things indeed happen....) I've heard tell of superglue in the USB
ports and I've seen commercial software that tries to limit which
specific storage devices can be connected to (of course) Windows boxes.
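That commercial software's "registry of allowed disks" approach boils down to an allowlist keyed on device identifiers, typically USB vendor:product ID pairs. A minimal sketch of the check (the IDs, names, and function here are hypothetical, not any particular product's API):

```python
# Hypothetical allowlist of approved USB storage devices, keyed on
# (vendor_id, product_id) pairs as reported by the device descriptor.
# The IDs below are made-up examples.
ALLOWED_DEVICES = {
    ("0781", "5583"),  # e.g., one specific approved flash-drive model
    ("0951", "1666"),  # e.g., another approved model
}

def is_device_allowed(vendor_id: str, product_id: str) -> bool:
    """Return True only if this exact vendor:product pair is approved."""
    return (vendor_id.lower(), product_id.lower()) in ALLOWED_DEVICES
```

Real enforcement would hook the OS layer that mounts the device (udev rules on Linux, device-installation policies on Windows), and of course VID/PID values can be spoofed by malicious hardware, which is part of why such products only raise the bar rather than solve the problem.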

Yes, one can imagine technical solutions to all of these, like NSA-run
central software servers and restricted machines to which new data can
be introduced and a good registry of allowed disks and banning both
PowerPoint and the mindset that overuses it.  Is that operationally
realistic, especially in a war environment where you don't have adequate
bandwidth back to Ft.  Meade?  (Hunt up the articles on the moaning and
groaning when DoD banned flash drives.)

The purpose of a system is not to be secure.  Rather, it's to help you
accomplish something else.  Being secure is one aspect of helping to
accomplish something, but it's not the only one.  The trick isn't to be
secure, it's to be secure enough while still getting the job done.
Sometimes, relying on training rather than technology is the right
answer.  Obviously, per that article, it wasn't enough, but that doesn't
mean the approach was wrong; perhaps other approaches would have had
even worse failures.

		--Steve Bellovin, https://www.cs.columbia.edu/~smb
