[SSL Observatory] Diginotar broken arrow as a tour-de-force of PKI fail

Larry Seltzer larry at larryseltzer.com
Tue Sep 6 16:36:59 PDT 2011


I've never heard of pinning before. There is a proposed IETF spec for a DNS
record to specify an authorized CA:
http://tools.ietf.org/html/draft-hallambaker-donotissue-03

Some big names on it, but I'm not sure it's actually gone anywhere.
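
For what it's worth, a rough sketch of how a client might consume such a
record. The record layout and library use below are my guesses, loosely
modelled on the CAA syntax in the draft, so treat the details as
illustrative rather than anything the spec mandates:

    # Hypothetical lookup of a CAA-style "authorized CA" record,
    # using the dnspython package.  A zone entry would look roughly like:
    #   example.com.  IN  CAA  0 issue "authorized-ca.example"
    import dns.resolver

    def authorized_cas(domain):
        """Return the CA identifiers the domain has authorized, if any."""
        try:
            answers = dns.resolver.resolve(domain, "CAA")
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            return []   # nothing published; any CA may issue
        return [rr.value.decode() for rr in answers if rr.tag == b"issue"]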

LJS


On Mon, Sep 5, 2011 at 7:35 PM, Ian G <iang at iang.org> wrote:

> On 5/09/11 7:23 PM, Gervase Markham wrote:
>
>> The thing which makes the entire system as weak as its weakest link is
>> the lack of CA pinning.
>>
>
>
> Just a question of understanding:  how is the CA pinning information
> delivered to the browser?
>
> (For those who don't know, I had to look it up too :)  CA pinning is
> where a particular CA is the only one permitted to issue certs for a
> website.  I think it's a very new feature, available in some browsers only?)
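>
> Whatever the delivery channel turns out to be (a preloaded list, a DNS
> record, an HTTP header -- all hypothetical here), the check itself is not
> much code.  A rough sketch; real proposals pin the CA's key or certificate
> hash, and the names below are mine:
>
>     import hashlib
>
>     def chain_matches_pins(chain_der_certs, pinned_sha256_hex):
>         """True if any certificate in the served chain matches a pin."""
>         seen = {hashlib.sha256(cert).hexdigest() for cert in chain_der_certs}
>         return bool(seen & set(pinned_sha256_hex))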
>
>>>  An HSM or smart card that does anything the PC that it's attached to
>>>  tells it to is only slightly more secure than simply storing the key
>>>  directly on the PC.  You need to do more to secure a high-value signing
>>>  process than sprinkling smart card/HSM pixie dust around and declaring
>>>  victory.
>>>
>>
>> This is true, but I'm not sure it's particularly relevant.
>>
>
> Well, what's relevant is whether the security processes are doing the job.
>  Evidence over the last year says no.  Why?
>
> What Peter's saying is that there are signs that the processes are weaker
> than they appear.  One clue is when they go for expensive solutions rather
> than smart solutions, and declare it done.
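>
> To put Peter's point in a toy sketch (hsm_sign here is a made-up stand-in,
> not any vendor's API): the expensive box keeps the key inside, but the
> decision to sign still lives on the host that got owned.
>
>     # Illustrative only.  The HSM never exposes the key, yet it signs
>     # whatever the (compromised) issuing host hands it.
>     def issue_certificate(hsm_sign, csr, checks_passed):
>         if not checks_passed:     # this check runs on the PC, not in the HSM
>             raise ValueError("request refused")
>         return hsm_sign(csr)      # the HSM has no idea what it just signed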
>
>> (Who claims that HSMs are magic pixie dust?)
>>
>
> CABForum, in BR 15.6.  "CA must use an HSM", approx.
>
> Monkey-see-monkey-do.  Which, amusingly, contradicts most of the rest of
> section 15 :)
>
>>> Lack of breach disclosure requirements for CAs means that they'll cover
>>> problems up if they can get away with it:
>>>
>>
>> Do you think that remains true?
>>
>
> We don't know.  There is no full disclosure mechanism, so we don't know
> what is disclosed and what is not.  And even when a full disclosure
> mechanism is in place, we'll need 20 or so events to gain confidence in it.
>
> Recall SB1386?  It actually didn't do anything until 2 years had passed.
>  Then someone panicked.  And attitudes shifted...
>
>> Comodo didn't cover their problems up,
>>
>
> Have they released the full report of the issue?  Has Mozilla?
>
> Or do we just know the headline, and what people have dug up against their
> best wishes?
>
> You saw the chat on the mozilla list: another CA declined to report, dressed
> up with lots of buts, ifs, maybes, not-us's and other rantings.
>
> Non-disclosure is certainly in place.
>
>> and are still in business. DigiNotar covered theirs up, and are not.
>> Covering up is a massive business gamble, because if anyone finds the
>> certs in the wild (as happened here), you are toast. Particularly given
>> that browsers are deploying more technologies like pinning which makes
>> this sort of attack easier to find, it would be a brave CA who covered a
>> breach up after the lesson we had last week. You'd have to be pretty
>> darn confident any misissued certs didn't get obtained by the attackers
>> - and if they didn't get out, is there actually a problem?
>>
>
>
> What is of current concern is that CAs may now be "disclosing" to the
> vendors.  And calling that disclosure.
>
> This is of concern for several reasons:  firstly, it likely puts the
> vendors in a very difficult position, even to the point of corrupting them.
>  Secondly, it creates a liability-shifting mechanism:  the broken CA can now
> point to this as its industry-standard disclosure mechanism (regardless of
> utility and user damages) which reduces its own liability, without a
> commensurate payment; and the vendor now has to take on the risk of suits.
>  Thirdly, it's being done in an ad hoc, knee-jerk fashion, again in secret,
> and there is no particular faith that the parties involved will be able to
> keep their interests off the table.
>
> (For Mozilla alone, private disclosure goes against their principles.)
>
> I'm not denying that disclosure to vendors may help.  But I have no faith
> in the risk managers on the other side to analyse that risk.
>
> If you feel that they can do a good job, post their risk analysis.
>
> Right, I thought so, they haven't done one.  All vendors are in breach of
> the BR.  Doesn't augur well, does it :)
>
>>>  there's nothing protecting the user.  Even the most trivial checks by
>>>  browsers would have caught the fake Google wildcard cert that started
>>>  all this.
>>>
>>
>> What sort of "trivial checks" are you suggesting?
>>
>
> Perhaps CA pinning!  But in the browser :)
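>
> Even a hard-coded table shipped with the browser would have done it here.
> A toy sketch -- the issuer names are my guesses, purely to show the shape
> of the check:
>
>     # Illustrative only: a tiny built-in table of "who may issue for whom".
>     EXPECTED_ISSUERS = {
>         "google.com": {"Google Internet Authority", "Thawte SGC CA"},  # guesses
>     }
>
>     def issuer_looks_wrong(hostname, issuer_common_name):
>         for suffix, allowed in EXPECTED_ISSUERS.items():
>             if hostname == suffix or hostname.endswith("." + suffix):
>                 return issuer_common_name not in allowed
>         return False   # no entry recorded; fall back to ordinary validation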
>
>
>>>  Diginotar both passed audits in order to get on the browser gravy train
>>>  and then passed a second level of auditing after the compromise was
>>>  discovered.  The auditors somehow missed the fact that the Diginotar
>>>  site showed a two-year history of compromise by multiple hacking groups,
>>>  something that a bunch of random commentators on blogs had no problem
>>>  finding.
>>>
>>
>> I think there are definitely searching questions to ask of DigiNotar's
>> auditors.
>>
>
> :)  and, any other CA audited by that organisation.  And any CA audited to
> that standard....
>
> And ... wait, all of them!  Oops!
>
> Short story -- you won't be able to blame the auditor for this.
>
> Sure, you can embarrass them a lot!  But, it's pretty obvious on one
> reading of WebTrust that it's a farce.  It's also pretty obvious from
> reading the BR that an audit would not have picked this up.
>
> We could do it ten times over and it'd still be the same thing.  Audit
> isn't up to solving this problem; it's only up to lifting the basic game of
> low-end CAs to some reasonable best-practices level on the governance side.
>
> (Another sign that the processes aren't doing the job is that CABForum's
> solution is to add more audits.  We're up to 4, now, right?  WebTrust, BR,
> EV, vendor.  Would 5 do it?  6?)
>
>
>>>  available.  There is no fallback.  Site owners who are concerned about
>>>  the security of their users can't do anything, because the browser
>>>  vendors have chosen to prevent them from employing any other option
>>>  (I can't, for example, turn on TLS-PSK or TLS-SRP in my server, because
>>>  no browsers support it - it would make the CAs look bad if it were
>>>  deployed).
>>>
>>
>> Patches welcome? (Or did we reject them already? :-)
>>
>
> Yep, I'm afraid that's the case ;)
>
> It's at the attitude level, not any particular patch.  Patches aren't
> welcome.  E.g., CA pinning was proposed in the mid-00s and people were told
> to go away.  And take their code with them...
>
> I mean, we could be wrong.  But who's going to take the chance and spend a
> month on code, or a year, only to be told no?  Again?  Who's gonna bother to
> fight through the human-shield to get to the coders?
>
> It's up to the vendors, really.  We wait and we watch and we groan.
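>
> For completeness, this is roughly what Peter means by TLS-PSK: shared-key
> authentication with no CA in the loop at all.  A rough sketch of the client
> side, assuming an ssl module with PSK callbacks (Python 3.13+ has them);
> the key, identity and cipher choice are illustrative only:
>
>     import ssl
>
>     psk = bytes.fromhex("deadbeefcafe")            # shared out of band
>
>     ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
>     ctx.check_hostname = False                     # no certificate involved
>     ctx.verify_mode = ssl.CERT_NONE
>     ctx.maximum_version = ssl.TLSVersion.TLSv1_2
>     ctx.set_ciphers("PSK")
>     ctx.set_psk_client_callback(lambda hint: ("client-1", psk))
>
> The server needs the matching key and a set_psk_server_callback; browsers
> support neither, which is the complaint.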
>
>
>
> iang
>