RSA warns developers not to use RSA products

In today’s news of the weird, RSA (a division of EMC) has recommended that developers desist from using the (allegedly) ‘backdoored’ Dual_EC_DRBG random number generator — which happens to be the default in RSA’s BSafe cryptographic toolkit. Youch.

In case you’re missing the story here, Dual_EC_DRBG (which I wrote about yesterday) is the random number generator voted most likely to be backdoored by the NSA. The news is that — despite many valid concerns about this generator — RSA went ahead and made it the default generator used for all cryptography in its flagship cryptography library. The implications for RSA and RSA-based products are staggering. In a modestly bad but by no means worst case, the NSA may be able to intercept SSL/TLS connections made by products implemented with BSafe.
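
For readers who want to see why a known relationship between the generator's two curve points is so devastating, here is a minimal sketch of the attack structure. It uses a toy curve over a 17-element field with made-up constants (nothing to do with the real NIST P-256 parameters) and omits the real generator's 16-bit output truncation, so treat it as an illustration of the math rather than actual attack code.

```python
# Toy illustration of the alleged Dual_EC_DRBG backdoor structure.
# Curve: y^2 = x^3 + 2x + 2 over GF(17); the point (5, 1) generates a group of
# prime order 19.  These are textbook toy constants, NOT the NIST parameters,
# and the real generator's 16-bit output truncation is omitted.

PRIME, A, B = 17, 2, 2
INF = None                      # point at infinity

def inv(x):
    return pow(x, PRIME - 2, PRIME)

def add(P1, P2):
    """Standard affine point addition."""
    if P1 is INF: return P2
    if P2 is INF: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % PRIME == 0:
        return INF
    if P1 == P2:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % PRIME
    else:
        lam = (y2 - y1) * inv(x2 - x1) % PRIME
    x3 = (lam * lam - x1 - x2) % PRIME
    return (x3, (lam * (x1 - x3) - y1) % PRIME)

def mul(k, P1):
    """Double-and-add scalar multiplication."""
    R = INF
    while k:
        if k & 1:
            R = add(R, P1)
        P1 = add(P1, P1)
        k >>= 1
    return R

def lift(x):
    """Recover a curve point with the given x-coordinate (brute force is fine here)."""
    rhs = (x**3 + A * x + B) % PRIME
    for y in range(PRIME):
        if (y * y) % PRIME == rhs:
            return (x, y)

# Whoever defines the constants picks Q and a secret d, and publishes P = d*Q.
Q = (5, 1)
d = 3                          # the trapdoor, known only to the point-chooser
P = mul(d, Q)

def dualec_round(state):
    """One round of the (untruncated) toy generator: (new_state, output_block)."""
    r = mul(state, P)[0]
    return mul(r, P)[0], mul(r, Q)[0]

# An honest user seeds the generator and emits one output block.
state = 5
state, out = dualec_round(state)

# The attacker sees only `out` but knows d.  Lifting the output gives R = r*Q
# (up to sign), and d*R = r*(d*Q) = r*P, whose x-coordinate is the NEW state.
recovered = mul(d, lift(out))[0]
assert recovered == state

# Knowing the state, the attacker now predicts every future output.
clone = recovered
for _ in range(3):
    state, out = dualec_round(state)
    clone, predicted = dualec_round(clone)
    assert predicted == out
print("attacker predicted all subsequent outputs")
```

With the real parameters an attacker additionally has to guess the 16 truncated bits and do a little curve arithmetic per guess, which is a modest amount of computation rather than a cryptographic obstacle.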

So why would RSA pick Dual_EC as the default? You got me. Not only is Dual_EC hilariously slow — which has real performance implications — it was shown to be a just plain bad random number generator all the way back in 2006. By 2007, when Shumow and Ferguson raised the possibility of a backdoor in the specification, no sensible cryptographer would go near the thing.

And the killer is that RSA employs a number of highly distinguished cryptographers! It’s unlikely that they’d all miss the news about Dual_EC.

We can only speculate about the past. But here in the present we get to watch RSA’s CTO Sam Curry publicly defend RSA’s choices. I sort of feel bad for the guy. But let’s make fun of it anyway.

I’ll take his statement line by line (Sam’s words are in quotes):

“Plenty of other crypto functions (PBKDF2, bcrypt, scrypt) will iterate a hash 1000 times specifically to make it slower.”

Password hash functions are deliberately made slow in order to frustrate dictionary attacks. Making a random number generator slow is just dumb.
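
To make the distinction concrete, here is a quick illustration using Python's standard-library PBKDF2 (the password, salt, and iteration counts are arbitrary examples, not recommendations): the iteration count exists purely to make each password guess expensive for an attacker.

```python
import hashlib, os, time

password = b"correct horse battery staple"   # example only
salt = os.urandom(16)

# The iteration count is a deliberate work factor: raising it makes every
# password guess proportionally more expensive for an attacker.
for iterations in (1_000, 200_000):
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
    print(f"{iterations:>7} iterations took {time.perf_counter() - start:.4f}s")
```

A random number generator gets no such benefit from being slow; you just get fewer random bytes per second for the same security.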

“At the time, elliptic curves were in vogue”

Say what?

“and hash-based RNG was under scrutiny.”

Nonsense. A single obsolete hash-based generator (FIPS 186) was under scrutiny — and fixed. The NIST SP800-90 draft in which Dual_EC appeared ALSO provided three perfectly nice non-backdoored generators: two based on hash functions and one based on AES. BSafe even implements some of them. Sam, this statement is just plain misleading.
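
For the curious, here is roughly what one of those perfectly nice alternatives looks like: a heavily stripped-down sketch of SP800-90A's HMAC_DRBG with SHA-256 (reseed counters, personalization strings, and prediction resistance omitted), intended to illustrate the construction rather than to replace a vetted implementation.

```python
import hashlib, hmac, os

class HmacDrbg:
    """Stripped-down sketch of SP800-90A HMAC_DRBG with SHA-256.
    Reseeding, personalization strings, and prediction resistance omitted."""

    def __init__(self, seed: bytes):
        self.K = b"\x00" * 32
        self.V = b"\x01" * 32
        self._update(seed)

    def _mac(self, key: bytes, data: bytes) -> bytes:
        return hmac.new(key, data, hashlib.sha256).digest()

    def _update(self, provided: bytes = b"") -> None:
        self.K = self._mac(self.K, self.V + b"\x00" + provided)
        self.V = self._mac(self.K, self.V)
        if provided:
            self.K = self._mac(self.K, self.V + b"\x01" + provided)
            self.V = self._mac(self.K, self.V)

    def generate(self, n_bytes: int) -> bytes:
        out = b""
        while len(out) < n_bytes:
            self.V = self._mac(self.K, self.V)
            out += self.V
        self._update()
        return out[:n_bytes]

drbg = HmacDrbg(os.urandom(48))       # seed from the OS entropy source
print(drbg.generate(32).hex())
```

Fast, unbiased, built from primitives everyone already trusts, and sitting in the very same document as Dual_EC.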

“The hope was that elliptic curve techniques—based as they are on number theory—would not suffer many of the same weaknesses as other techniques (like the FIPS 186 SHA-1 generator) that were seen as negative”

Dual_EC suffers from exactly the same sort of weaknesses as FIPS 186. Unlike the alternative generators in NIST SP800-90, it has a significant bias and really should not be used in production systems. RSA certainly had access to this information after the analyses were published in 2006.
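
Roughly speaking, the published bias comes from the fact that Dual_EC's output is a barely-truncated x-coordinate, and only about half of all field elements are valid x-coordinates on a curve. A two-minute experiment on a toy curve (arbitrary small parameters, nothing to do with P-256) shows the effect:

```python
# What fraction of field elements occur as x-coordinates on a toy curve
# y^2 = x^3 + a*x + b over GF(p)?  (Arbitrary small parameters.)
p, a, b = 10007, 2, 3

def is_x_coordinate(x: int) -> bool:
    rhs = (x**3 + a * x + b) % p
    # x is an x-coordinate iff the right-hand side is a square mod p
    # (Euler's criterion), or zero.
    return rhs == 0 or pow(rhs, (p - 1) // 2, p) == 1

fraction = sum(is_x_coordinate(x) for x in range(p)) / p
print(f"{fraction:.3f} of field elements are valid x-coordinates")   # about 0.5
```

Because Dual_EC then discards only 16 of the 256 bits per block, that lopsided distribution survives into the output, which is the sort of structure the 2006 analyses exploited to distinguish it from random.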

“and Dual_EC_DRBG was an accepted and publicly scrutinized standard.”

And every bit of public scrutiny said the same thing: this thing is broken! Grab your children and run away!

“SP800-90 (which defines Dual EC DRBG) requires new features like continuous testing of the output, mandatory re-seeding,”

Exactly the same can be said of the hash-based and AES-based alternative generators you DIDN’T choose from SP800-90.

“optional prediction resistance and the ability to configure for different strengths.”

So did you take advantage of any of these options as part of the BSafe defaults? Why not? How about the very simple mitigations that NIST added to SP800-90A as a means to remove concerns that the generator might have a backdoor? Anyone?

There’s not too much else to say here. I guess the best way to put it is: this is all part of the process. First you find the disease. Then you see if you can cure it.

20 thoughts on “RSA warns developers not to use RSA products”

  1. I actually liked the “elliptic curves were in vogue” part. Had other things been equal (though they emphatically were not), I could see making a choice among the four with “EC math is cool; let's go with that one.” Of course things were not close to equal at the time. Dual_EC was slow, harder to implement, and most importantly was proven to have been backdoorable.

    But the “in vogue” reason is the only one offered that I could see actually playing a role at the time. The other reasons presented are obviously post-hoc nonsense that just makes them look stupid.

  2. I think it's time to stop using the word “allegedly”. Of course Dual_EC_DRBG has a backdoor; this is perfectly obvious.

    As you say, the RSA folks are not stupid. For them to make this choice implies that they were incentivized, coerced, or both. This is also perfectly obvious.

    On the bright side, NSA apparently thinks EC discrete log is hard. The whole point, from their perspective, would be to insert a back door that only they could use.

  3. > On the bright side, NSA apparently thinks EC discrete log is hard.

    We only know they believe it to be hard for anyone but them. They might have some insight into elliptic curves that they assume is so advanced that the public is unlikely to discover it in the coming decades.

  4. Not only the public, who are morons, but similar agencies around the world. It is not in the “national security interest” for (e.g.) the U.S. banking system to be vulnerable to (e.g.) Chinese intelligence.

    More relevantly… If EC discrete log were easy for them, they would have no need to insert such an obvious back door; they could instead have generated P and Q in a convincingly random manner and then computed the value they need.

    So I repeat: This is actually pretty strong evidence that NSA considers EC discrete log to be a hard problem, or did in 2006.

  5. I disagree on the “not stupid” part about the NSA. If they had even a basic working knowledge of deception, they would have made sure the generator looked attractive by making it fast, and they certainly would have removed the bias. As it stands, we can only speculate that they forced it into the standard and got RSA to use it as the default through plain, old-fashioned coercion rather than subtlety. It stinks of ham-handed incompetence.

    Quite frankly, the NSA now looks like a bunch of incompetent hacks with a very limited clue, but tons of money and far, far too much power.

  6. With all due respect, I must disagree that this episode shows NSA incompetence. Why do I do so?

    Because it worked.

    Ham-fisted? Yes, a bit. But look: yes, “everyone” knew in 2006 this thing had been NSA backdoored. To make that clear, a colleague of mine who was in prison at the time heard about it… in prison. So this was hardly a secret. However, the goal wasn't to keep it secret from geeky crypto nuts – the goal was to get it into production systems.

    Ahem.

    The reason they didn't make a PSRG that was cleverly backdoored and speedy is that doing so is very difficult – perhaps even not congruent with the way our universe is put together. No doubt, they tried – and they have lots of smart folks working on this stuff. What they got was a backdoored PSRG that more or less worked but was terribly slow – and then they crammed it down NIST's throat, and got it into production as a DEFAULT with RSA… which meant countless folks used it without being aware of it.

    Mission accomplished.

    Sure, people like us (those reading and commenting here) wouldn't have accepted the default and were waving our hands about when it came to this dog of an RNG (which is actually quite disrespectful to dogs). But nobody listened to us back then, really. We did our thing, and worried our worries, and talked to each other breathlessly… and it was just background noise. This thing was “government recommended by an independent agency” and RSA could slide it in without needing any real cover story at all.

    Aaaaa… the days before Snowden, eh?

    Three substantive points:

    1. Well-said, Nemo – this is no longer an “allegedly” backdoored piece of crap. It's NSA backdoored. Even in criminal cases in America, where “proof beyond a reasonable doubt” is the dispositive metric, when someone is convicted by a jury (or pleads guilty), she is no longer “alleged” to have done the deed. She did. She may continue to protest her innocence, of course – but everyone else crosses a grey threshold and drops “alleged” from discourse. The same is true here; this is no mere allegation, and using that weasel-word essentially becomes a euphemism now that so much confirming data – and zero disconfirming data – are available.

    2. Nicely said, again, re the strength of ECC in general. Game theoretically, the NSA wouldn't have burned time and effort and political capital to push this thing into NIST-land if they had magic keys to ECC. Indeed, they'd have let a REAL EC-based PSRG float to the top, since it would have been (even more) widely adopted – having none of the baggage of the extant example. That they didn't do this, and that they did work hard to get this ham-fisted crack in the door, leads by logical interpolation to the conclusion that they didn't at the time (2005-ish) have magic keys. Maybe they found 'em since then, or maybe they had 'em back then but were so secretive about it that even their NIST-bullying teams didn't know about it… but those explanations are less supported by Occam.

    3. RSA didn't do this. Individual people at RSA did this. Who were they? Who had signatory authority over this? Where is he or she today? Why isn't someone from the hallowed halls of “investigative journalism” sticking a microphone in his/her face and asking hard questions? Why isn't someone checking real estate records to see if vacation homes were bought concurrently (mostly metaphorical example)? Accountability means someone is ACCOUNTABLE – a person, not just some amorphous company name.

  7. {continued past character limit of prior post…}

    The people who had their fingers in this particular cookie jar should never, ever be trusted again with anything relating to security. Ever. That sends a strong signal: get caught with your thumb on the scale and NSA calling cards in your pocket, and your career in security is over. Well, not over – you can always get a job as a shill for Hayden and the rest of the cyberwar-military complex, spreading FUD & profiting every step of the way.

    Still and all, we need to know who. So we can learn from this example, as Dr. Green says – and so we can ask them about what motivated their betrayal. That's fair, no?

  8. Good points. I still think incompetent, as it was just too obvious. They should have refrained from putting the backdoored generator in there if they could not make it faster than the others. At that time, yes, only competent people cared, and I have personal experience of how difficult it can be to explain even to a security-conscious company that some crypto implemented by a reputable vendor can be bad.

    But the mind-sets of the semi-competent are always changing, and the occasional panic is normal. If one happens, evidence like that in this case suddenly becomes important. Backdooring crypto needs to be done with a very long-term perspective or you lose the ability. It is much the same as with putting sleeper agents in place (to use a not-quite-accurate analogy). While not certain, it seems the NSA has just lost credibility for a long time, and so have some of their victims/accomplices.

    I do fully agree about personal accountability. If you collaborated in such an immoral action, your personal reputation must go down the drain publicly and permanently. If you have a PhD in CS or a related field, it must be revoked. Your name must turn up in web searches as somebody who significantly damaged the field, and hiring you for a security position must be very risky or basically impossible.

  9. There was nothing semi-competent about the cryptographers working at RSA in 2005. I worked with several. It sickens me to think they might have been complicit, so I choose to believe they were kept out of the loop. Which still requires an active and malicious decision on somebody's part, IMHO.

  10. Indeed. I never meant to call the cryptographers incompetent, but the “active and malicious” party is, IMO. Too greedy, too little sense of self-preservation, no long-term vision. Good for us though, because we now _have_ a smoking gun.

    Still, in the right climate even an ordinarily honorable person can become corrupt or lose perspective. While that is sad, I think we have to start looking at what outside influences and pressure cryptographers are under before trusting them. Good thing is that academics should usually be fine, academic freedom is still a significant protection and I don't think you can coerce somebody decent to go against his or her own field to this degree by refusing to fund projects.

  11. I suspect bribery, either (unlikely) overt (to perhaps just a single programmer) or more likely covert (“Make this the default for government sales”).

    It's almost certainly the latter: somebody either stated or implied that making Dual_EC_DRBG the default was a requirement for government sales. And since this is the “FIPS compliant” version (which the government must buy), such a default would end up being the default for everyone.

    Since Dual_EC_DRBG is actually good if you are the only one with the key to the backdoor (well, almost good; there is still the minor bias issue, but it's 'good enough'), it is a 'safe' backdoor when it's running on government systems.

  12. This also suggests the following: Most crypto products are going to use libraries. Find out which ones by default use Dual_EC_DRBG. This should be known as the “Bribe List”: All these companies took 30 pieces of silver to betray their customers.

  13. RSA got popped a couple of years ago, anyone remember? Mass token replacement under strict NDA?

    I'm sure that RSA managed that incident without any help from anyone apart from their PR people. No reason to talk to NSA about it.

    Could someone remind me why RSA are still in business while Diginotar aren't?

  14. While the Dual_EC_DRBG story is interesting, I think the rabbit hole goes much deeper. It is pretty common knowledge that NSA has “classified mathematics” and cryptanalytic techniques that they have developed independently over many decades. Thus, it is prudent to ask whether or not they have used those techniques to undermine more than just a single PRNG.

    What about public-key systems? Is there some subtle weakness that the “standards” specify that only NSA knows about? What about those algorithms that NSA released (DSA, SHA-1)? Do they have built-in weaknesses? What about block ciphers? Did NIST choose Rijndael because it has some unknown mathematical weakness? We do know that it has a very simple algebraic structure (which has been a major contention of people like Courtois). Does NSA have viable algebraic attacks against it? Its key schedule is also generally referred to as “lousy” by people like Schneier. Nonetheless, the public crypto community at large voted for it, so it's hard to contend that NIST had something up its sleeve. However, NIST has the power to “override” the vote if they so choose. It just so happens they didn't in this case.

    Block ciphers have a lot of research behind them and it's generally accepted that it's pretty hard to hide weaknesses in them. Thus, my bet is on public-keys. I suspect that there may be something hidden in the various public-key generation protocols. It is easier to hide a “trap door” in a public-key system than it is in a block cipher.

    Also, the SHA-3 algorithm, Keccak, uses a primitive (sponges) that doesn't seem to have nearly the research behind it that previous constructions like MD have. Is there some weakness that NSA found in such a construction?

  15. It could also have been as simple as “If you want the government to continue buying millions and millions of dollars' worth of your products every year, you will do the following for us. If you don't, we have a migration path already worked out for one of your competitors' products. The choice is yours.”

    Money talks.

    Thank you for the excellent articles.

    I suppose the question that the masses will ask is: “How do I know if I'm affected?” Of course, the glib answer is that you _are_ affected… but that isn't particularly helpful.

    It would be great if a follow up article could be written that gives the masses help on how to ensure their systems are fixed. The scope of this backdoor is mind boggling…

    Craig
    PS: I'm one of the masses. I never studied crypto, but was a code monkey for years… blindly trusting RSA encryption…
