Here’s a crazy thought – designing and implementing a crypto system that remains secure even if one of the most popular crypto algorithms is broken. Overkill, you say? Imagine if all data flowing over the Internet were in the clear – banking transactions, purchases, and other information you want to keep secret.

I’ve been in the crypto field for 20 years. This is what we do – contemplate ways to secure our communications and then build those systems. The fact that RSA has all but been brought to its knees scares the crap out of me. As a crypto expert, I feel a bit of shame thinking about how we as a collective group got into this vulnerable situation. I get it – the rapid pace of technology change and the high cost of replacing a system that is free and “good enough” kept our attention on other areas. The three pillars of security are confidentiality, availability, and integrity. The system we have today flies in the face of this mantra – it relies on an old algorithm with known vulnerabilities that, in a matter of time, will create the most widespread loss of confidentiality and integrity we’ve ever seen.

What I would like to see is the Internet moving to a system with better crypto algorithm agility. It seems to me that a system where you use two or more public key crypto algorithms in parallel, XORing together the two different pre-master secrets that you send, is going to be much more robust against cryptanalytic advances than any system with a single point of failure. There are good candidate algorithms out there that you might be reluctant to rely on independently but that would make good sense as an insurance policy. We have several good lattice-based encryption schemes, a few good lattice-based signature schemes, some hash-tree-based signature schemes, and some multivariate quadratic signature schemes. All of these could be supported by browsers.
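The XOR-combination idea can be sketched in a few lines. The snippet below is only an illustration: the two 32-byte secrets are generated randomly as stand-ins for the outputs of two real key-exchange algorithms (say, ECDH and an NTRU encapsulation), which the post does not implement. The point is that an attacker who recovers only one of the two secrets learns nothing about the combined pre-master secret.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    if len(a) != len(b):
        raise ValueError("secrets must be the same length")
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-ins for the secrets produced by two independent
# public-key algorithms run in parallel during the handshake.
ecc_secret = secrets.token_bytes(32)   # e.g. from an ECDH exchange
ntru_secret = secrets.token_bytes(32)  # e.g. from an NTRU encapsulation

# Combined pre-master secret: breaking it requires breaking BOTH
# algorithms, because either secret alone acts as a one-time pad
# masking the other.
pre_master = xor_bytes(ecc_secret, ntru_secret)
```

In practice a protocol would feed both secrets through a key-derivation function rather than a raw XOR, but the XOR version already shows the robustness argument: knowing `ecc_secret` alone leaves `pre_master` uniformly random from the attacker’s point of view.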

In my dream world, all SSL handshakes would be based on ECC and NTRU encryption keys, transported in certificates signed by ECDSA, PASS, or hash trees. The certificates would have short lifetimes, and all Certificate Authorities (CAs) would support these algorithms. There would be a well-defined process for adding more algorithms to the mix and for disabling the use of specific algorithms in web browsers, and all signed code would be signed with two or three different algorithms. That way, if one algorithm breaks, organizations won’t need to migrate to a new one in a panic, because the combined system is as strong as the strongest algorithm, not the weakest.
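The multi-algorithm code-signing idea boils down to one verification rule: accept only if every attached signature verifies. Here is a minimal sketch of that rule. The two “schemes” are modeled with HMAC under separate keys purely for illustration (real code signing would pair, say, ECDSA with a hash-based scheme); the key names and helper functions are my own invention, not anything from the post.

```python
import hmac
import hashlib

# Hypothetical stand-ins for two independent signature schemes.
KEY_A = b"scheme-a-demo-key"
KEY_B = b"scheme-b-demo-key"

def sign_a(msg: bytes) -> bytes:
    return hmac.new(KEY_A, msg, hashlib.sha256).digest()

def sign_b(msg: bytes) -> bytes:
    return hmac.new(KEY_B, msg, hashlib.sha256).digest()

def dual_sign(msg: bytes) -> tuple:
    """Attach one signature per scheme."""
    return (sign_a(msg), sign_b(msg))

def dual_verify(msg: bytes, sigs: tuple) -> bool:
    """Accept only if EVERY scheme verifies.

    A forger must then break both schemes, so the pair is as strong
    as the strongest one, not the weakest.
    """
    sig_a, sig_b = sigs
    return (hmac.compare_digest(sig_a, sign_a(msg)) and
            hmac.compare_digest(sig_b, sign_b(msg)))

code = b"binary to be shipped"
sigs = dual_sign(code)
```

The design choice worth noting is the `and` in `dual_verify`: an “accept if any signature verifies” policy would instead be only as strong as the weakest scheme, which is exactly the failure mode the post argues against.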

Some have lobbied for ECC to replace RSA, but as I wrote in a previous blog, there are two issues: it’s still a single point of failure, and ECC is proven to be vulnerable to quantum computers. If we are going to endure a few years of disruption to move to ECC, we’ll just have to repeat the effort 5 or 10 years later. So the question remains – should we be architecting the Internet to cope with the quantum computing threat now, in case such machines arrive sooner than expected, and spare ourselves two migration headaches in the same decade? Come to think of it, this isn’t that crazy of a thought after all.
