Researchers with the Computer Security and Industrial Cryptography (COSIC) group at KU Leuven have managed to break one of the late-stage candidate algorithms for post-quantum encryption. The algorithm, SIKE (short for Supersingular Isogeny Key Encapsulation), made it through most stages of the US Department of Commerce’s National Institute of Standards and Technology (NIST) competition to define standardized post-quantum algorithms that can withstand the threat posed by quantum computers – which will render current encryption schemes obsolete.
The researchers approached the problem from a purely mathematical standpoint, attacking the core of the algorithm’s design instead of any potential code vulnerabilities.
For the mathematically inclined: the researchers cracked SIKE by attacking the math at its core, Supersingular Isogeny Diffie-Hellman (SIDH). They showed that SIDH is vulnerable to the “glue-and-split” theorem developed by mathematician Ernst Kani in 1997, combined with additional mathematical tools devised in 2000. Staying squarely in the mathematical arena, the attack uses genus-2 curves to attack elliptic curves (which are genus-1 curves). According to SIKE co-inventor David Jao, a professor at the University of Waterloo, “The newly uncovered weakness is clearly a major blow to SIKE” – which it is. But he added that all of this comes back to cryptographers’ sometimes imperfect command of pure mathematics, in that the approach taken by the researchers was “really unexpected.”
For the rest of us, all of that means the researchers used math to understand SIKE’s encryption scheme and to predict – and then retrieve – its encryption keys.
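To make “retrieving encryption keys” concrete, here is a deliberately tiny sketch of the idea using classical finite-field Diffie-Hellman – not SIDH or SIKE, and with toy parameters far too small to be secure. A key-recovery attack succeeds when an eavesdropper can reconstruct a private key from public values alone; at toy sizes, even brute force does the job:

```python
# Toy finite-field Diffie-Hellman, for illustration only.
# This is NOT SIDH/SIKE; parameters are arbitrary and insecure by design.
p = 2087          # small prime modulus (toy size)
g = 5             # base element mod p

alice_secret = 123
bob_secret = 456

A = pow(g, alice_secret, p)   # Alice's public value
B = pow(g, bob_secret, p)     # Bob's public value

# Both sides derive the same shared secret without ever transmitting it.
shared_alice = pow(B, alice_secret, p)
shared_bob = pow(A, bob_secret, p)
assert shared_alice == shared_bob

# A key-recovery attack reconstructs a private exponent from public data.
# At this toy size, brute-force search over all exponents works instantly:
recovered = next(x for x in range(p) if pow(g, x, p) == A)
assert pow(B, recovered, p) == shared_alice   # attacker now has the secret
```

The SIDH attack is far more sophisticated – it exploits the extra mathematical structure the scheme exposes rather than searching exhaustively – but the end result is the same: public protocol data suffices to recover the private key.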
For their troubles, and for the paper titled An Efficient Key Recovery Attack on SIDH (Preliminary Version), the researchers received a $50,000 bounty sponsored by Microsoft.
SIKE rose to prominence after becoming one of four additional candidates still under consideration by NIST, shortly after the organization last month officially announced the four algorithms that will replace the currently used RSA, Diffie-Hellman, and elliptic-curve Diffie-Hellman algorithms. Most of the globe’s cybersecurity rests on these algorithms, which have been invaluable in securing information against bad actors – when they aren’t compromised by other means. But they share a fundamental weakness: they will be easily bested once quantum computers scale far enough. And that’s no longer a matter of “if”: it’s a matter of “when.”
“When” is expected to occur within this decade, so there’s a veritable race to harden future computing systems and update the encryption schemes applied to current information.
To put the problem into perspective, consider this: by one current estimate, humanity produced and stored 64 zettabytes (1,000 bytes to the seventh power, or 10^21 bytes per zettabyte) of data in 2020 alone. Almost none of that information is quantum-resistant. But, for now, we’re somewhat insulated from a cascade of quantum-powered hacks by the technology’s complexity – only the best-resourced corporations and most prosperous states have the intellectual, human, and monetary capital required for these systems.
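As a quick sanity check on the unit arithmetic above (the snippet only restates the definitions; the 64-zettabyte figure itself comes from the estimate cited in the text):

```python
# 1 zettabyte = 10^21 bytes = (10^3)^7 bytes, i.e. "1,000 bytes to the seventh power".
ZETTABYTE = 10**21
assert ZETTABYTE == 1000**7

# The estimated global data volume for 2020, per the figure cited above.
estimated_2020_data_bytes = 64 * ZETTABYTE
print(estimated_2020_data_bytes)  # 64000000000000000000000
```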
But the cost will decrease, and close-to-“off-the-shelf” solutions will eventually appear. When they do, any data secured with classical algorithms will have about as much protection as an apple’s skin offers against a bite to its core.
Yet it seems that all it takes to break certain leading-edge designs – namely, SIKE – is a single-core computer and an exotic application of high-level math. That raises the question: are we ready to create standards for such a burgeoning field of computing, where novel approaches and breakthroughs are announced daily?
As Jonathan Katz, an IEEE Member and professor in the department of computer science at the University of Maryland, wrote in an email to Ars Technica, “It is perhaps a bit concerning that this is the second example in the past six months of a scheme that made it to the 3rd round of the NIST review process before being completely broken using a classical algorithm.” Rainbow, the candidate he is referring to, was broken in February this year, although only at a theoretical level. He continued: “Three of the four PQC schemes rely on relatively new assumptions whose exact difficulty is not well understood, so what the latest attack indicates is that we perhaps still need to be cautious/conservative with the standardization process going forward.”
Whether or not the time is ripe for standardization, one thing is certain: intense work on post-quantum cryptography is needed. Just think of the systemic damage a single successful attack on a medium-sized bank – zeroing out all of its records, for instance – would inflict on its very human clients and on the financial system at large. We’re talking about everything from single-parent families and 401(k) savings accounts to small and not-so-small companies.
Few issues are as important as this one – and its cryptographically and mathematically sound exploration is paramount.