It’s Alive! Lattice Cryptography Takes Off To A New Level

It is no monster and is likely to save the Internet from being broken



While he slept on a couch, Ron Rivest had a brainwave for the creation of a trapdoor function for public-key encryption. It was a problem he had continually tried to solve: could he find a method where a special digital key provided a trapdoor in the deciphering process, and where it was extremely difficult to decrypt unless you knew a secret?

RSA is born

And, so, while symmetric-key encryption had been around since the 1960s, the usage of a public key to encrypt and a private key to decrypt became Ron’s key focus. He continually came up with new methods, but his fellow researchers, Adi Shamir and Len Adleman, always found ways around them. But, with this new method, he had found a mathematical approach that would make public-key encryption possible.

Overall, he found that he could generate two prime numbers and multiply them together to create a public modulus. Computers find the task of factorizing a large modulus into its prime factors extremely costly. While computers are built to process only 64 bits at a time, this new method used numbers far larger than that. With the RSA method, he found that anyone who knew these prime numbers could create a trapdoor within the deciphering process, and it all came down to one little piece of magic (Fermat’s little theorem):

g^{p-1} (mod p) = 1

With this, we take a prime number, such as p=11. Then, for any value of g that is not a multiple of p, we will always get a result of 1. For example, when we take g to the power of 10, and then take the mod of 11, we get:

2¹⁰ (mod 11) = 1
3¹⁰ (mod 11) = 1
4¹⁰ (mod 11) = 1
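We can check this identity, and sketch how RSA builds a trapdoor from it, in a few lines of Python. This is a minimal sketch with toy-sized primes chosen purely for illustration (real RSA uses primes of 1,024 bits or more), and it assumes Python 3.8+ for the modular-inverse form of pow():

```python
p = 11
# Fermat's little theorem: g^(p-1) mod p == 1 for any g not divisible by p
for g in [2, 3, 4, 7]:
    assert pow(g, p - 1, p) == 1

# A toy RSA built on the same idea, using two primes p and q
p, q = 11, 13
n = p * q                  # public modulus (hard to factor when p, q are large)
phi = (p - 1) * (q - 1)    # only computable if you know p and q
e = 7                      # public exponent (must be coprime to phi)
d = pow(e, -1, phi)        # private exponent: the "trapdoor" (needs Python 3.8+)

m = 42                     # message
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
print(f"cipher={c}, recovered={pow(c, d, n)}")
```

Anyone can compute c from the public values (e, n), but recovering m requires d, and computing d requires knowing the prime factors of n.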

And, so, the RSA method was born. While it was never going to compete with symmetric-key methods for speed and scale, it found a home within digital signing and allowed us to identify Web sites correctly. Without it, the little green bar in the browser just couldn’t be trusted.

ECC takes over

RSA has since reigned supreme and saw off the rise of the ElGamal public-key encryption method. But, it was Satoshi Nakamoto who saw the opportunity to use a little upstart method: Elliptic Curve Cryptography (ECC). With this, Satoshi could generate a private key, sign a Bitcoin transaction with an ECDSA signature, and have that signature verified against a hashed version of the public key. It was truly magical, and it reduced the need for a repository of public keys (a PKI). Now, anyone could create their own private key, sign a transaction, and have it checked against a published identifier. ECC could also be used in a traditional PKI, and could easily replace RSA.
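Here is a short sketch of that flow, assuming the third-party Python ecdsa package (pip install ecdsa); the transaction bytes are, of course, a stand-in for a real Bitcoin transaction:

```python
# Sketch of ECDSA signing on secp256k1 (the curve Bitcoin uses)
import hashlib
from ecdsa import SigningKey, SECP256k1

sk = SigningKey.generate(curve=SECP256k1)   # private key: a random scalar
vk = sk.get_verifying_key()                 # public key: a point on the curve

tx = b"Alice pays Bob 0.1 BTC"              # stand-in for a real transaction
signature = sk.sign(tx, hashfunc=hashlib.sha256)

# Anyone holding the public key can verify; no central repository is needed
assert vk.verify(signature, tx, hashfunc=hashlib.sha256)
print("signature verified")
```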

These little curves also used prime numbers, and typically took the form of:

y² = x³ + a.x + b (mod p)

This gave us points on an elliptic curve, where we can add a base point to itself many times over, but where it is infeasible to recover how many additions were performed from the resulting point (the elliptic-curve discrete logarithm problem). The maths became extremely simple: we just need one core operation, point addition, and that was it. We could then quickly multiply a point by a large scalar value (the private key) and produce a public key, which was another point on the elliptic curve. Satoshi selected the secp256k1 curve, and the rest is history.
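As a minimal sketch of those two operations, here is point addition and double-and-add scalar multiplication over a toy curve (y² = x³ + 2.x + 3 (mod 97)); the curve, prime, and key are made-up toy values, and real implementations use 256-bit primes and constant-time code:

```python
p = 97                      # field prime (real curves use ~256-bit primes)
a, b = 2, 3                 # curve: y^2 = x^3 + 2x + 3 (mod 97)

def add(P, Q):
    """Add two points on the curve (None is the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Multiply point P by scalar k via double-and-add."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (3, 6)                  # a point on this toy curve (36 = 27 + 6 + 3 mod 97)
priv = 20                   # private key: a scalar
pub = mul(priv, G)          # public key: a point; recovering priv from pub
print(priv, pub)            # is the discrete log problem
```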

RSA and ECC in perfect harmony

And, so, RSA and ECC have co-existed, with RSA happily signing for the identity of Web sites, and ECC used for most other areas. ECC is generally lightweight in computation and key size, whereas RSA is heavier in its requirements (but still acceptable in most applications). But, there are clouds on the horizon for both RSA and ECC, and these are storm clouds. Basically, we have never proven that RSA and ECC rest on truly hard problems; they are just difficult to solve on our existing computer architectures. It was Peter Shor who showed that both RSA and ECC could be broken by a sufficiently large quantum computer. And, so, the race is on to find a solution before quantum computers break every piece of trust that we have on the Internet.

Meet Lattice

So, what’s the solution? Well, NIST has been running a competition to find replacements for RSA and ECC for key exchange/public-key encryption and for digital signatures. Both are fundamental for the security of the Internet, and in creating secure tunnels for data. Without them, we could be spied upon for every bit of data that we send and receive. The main winners were CRYSTALS-Kyber (for key exchange) and CRYSTALS-Dilithium (for digital signatures), both of which are lattice-based methods.

With this, we go back in time, again, and find a truly hard problem: given a multi-dimensional lattice of points, if we add a little error to a given point, it is extremely difficult to find the lattice point that is nearest to the errored point. This problem has been around for a while and underpins Learning With Errors (LWE), where we add a bit of noise into our computation and can then create a trapdoor function with a secret key. Overall, we don’t quite use vectors as we would when dealing with points in a lattice, but convert our values into polynomials, such as the point (3,4,5) being represented by:

3.x² + 4.x + 5

If you remember your maths from school, you can add, subtract, multiply, and even divide polynomials. Thus, we can encrypt with a sequence of arithmetic operations, and then perform the reverse, applying our secret key, to decrypt. If we use enough dimensions in our lattice, it becomes a hard problem, even for a quantum computer.
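As a minimal sketch of the polynomial arithmetic involved, here is addition and multiplication in the ring Z_q[x]/(xⁿ + 1), the kind of operation at the heart of Kyber-style schemes; n and q are toy values here (real schemes use, for example, n=256 and a much larger q):

```python
n, q = 4, 17                # toy parameters, for illustration only

def poly_add(f, g):
    """Add two polynomials, coefficients mod q (index = degree)."""
    return [(a + b) % q for a, b in zip(f, g)]

def poly_mul(f, g):
    """Multiply two polynomials mod (x^n + 1) and mod q."""
    res = [0] * n
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            k = i + j
            if k < n:
                res[k] = (res[k] + a * b) % q
            else:
                # x^n = -1 in this ring, so wrap around with a sign flip
                res[k - n] = (res[k - n] - a * b) % q
    return res

# The point (3,4,5) from above becomes the coefficient list [5, 4, 3, 0]
f = [5, 4, 3, 0]            # 3x^2 + 4x + 5
g = [1, 2, 0, 0]            # 2x + 1
print(poly_add(f, g))       # addition stays cheap ...
print(poly_mul(f, g))       # ... and so does multiplication
```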

The “monster” arises

I say “monster” in the sense of bringing Frankenstein’s monster alive … “It’s Alive!”. But, lattice methods are far from being a monster … they could be our saviour as quantum computers rise. Without them, we could lose virtually all of our trust on the Internet. We could not trust any Web servers, there would be no secure tunnels, and our software could not be trusted. It would be Open Season for hackers!

But, we have built systems which are focused on RSA and ECC, where we need speed, and where we can have special chips to perform their computations. In fact, ECC has been so successful that it can perform its magic even on an extremely limited device. But, the solution is evolving, with technologists in Germany producing a microchip that implements lattice-based methods. This chip uses the open-source RISC-V standard and has been designed to cope with polynomial multiplication. It has been shown to run Kyber around 10 times faster than existing software solutions [1].

A key benefit of using the RISC-V architecture is that it can also support other post-quantum methods, such as SIKE (which has been shown to give a 21x improvement in performance) [2].

The researchers, too, have worked out ways to make sure that there are no backdoor trojans within their implementations [3].

Conclusions

We have passed the point of debating whether post-quantum cryptography is needed. No matter when a quantum computer is built at scale, we need alternatives that we can use now, and so this new microchip moves us forward in building systems which can secure our future. So … “It’s Alive!”

If you are interested in knowing more about post-quantum cryptography, try here:

https://asecuritysite.com/pqc

References

[1] Fritzmann, T., Sigl, G., & Sepúlveda, J. (2020). RISQ-V: Tightly coupled RISC-V accelerators for post-quantum cryptography. IACR Transactions on Cryptographic Hardware and Embedded Systems, 239–280.

[2] Roy, D. B., Fritzmann, T., & Sigl, G. (2020, November). Efficient hardware/software co-design for post-quantum crypto algorithm SIKE on ARM and RISC-V based microcontrollers. In Proceedings of the 39th International Conference on Computer-Aided Design (pp. 1–9).

[3] Hepp, A., & Sigl, G. (2021, May). Tapeout of a RISC-V crypto chip with hardware trojans: a case-study on trojan design and pre-silicon detectability. In Proceedings of the 18th ACM International Conference on Computing Frontiers (pp. 213–220).