Lattice-based cryptography

Lattice-based cryptography is the generic term for asymmetric cryptographic primitives based on lattices. While lattice-based cryptography has been studied for several decades, interest in it has grown as the prospects for a practical quantum computer have improved. Unlike more widely used public-key cryptosystems such as RSA and Diffie-Hellman, which a sufficiently large quantum computer could break efficiently, some lattice-based cryptosystems appear to be resistant to attack by both classical and quantum computers. Further, the Learning with Errors variants of lattice-based cryptography come with security proofs demonstrating that breaking the cryptography is at least as hard as solving known worst-case lattice problems.

History

Lattices were first studied by the mathematicians Joseph Louis Lagrange and Carl Friedrich Gauss, and they have more recently become an important tool in computer algorithms and in cryptanalysis. In 1996, Miklós Ajtai showed in a seminal result how lattices could be used as a cryptographic primitive.[2] In 2009, Craig Gentry proposed the first fully homomorphic encryption scheme, which is based on lattices.[1]

Mathematical background

A lattice L is a set of points in the n-dimensional Euclidean space R^n with a strong periodicity property; equivalently, it is a discrete additive subgroup of R^n. A basis of L is a set of linearly independent vectors such that every element of L is uniquely represented as an integer linear combination of them. When n is at least 2, each lattice has infinitely many different bases. All lattices in R^n have infinitely many elements, whereas in cryptography entities such as the ciphertext, public key, and private key must be taken from a finite space (bit strings of some fixed length). Therefore, the lattices used for cryptography are represented finitely, typically by basis vectors whose coordinates are reduced modulo an integer q and thus lie in a finite field or ring.
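
As a concrete illustration of these definitions, the following sketch (illustrative Python/NumPy code; the basis matrix is an arbitrary choice) enumerates a few lattice points as integer linear combinations of two basis vectors:

    import itertools
    import numpy as np

    # An arbitrary full-rank basis of a 2-dimensional lattice; each row is a basis vector.
    B = np.array([[2, 1],
                  [1, 3]])

    # Every lattice point is an integer linear combination of the rows of B.
    # Here we enumerate the points whose coefficients lie in {-2, ..., 2}.
    for c in itertools.product(range(-2, 3), repeat=2):
        point = np.array(c) @ B        # c[0]*B[0] + c[1]*B[1]
        print(f"coefficients {c} -> lattice point {point}")

Multiplying B on the left by any integer matrix with determinant ±1 (a unimodular matrix) yields another basis of the same lattice, which is why a lattice with n at least 2 has infinitely many bases.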

Cryptographic systems are built on mathematical problems that are hard to solve unless one has extra (secret) information. Two central problems based on lattices are the Shortest Vector Problem (SVP) [given a basis of a lattice, find a shortest nonzero vector in the lattice] and the Closest Vector Problem (CVP) [given a basis of a lattice and a target vector, find the lattice vector closest to the target]. These problems are believed to be hard to solve in general; there are, however, efficient algorithms to solve them when a good basis is available.
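
To make SVP and CVP concrete, the toy sketch below (illustrative Python/NumPy code; the basis and the target vector are arbitrary choices) solves both problems by exhaustive search over a bounded range of coefficients, which is feasible only because the dimension and the search range are tiny:

    import itertools
    import numpy as np

    B = np.array([[201, 37],        # an arbitrary "bad" (long, skewed) basis
                  [1648, 297]])     # of a 2-dimensional lattice
    target = np.array([1000, 200])  # arbitrary target vector for CVP

    shortest, closest = None, None
    for c in itertools.product(range(-50, 51), repeat=2):
        v = np.array(c) @ B
        if any(c):                  # SVP asks for the shortest *nonzero* vector
            if shortest is None or np.linalg.norm(v) < np.linalg.norm(shortest):
                shortest = v
        if closest is None or np.linalg.norm(v - target) < np.linalg.norm(closest - target):
            closest = v

    print("shortest nonzero vector found:", shortest, "norm:", round(np.linalg.norm(shortest), 2))
    print("closest vector to the target: ", closest, "distance:", round(np.linalg.norm(closest - target), 2))

Exhaustive search of this kind scales exponentially with the dimension; the hardness of SVP and CVP in high dimensions is exactly what lattice-based cryptography relies on.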

Lattice basis reduction is a transformation of an integer lattice basis that produces a basis with short, nearly orthogonal vectors. If one can compute such a basis, the CVP and SVP problems become easy to solve. A well-known basis reduction algorithm is the Lenstra–Lenstra–Lovász (LLL) algorithm.
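
The sketch below (illustrative Python/NumPy code, not an implementation of LLL itself) applies Lagrange–Gauss reduction, the two-dimensional ancestor of LLL, to the "bad" basis from the previous example, and then uses Babai's rounding technique to solve CVP. With the reduced basis the rounding step returns a nearby lattice point; with the original skewed basis it does not:

    import numpy as np

    def lagrange_reduce(b1, b2):
        # Lagrange-Gauss reduction: repeatedly subtract an integer multiple of the
        # shorter vector from the longer one until the basis is short and nearly
        # orthogonal. This is the 2-dimensional analogue of LLL.
        while True:
            if b1 @ b1 > b2 @ b2:
                b1, b2 = b2, b1
            m = int(round((b1 @ b2) / (b1 @ b1)))
            if m == 0:
                return np.vstack([b1, b2])
            b2 = b2 - m * b1

    def babai_round(B, t):
        # Babai's rounding: write the target in the given basis, round the
        # coordinates to integers, and map back to a lattice point. This works
        # well only when the basis vectors are short and nearly orthogonal.
        coeffs = np.round(np.linalg.solve(B.T.astype(float), t.astype(float)))
        return coeffs.astype(int) @ B

    B_bad = np.array([[201, 37], [1648, 297]])   # same arbitrary "bad" basis as above
    target = np.array([1000, 200])

    B_good = lagrange_reduce(B_bad[0], B_bad[1])
    print("reduced basis:\n", B_good)
    print("CVP by rounding with the reduced basis:", babai_round(B_good, target))
    print("CVP by rounding with the bad basis:    ", babai_round(B_bad, target))

LLL generalizes this two-dimensional procedure to arbitrary dimension in polynomial time, but the quality of the basis it returns degrades roughly exponentially with the dimension, which is consistent with the presumed hardness of SVP and CVP in high dimensions.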

Lattice-based cryptosystems

On June 25, 2009, IBM announced that Craig Gentry, using lattice-based cryptography, had constructed the first fully homomorphic encryption scheme.[3][4][5] His scheme supports evaluation of circuits of arbitrary depth.

Encryption

  • GGH encryption scheme
  • NTRUEncrypt

Signature

  • GGH signature scheme
  • NTRUSign

Hash function

  • SWIFFT

Security issues[8]

Lattice-based cryptographic constructions hold great promise for post-quantum cryptography. Many of them are quite efficient, and some even compete with the best known alternatives; they are typically quite simple to implement; and all are believed to be secure against attacks using conventional or quantum computers.

In terms of security, lattice-based cryptographic constructions can be divided into two types. The first includes practical proposals, which are typically very efficient, but often lack a supporting proof of security. The second type admits strong provable security guarantees based on the worst-case hardness of lattice problems, but only a few of them are sufficiently efficient to be used in practice.

The NTRU family of algorithms noted above is efficient but lacks a strong security guarantee. Basic Learning with Errors (LWE) designs have strong security guarantees but require impractically large keys. Designs based on Learning with Errors over rings (Ring Learning with Errors, or Ring-LWE) offer very efficient computation, moderately sized keys, and a strong proof of security.
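
To give a flavour of how an LWE-based construction works, the following is a minimal, insecure toy sketch in the spirit of Regev's LWE encryption scheme (illustrative Python/NumPy code; the parameters n, m, and q are arbitrary and far too small for real security):

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, q = 8, 32, 257        # toy parameters: secret dimension, number of samples, modulus

    # Key generation: the public key is a list of "noisy" inner products with the secret.
    s = rng.integers(0, q, n)                 # secret key s in Z_q^n
    A = rng.integers(0, q, (m, n))            # public random matrix
    e = rng.integers(-1, 2, m)                # small errors in {-1, 0, 1}
    b = (A @ s + e) % q                       # b = A*s + e (mod q); public key is (A, b)

    def encrypt(bit):
        # Take a random 0/1 combination of the public samples and hide the bit
        # in the high-order part of the second component.
        r = rng.integers(0, 2, m)
        u = (r @ A) % q
        v = (r @ b + bit * (q // 2)) % q
        return u, v

    def decrypt(u, v):
        # v - <u, s> = bit*(q//2) + small error (mod q); round to recover the bit.
        d = (v - u @ s) % q
        return 1 if q // 4 < d < 3 * q // 4 else 0

    for bit in (0, 1):
        u, v = encrypt(bit)
        print(f"bit {bit} -> decrypted as {decrypt(u, v)}")

Decryption succeeds because the accumulated error (at most m in absolute value here) stays below q/4. Ring-LWE schemes follow the same pattern but replace the matrix A with multiplication in a polynomial ring, which shrinks the keys and speeds up the arithmetic.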

Worst-case hardness

Worst-case hardness of lattice problems means that breaking the cryptographic construction (even with some small non-negligible probability) is provably at least as hard as solving several lattice problems (approximately, to within polynomial factors) in the worst case. In other words, breaking the cryptographic construction implies an efficient algorithm for solving any instance of some underlying lattice problem. In most cases, the underlying problem is that of approximating lattice problems such as SVP to within polynomial factors, which is conjectured to be a hard problem. Such a strong security guarantee is one of the distinguishing features of lattice-based cryptography.

The importance of the worst-case security guarantee is twofold. First, it assures us that attacks on the cryptographic construction are likely to be effective only for small choices of parameters and not asymptotically. In other words, it assures us that there are no fundamental flaws in the design of our cryptographic construction. In fact, in some cases, the worst-case security guarantee can even guide us in making design decisions. Second, in principle the worst-case security guarantee can help us in choosing concrete parameters for the cryptosystem, although in practice this leads to what seems like overly conservative estimates and one often sets the parameters based on the best known attacks.

Quantum algorithms and lattices

Lattice problems are typically quite hard: the best known algorithms either run in exponential time or provide rather poor approximation ratios. The field of lattice-based cryptography has been developed based on the assumption that lattice problems are hard. There are currently no known quantum algorithms for solving lattice problems that perform significantly better than the best known classical (i.e., non-quantum) algorithms. This is despite the fact that lattice problems seem like a natural candidate for quantum algorithms: they are believed not to be NP-hard for typical approximation factors, they have a periodic structure, and the Fourier transform, which is usually exploited in quantum algorithms, is tightly related to the notion of lattice duality.
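
For reference, the dual of a lattice L, the object behind the "lattice duality" mentioned above, can be written (in LaTeX notation) as

    L^{*} = \{\, x \in \operatorname{span}(L) : \langle x, v \rangle \in \mathbb{Z} \ \text{for all } v \in L \,\}

The Fourier transform of a function that is periodic over L is supported on frequencies drawn from (a scaling of) the dual lattice L^{*}, which is the connection between the Fourier transform and lattice duality alluded to above.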

Attempts to solve lattice problems by quantum algorithms have been made since Shor's discovery of the quantum factoring algorithm in the mid-1990s, but have so far met with little, if any, success.

See also

References

  1. Lua error in package.lua at line 80: module 'strict' not found.
  2. Miklós Ajtai. Generating Hard Instances of Lattice Problems. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing (STOC), 1996.
  3. Craig Gentry. Fully Homomorphic Encryption Using Ideal Lattices. In the 41st ACM Symposium on Theory of Computing (STOC), 2009.
  4. http://www-03.ibm.com/press/us/en/pressrelease/27840.wss
  5. Lua error in package.lua at line 80: module 'strict' not found.
  6. Lua error in package.lua at line 80: module 'strict' not found.
  7. Lua error in package.lua at line 80: module 'strict' not found.
  8. Daniele Micciancio and Oded Regev. "Lattice-based Cryptography". In Post-Quantum Cryptography, Springer, 2009.

Bibliography

  • Oded Goldreich, Shafi Goldwasser, and Shai Halevi. "Public-key cryptosystems from lattice reduction problems". In CRYPTO ’97: Proceedings of the 17th Annual International Cryptology Conference on Advances in Cryptology, pages 112–131, London, UK, 1997. Springer-Verlag.
  • Phong Q. Nguyen. "Cryptanalysis of the Goldreich–Goldwasser–Halevi cryptosystem from crypto ’97". In CRYPTO ’99: Proceedings of the 19th Annual International Cryptology Conference on Advances in Cryptology, pages 288–304, London, UK, 1999. Springer-Verlag.
  • Chris Peikert, “Public-key cryptosystems from the worst-case shortest vector problem: extended abstract,” in Proceedings of the 41st annual ACM symposium on Theory of computing (Bethesda, MD, USA: ACM, 2009), 333–342, http://portal.acm.org/citation.cfm?id=1536414.1536461.
  • Oded Regev. Lattice-based cryptography. In Advances in cryptology (CRYPTO), pages 131–141, 2006.