Lattice-Based Cryptography
Lead Research Organisation:
Royal Holloway University of London
Department Name: Information Security
Abstract
How long does the BKZ algorithm run? What sounds like a rather niche question, of interest only to theoretical computer scientists, is in fact a central open problem that needs to be resolved to keep the digital economy and private life safe. While largely hidden from view, cryptography underpins many aspects of modern life such as commerce, banking, governance and long-distance personal communication. The cryptographic schemes securing digital communication, in turn, rely on one of two hard mathematical problems at their core. However, these mathematical problems, while still difficult to solve on a normal computer, are easy to solve on a quantum computer: in 1994, Peter Shor presented an algorithm for solving these problems - factoring and discrete logarithms - efficiently, essentially regardless of how big we choose the parameters, i.e. he found a polynomial-time quantum algorithm.
To date, nobody has announced a quantum computer big enough to run Shor's algorithm on any non-trivial problem, and it remains unclear whether building one is possible at all. Still, recent theoretical and practical progress in quantum computing has many people concerned. One motivation is the following scenario: an attacker could collect encrypted traffic now and store it until sufficiently big quantum computers are available. Once that is the case, the attacker can use their capabilities to decrypt the stored ciphertexts. Thus, if encryption ought to provide security well into the future, it might be under threat from quantum computers already ... even if they do not exist yet. Some estimates foresee the first quantum computer powerful enough to break real RSA keys arriving as early as 2030. On the other hand, the adoption of new cryptography often takes decades. Thus, the time to address this problem is now.
A second challenge for current-generation cryptography is a change in usage patterns. In recent years, cloud services have become increasingly relevant. These have brought significant privacy challenges, as such services rely on having access to personal data to add value. Ideally, we would like to utilise the power of third-party services without handing over sensitive private data.
Lattice-based cryptography is a key building block for resolving both of these challenges. From hard lattice problems we can build cryptosystems which are believed to be secure even against quantum attackers. These cryptosystems also enable computation on encrypted data, known as "fully homomorphic encryption". In both of these areas, standardisation efforts are currently underway to enable widespread adoption of these schemes.
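To illustrate the first claim, here is a toy sketch (plain Python with numpy; all parameters are illustrative values chosen for this write-up and are far from secure) of a Regev-style encryption scheme whose security rests on the Learning with Errors problem, which in turn is based on hard lattice problems. Note how adding two ciphertexts yields an encryption of the XOR of the plaintext bits - the germ of fully homomorphic encryption.

```python
# Toy Regev-style LWE encryption of single bits (illustrative parameters,
# NOT secure). Security rests on the Learning with Errors problem, which
# is based on hard lattice problems.
import numpy as np

n, q, m = 32, 3329, 64                  # dimension, modulus, number of samples
rng = np.random.default_rng(0)

s = rng.integers(0, q, n)               # secret key
A = rng.integers(0, q, (m, n))          # public random matrix
e = rng.integers(-2, 3, m)              # small noise
b = (A @ s + e) % q                     # public key: (A, b)

def encrypt(bit):
    r = rng.integers(0, 2, m)           # random subset of the samples
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q    # encode the bit in the high-order part
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                 # = bit*(q//2) + small noise
    return int(q // 4 < d < 3 * q // 4)

u0, v0 = encrypt(1)
u1, v1 = encrypt(1)
# Adding ciphertexts adds plaintexts mod 2 (while the noise stays small).
usum, vsum = (u0 + u1) % q, (v0 + v1) % q
print(decrypt(u0, v0), decrypt(usum, vsum))   # prints: 1 0  (= 1 XOR 1)
```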
However, before we can do that, we need to refine our understanding of how long it would take an attacker to break these schemes. Practical cryptographic schemes are never unconditionally secure; rather, they are "secure enough", where "secure enough" can mean different things depending on the desired performance/security trade-off. Thus, we want to make sure that breaking a scheme would take too long to be feasible, while not picking our parameters so big as to slow down our communications unduly. The question "How long would it take an attacker to break the next generation of encryption schemes?" is the same as the initial question - "How long does the BKZ algorithm run?" - since the BKZ algorithm is the preeminent algorithm with which an attacker would attempt to break lattice-based cryptography. Currently, the cryptographic community disagrees on the true cost of this algorithm. Thus, this project sets out to resolve this question so that we can deploy the next generation of cryptography with confidence.
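To make the central question concrete, the sketch below runs BKZ progressively with increasing block size on a random q-ary lattice using the open-source fpylll library (part of the fplll ecosystem linked further down) and records the running time and the length of the shortest basis vector found. The dimension and block sizes are illustrative toy values; cryptographically relevant block sizes run into the hundreds, and how the running time scales there is exactly what is in dispute.

```python
# A minimal BKZ experiment with fpylll (toy dimension and block sizes).
# Each pass reduces the same basis further ("progressive BKZ").
import time
from fpylll import IntegerMatrix, LLL, BKZ

d = 60
A = IntegerMatrix.random(d, "qary", k=d // 2, bits=20)
LLL.reduction(A)                                  # cheap preprocessing

for beta in (10, 20, 30):
    t0 = time.perf_counter()
    BKZ.reduction(A, BKZ.Param(block_size=beta))
    t = time.perf_counter() - t0
    # length of the first (shortest found) basis vector
    norm = sum(A[0][i] ** 2 for i in range(A.ncols)) ** 0.5
    print(f"beta={beta:2d}  time={t:7.2f}s  |b_1|={norm:.1f}")
```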
Planned Impact
This research is concerned with establishing the security of post-quantum constructions to replace e.g. RSA and discrete-logarithm-based schemes. Such alternatives are currently being sought by standardisation bodies such as NIST in the US and ETSI in the EU. In particular, NIST is currently evaluating post-quantum candidates, and NIST cryptographic standards have previously become de facto world-wide standards (e.g. AES, SHA-3). Strong contenders in this area are lattice-based schemes.
Similarly, homomorphic encryption (computing with encrypted data) is in the process of being standardised, after a rapid pace of innovation since the first proposal was published in 2009.
Thus, the primary impact of this research will be to inform these standardisation processes, which will decide on the next generation of cryptographic schemes. Through this channel, this research will impact public-key cryptography in the long term. Impact will be achieved by publishing in high-profile academic venues, which are watched closely by standards bodies, and by engaging in the standardisation process itself, e.g. through attending and contributing to the relevant meetings.
Government bodies will be impacted by the research performed here as it will provide material for their own security evaluation of post-quantum and homomorphic candidates.
End-users will be affected when the security mechanisms in their devices and software products change. However, it is expected that, at a high level, their user experience will not change, i.e. cryptography usually works hidden from the view of the average user.
People | Martin Albrecht (Principal Investigator)
Publications
Albrecht M (2020) Multilinear Maps from Obfuscation, in Journal of Cryptology
Albrecht M (2019) Implementing RLWE-based Schemes Using an RSA Co-Processor, in IACR Transactions on Cryptographic Hardware and Embedded Systems
Albrecht M (2023) Variational quantum solutions to the Shortest Vector Problem, in Quantum
Description | There are two main classes of algorithms for solving the hard lattice problems which underpin much of post-quantum and advanced cryptography. Lattice sieving builds a large database of lattice vectors and then searches for collisions in this database; it requires exponential memory and time. Lattice-point enumeration is a form of brute-force search utilising the geometry of lattices; it requires only polynomial memory but runs in super-exponential time. (A toy sketch contrasting the two families follows after this block.) The key findings are:
- Lattice sieving outperforms lattice-point enumeration already in relatively low dimensions; previously, it had not been clear when the former outperforms the latter. Sieving is thus the family of algorithms that must be considered when selecting parameters for lattice-based cryptography.
- The cost of lattice-point enumeration is (asymptotically) lower than previously believed; this had been an open question for 10 years. This extends the range of parameters where -- on a quantum computer -- this class of algorithms outperforms lattice sieving. However, for selecting parameters, lattice sieving is still the main algorithm to consider due to its performance on classical computers.
- There are significant overheads associated with lattice sieving on a quantum computer, which can reduce its advantage over classical computers. |
Exploitation Route | These results are already being used by others. For example, the repository linked below has 39 stars on GitHub and 10 forks. |
Sectors | Digital/Communication/Information Technologies (including Software) |
URL | https://github.com/fplll/g6k |
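As promised above, here is a toy contrast of the two algorithm families in plain Python (a 3-dimensional example with made-up numbers; real instances have hundreds of dimensions, and the real algorithms prune the enumeration tree and use nearest-neighbour search inside the sieve):

```python
# Toy contrast: enumeration vs. sieving (illustrative only).
import itertools
import random

B = [[101, 0, 0], [57, 1, 0], [43, 0, 1]]   # a small 3-dimensional basis

def norm2(v):
    return sum(x * x for x in v)

def combine(c, B):
    return [sum(ci * bi[j] for ci, bi in zip(c, B)) for j in range(len(B))]

# Enumeration: brute-force search over small coefficient vectors.
# Polynomial memory, (super-)exponential time in the dimension.
best = min((combine(c, B) for c in itertools.product(range(-5, 6), repeat=3)
            if any(c)), key=norm2)
print("enumeration:", best, norm2(best))

# Sieving: keep a big database of lattice vectors and replace entries by
# shorter pairwise differences ("collisions"). Exponential memory and time.
# (A simplistic pairwise loop stands in for a real sieve here.)
random.seed(0)
db = [combine([random.randint(-5, 5) for _ in range(3)], B) for _ in range(200)]
db = [v for v in db if any(v)]
for _ in range(10):                           # a few sieve iterations
    for i in range(len(db)):
        for w in db:
            d = [a - b for a, b in zip(db[i], w)]
            if any(d) and norm2(d) < norm2(db[i]):
                db[i] = d
shortest = min(db, key=norm2)
print("sieve:      ", shortest, norm2(shortest))
```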
Description | The NIST Post-Quantum Standardization Process aims to generate a portfolio of strong and practical cryptographic algorithms that the international community regards as secure in the post-quantum world. The process began in 2016 and takes the form of an international public competition split into rounds: groups of cryptographers submit algorithms, comments are invited, some submissions are removed in the next round and others are merged, and the last two steps are repeated. There have been three rounds in total, and a group of finalists (submissions from Round 3 that are most highly recommended) was identified on 22 July 2020; 53% of Round 3 proposals and 73% of the finalists were influenced by work undertaken in this project and reference its publications. The impact lies in providing assurance for parameter selection by establishing the costs of "attacking" the schemes under consideration for standardisation. |
First Year Of Impact | 2019 |
Sector | Digital/Communication/Information Technologies (including Software) |
Impact Types | Societal, Economic
Description | Foundational technology in Homomorphic Encryption Security Standard |
Geographic Reach | Multiple continents/international |
Policy Influence Type | Citation in other policy documents |
Impact | Homomorphic encryption is a key technology to enable privacy-preserving data processing and our work is routinely relied upon to establish how to pick parameters for such schemes. |
URL | https://homomorphicencryption.org/standard/ |
Description | (PROMETHEUS) - PRivacy preserving pOst-quantuM systEms from advanced crypTograpHic mEchanisms Using latticeS |
Amount | € 5,496,969 (EUR) |
Funding ID | 780701 |
Organisation | European Commission |
Sector | Public |
Country | European Union (EU) |
Start | 01/2018 |
End | 06/2022 |
Description | ERC Consolidator Grant |
Amount | € 1,900,000 (EUR) |
Funding ID | 101079668 |
Organisation | European Commission |
Sector | Public |
Country | European Union (EU) |
Start | 08/2023 |
End | 08/2028 |
Title | Sieving costs in various classical/quantum cost models |
Description | This dataset collects the "costs" of performing the central step in a lattice sieve in various cost models. Lattice sieving is the asymptotically fastest technique for solving hard lattice problems, and to deploy lattice-based cryptographic schemes in practice we need to understand its concrete costs on quantum and on classical computers. There are many parameters to optimise, and this dataset contains optimised costs for dimensions up to 1000 (which suffices for cryptography).
Type Of Material | Database/Collection of data |
Year Produced | 2020 |
Provided To Others? | Yes |
Impact | This dataset is used by others to estimate costs of lattice sieving, e.g. https://pq-crystals.org/kyber/data/kyber-specification-round3-20210804.pdf or our own https://github.com/malb/lattice-estimator/ |
URL | https://github.com/jschanck/eprint-2019-1161 |
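For orientation, the leading-order asymptotic exponents from the literature - 0.292d classical (Becker-Ducas-Gama-Laarhoven 2016) and roughly 0.265d quantum (Laarhoven 2015) - can be tabulated directly, as in the sketch below; the dataset above refines these figures with the lower-order terms and overheads that matter in practice.

```python
# Leading-order asymptotic cost exponents for lattice sieving, 2^(c*d):
# classical c = 0.292 (Becker-Ducas-Gama-Laarhoven 2016), quantum
# c = 0.265 (Laarhoven 2015). These ignore o(d) terms and the concrete
# overheads that the dataset above quantifies.
C_CLASSICAL, C_QUANTUM = 0.292, 0.265

for d in (400, 600, 800, 1000):
    print(f"d={d:4d}  classical ~2^{C_CLASSICAL * d:5.1f}"
          f"  quantum ~2^{C_QUANTUM * d:5.1f}")
```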
Title | Lattice Estimator |
Description | This Sage module provides functions for estimating the concrete security of Learning with Errors instances. The main purpose of this estimator is to give designers an easy way to choose parameters resisting known attacks and to enable cryptanalysts to compare their results and ideas with other techniques known in the literature. |
Type Of Technology | Software |
Year Produced | 2021 |
Open Source License? | Yes |
Impact | This library (and its predecessor https://bitbucket.org/malb/lwe-estimator/) is routinely relied upon in the literature and by practitioners to assess the security of parameter sets for lattice-based cryptographic schemes. See e.g. https://twitter.com/zama_fhe/status/1474380108748177417 for the perspective of practitioners.
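A minimal usage sketch for the estimator, following the repository's README at the time of writing (the exact API may differ between versions; the module runs under Sage):

```python
# Estimate the cost of attacks on a built-in parameter set; following the
# lattice-estimator README (API may differ between versions). Run in Sage.
from estimator import *

params = schemes.Kyber512          # built-in parameters for Kyber-512
LWE.estimate.rough(params)         # quick estimate across the main attacks
LWE.primal_usvp(params)            # cost of the primal uSVP attack via BKZ
```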
Title | The General Sieve Kernel (G6K) |
Description | Lattice reduction software that implements the most recent theoretical and practical advancements. |
Type Of Technology | Software |
Year Produced | 2019 |
Open Source License? | Yes |
Impact | It (or variants of it) - at the time of writing - holds records for the largest instances of certain lattice reduction tasks critical to the security evaluation of post-quantum cryptography. It has already spurred the development of implementations on different hardware and shown the community which research directions seem to lead to the best practical performance for lattice-based cryptanalytic tasks.
URL | https://eprint.iacr.org/2019/089 |