The UK National Cyber Security Centre (NCSC) has published updated guidance to help system and risk owners plan their migration to post-quantum cryptography (PQC). The guidance builds on the NCSC's 2020 white paper, Preparing for Quantum-Safe Cryptography, and includes advice on algorithm choices and protocol considerations following the availability of draft standards from the US National Institute of Standards and Technology (NIST).
The point at which quantum computers will be capable of breaking existing cryptographic algorithms such as public-key cryptography (PKC) – known as “Q-Day” – is approaching. It’s a juncture that has been discussed for years, but with advances in computing power, post-quantum threats are becoming very real. Some security experts believe Q-Day will occur within the next decade, potentially leaving all digital information vulnerable under current encryption protocols.
PQC is therefore high on the agenda as the security community works to understand, build, and implement encryption that can withstand the quantum-enabled threats and attacks of the future. Multiple notable initiatives, programs, standards, and resources have been launched this year to support the development of, and migration to, PQC.
In August, NIST published draft PQC standards that are designed as a global framework to help organizations protect themselves from future quantum-enabled cyberattacks. The standards were selected by NIST following a seven-year process that began when the agency issued a public call for submissions to the PQC Standardization Process. NIST again called for public feedback on three draft Federal Information Processing Standards (FIPS), which are based upon four previously selected encryption algorithms.
Migration to PQC requires more than just new algorithms
Migration to PQC requires more than just new algorithms – protocols and services need to be re-engineered, because PQC typically places greater demands on devices and networks than traditional PKC, wrote John H, head of crypt research at the NCSC. “This is especially true of the amount of data that needs to be communicated between parties using PQC to secure their communications.” International bodies have been working to update protocol standards in parallel with the development of algorithm standards, which is enabling test deployments of PQC by major service providers to understand the potential impacts of the transition, John H added.
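To give a sense of scale for those data demands, the short Python sketch below compares approximate key and signature/ciphertext sizes for a traditional handshake with their draft-standard PQC counterparts. The figures are indicative values drawn from the draft FIPS documents and common elliptic-curve parameters, not from the NCSC guidance itself.

```python
# Approximate sizes, in bytes, of the public values exchanged during a
# handshake. Indicative figures only: PQC values are taken from the
# draft FIPS documents, traditional values are typical raw ECC sizes.
HANDSHAKE_SIZES = {
    # scheme: (public/encapsulation key, ciphertext or signature)
    "X25519 (traditional key establishment)": (32, 32),
    "ML-KEM-768 (draft FIPS 203)": (1184, 1088),
    "ECDSA P-256 (traditional signature)": (64, 64),
    "ML-DSA-65 (draft FIPS 204)": (1952, 3309),
}

for scheme, (key_bytes, payload_bytes) in HANDSHAKE_SIZES.items():
    print(f"{scheme:40s} key: {key_bytes:5d} B   payload: {payload_bytes:5d} B")
```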
While not straightforward, upgrading many major internet services (and the apps that access those services) will likely be one of the “easier” parts of the PQC transition, John H said. “Many legacy and sector-specific protocols, including those used in critical national infrastructure (CNI), will also need to transition to PQC. Additional challenges in these use cases include having to run cryptography on devices with constrained resources, and on legacy systems that are hard to upgrade.”
Implications of PQC migration for users and system owners
For users of commodity IT, such as those using standard browsers or operating systems, the switchover to PQC will be delivered as part of a software update and should happen seamlessly (ideally without end-users even being aware), the NCSC’s updated guidance stated. To ensure devices move to PQC when it becomes available, system owners should keep devices and software up to date. “System owners of enterprise IT, such as those who own IT systems designed to meet the demands of a large organisation, should communicate with their IT system suppliers about their plans for supporting PQC in their products,” it added.
For a minority of systems with bespoke IT or operational technology, such as those that implement PKC in proprietary communications systems or architectures, choices will need to be made by system and risk owners as to which PQC algorithms and protocols are best to use, the NCSC said. “Technical system and risk owners of both enterprise and bespoke IT should begin or continue financial planning for updating their systems to use PQC. PQC upgrades can be planned to take place within usual technology refresh cycles once final standards and implementations of these standards are available.”
Choosing algorithms and parameters for your use cases
The following table gives the NCSC's recommended algorithms, their functions, and specifications:
| Algorithm | Function | Specification |
| --- | --- | --- |
| ML-KEM | Key establishment algorithm | NIST Draft – FIPS 203 |
| ML-DSA | Digital signature algorithm | NIST Draft – FIPS 204 |
| SLH-DSA | Digital signature algorithm for use cases such as signing firmware and software | NIST Draft – FIPS 205 |
| LMS | Digital signature algorithm for use cases such as signing firmware and software | NIST SP 800-208 |
| XMSS | Digital signature algorithm for use cases such as signing firmware and software | NIST SP 800-208 |
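As a concrete illustration of the key establishment row above, the following minimal sketch runs an ML-KEM-style encapsulation/decapsulation round trip using the open-source liboqs-python (oqs) bindings. This is an assumption about tooling rather than something the guidance prescribes, and the exact algorithm identifier depends on the installed liboqs version (newer builds expose "ML-KEM-768", older ones "Kyber768").

```python
# Minimal ML-KEM key-establishment round trip using the liboqs-python
# ("oqs") bindings (assumed tooling for illustration, not part of the
# NCSC guidance). The algorithm name may differ between liboqs versions.
import oqs

KEM_ALG = "ML-KEM-768"  # assumed identifier; older builds use "Kyber768"

with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
    # Receiver generates a key pair and publishes the encapsulation (public) key.
    public_key = receiver.generate_keypair()

    # Sender encapsulates to the public key, producing a ciphertext to send
    # plus its own copy of the shared secret.
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same shared secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver
```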
“The above algorithms support multiple parameter sets that offer different levels of security,” the NCSC wrote. The smaller parameter sets generally require less power and bandwidth, but also have lower security margins, it added. “Conversely, the larger parameter sets provide higher security margins, but require greater processing power and bandwidth, and have larger key sizes or signatures. The level of security required can vary according to the sensitivity and the lifetime of the data being protected, the key being used, or the validity period of a digital signature.” The highest security level may be useful for key establishment in cases where the keys will be particularly long lived or protect particularly sensitive data that needs to be kept secure for a long period of time. The NCSC strongly advised that operational systems should only use implementations based on final standards.
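One way to picture that trade-off is to lay the ML-KEM parameter sets out side by side. The sketch below is purely illustrative: the sizes come from draft FIPS 203, but the selection helper and its thresholds are assumptions for demonstration, not NCSC policy.

```python
# ML-KEM parameter sets: larger sets give higher security margins but
# cost more bandwidth and processing. Sizes (bytes) are from draft FIPS 203.
ML_KEM_PARAMETER_SETS = {
    # name: (NIST security category, encapsulation-key bytes, ciphertext bytes)
    "ML-KEM-512": (1, 800, 768),
    "ML-KEM-768": (3, 1184, 1088),
    "ML-KEM-1024": (5, 1568, 1568),
}

def suggest_parameter_set(data_lifetime_years: int, highly_sensitive: bool) -> str:
    """Toy selection helper: long-lived or highly sensitive data gets the
    largest parameter set; everything else defaults to the middle one.
    The thresholds are illustrative assumptions, not guidance."""
    if highly_sensitive or data_lifetime_years > 20:
        return "ML-KEM-1024"
    return "ML-KEM-768"

print(suggest_parameter_set(data_lifetime_years=30, highly_sensitive=False))  # ML-KEM-1024
```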
Post-quantum traditional (PQ/T) hybrid schemes
A post-quantum traditional (PQ/T) hybrid scheme is one that combines one (or more) PQC algorithms with one (or more) traditional PKC algorithms, where all component algorithms are of the same type, the NCSC wrote. For example, a PQC signature algorithm could be combined with a traditional PKC signature algorithm to give a PQ/T hybrid signature.
PQ/T hybrid schemes carry greater costs than schemes that use a single algorithm. “PQ/T hybrid schemes will be more complex to implement and maintain and will also be less efficient. However, there may sometimes be a need for a PQ/T hybrid scheme, due to interoperability, implementation security, or constraints imposed by a protocol or system,” according to the NCSC.
If a PQ/T hybrid scheme is chosen, the NCSC recommends that it is used as an interim measure, within a flexible framework that enables a straightforward migration to PQC-only in the future. Technical system and risk owners should weigh the reasons for and against PQ/T hybrid schemes, including interoperability, implementation security, and protocol constraints, as well as the complexity and cost of maintaining a more complex system and the need to complete the migration twice, the NCSC added.
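To make the combination concrete, here is a minimal PQ/T hybrid key-establishment sketch: a traditional X25519 exchange and a post-quantum shared secret are fed into one key derivation step, so the derived key stays safe if either component is broken. The X25519 and HKDF parts use the Python cryptography package; the ML-KEM secret is a placeholder (random bytes) standing in for a real encapsulation, so the whole flow is an illustrative assumption rather than a recommended construction.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Traditional component: an X25519 key agreement between two parties.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()
traditional_secret = alice_private.exchange(bob_private.public_key())

# Post-quantum component: placeholder for an ML-KEM shared secret.
# (Hypothetical stand-in: in practice this would come from an ML-KEM
# encapsulation/decapsulation via a PQC library.)
pq_secret = os.urandom(32)

# Hybrid combiner: derive a single session key from both secrets, so the
# result remains secure as long as at least one component algorithm holds.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"pq/t hybrid key establishment (illustrative)",
).derive(pq_secret + traditional_secret)

print(f"derived {len(session_key)}-byte hybrid session key")
```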