Why crypto-agility is a must to become quantum-ready

By Brad Bowers, Lead Field CISO Global, SHI

Nobody knows when ‘Q-day’ will arrive, but we know it’s coming, and likely sooner than most organizations think. Quantum computers capable of breaking today’s most widely used cryptographic algorithms could arrive as early as 2030, prompting a flurry of activity as nations seek to protect their encrypted data in a post-quantum world.

Quantum computers will be able to crack the cryptographic algorithms used to encrypt data today in minutes, and it’s widely suspected that organized criminal gangs and nation-state attackers have already begun ‘harvest now, decrypt later’ (HNDL) attacks, stockpiling data in the belief they’ll be able to decrypt it once a sufficiently powerful quantum computer exists. HNDL allows threat actors to store valuable evergreen data such as IP, PII or even state secrets, which they can then decode and act upon.

But it’s not just data – the very future of the internet itself is also at risk due to the threat posed to digital signatures. Quantum computing will make it possible to forge these signatures, undermining the authentication of websites, APIs and software updates and leaving them susceptible to interception, spoofing and manipulation.

To protect against this eventuality, CISA, NIST and the NSA have been urging businesses to educate IT leaders and begin the journey to quantum resiliency. In addition, NIST has been providing guidance and recommendations on how to assess readiness and develop a quantum-readiness roadmap that aligns with quantum-resilient algorithms.

Currently, NIST recommends that organizations align with its newer post-quantum standards. These comprise a key-encapsulation standard for establishing encryption keys (ML-KEM) and two digital signature standards (ML-DSA and SLH-DSA), the latter a hash-based scheme well suited to signing firmware and hardware. ML-KEM and ML-DSA are built on lattice-based cryptography, which has so far proved resilient against quantum-based attacks. (These join the stateful hash-based signature schemes LMS and XMSS, which NIST approved earlier for digital signatures but which have limited application.)
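
To give a flavor of what these standards look like in practice, the sketch below performs a simple ML-DSA sign-and-verify round trip using the open-source liboqs Python bindings (the oqs package). It is illustrative rather than production code, and it assumes a recent liboqs build that exposes the algorithm under its ML-DSA name (older builds use the ‘Dilithium’ names).

```python
# Minimal ML-DSA-65 sign/verify round trip using the liboqs Python bindings
# ("oqs"). Illustrative only; assumes a recent liboqs build that exposes the
# ML-DSA algorithm names (older builds use the "Dilithium" names instead).
import oqs

ALG = "ML-DSA-65"  # NIST FIPS 204 parameter set
message = b"firmware-image-v1.2.3"

# Signer generates a key pair and signs the message.
with oqs.Signature(ALG) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Anyone holding the public key can verify the signature.
with oqs.Signature(ALG) as verifier:
    assert verifier.verify(message, signature, public_key)
    print(f"{ALG} signature verified ({len(signature)} bytes)")
```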

To move across to these algorithms, the business needs to become ‘quantum ready’ by adopting quantum-safe data protection mechanisms before quantum computing becomes mainstream. It’s a process that involves technical, architectural and governance changes so that encryption, certificates and key management systems can be upgraded and replaced easily. In fact, PQC algorithms are already being deployed alongside existing ones on web servers and clients, which suggests deployment will initially be hybrid.
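
To make the hybrid idea concrete, the sketch below combines a classical X25519 exchange with an ML-KEM-768 encapsulation and feeds both shared secrets into a single key-derivation step, broadly mirroring the hybrid key-agreement pattern now appearing in TLS. It is an illustrative composition that assumes the cryptography and oqs Python packages are available; real protocols define the exact combination rules and message formats.

```python
# Illustrative hybrid key establishment: derive one session key from both a
# classical X25519 exchange and an ML-KEM-768 encapsulation, so the result
# stays safe as long as either component remains unbroken. Assumes the
# "cryptography" and "oqs" packages; real protocols fix the exact format.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical component: ephemeral X25519 exchange.
client_ecdh = X25519PrivateKey.generate()
server_ecdh = X25519PrivateKey.generate()
classical_secret = client_ecdh.exchange(server_ecdh.public_key())

# Post-quantum component: ML-KEM-768 encapsulation against the server's key.
with oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
    kem_public_key = server_kem.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as client_kem:
        kem_ciphertext, pq_secret = client_kem.encap_secret(kem_public_key)
    assert server_kem.decap_secret(kem_ciphertext) == pq_secret

# Combine both secrets: the session key is only at risk if both are broken.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"illustrative-hybrid-handshake",
).derive(classical_secret + pq_secret)
print(f"Derived a {len(session_key)}-byte hybrid session key")
```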

The pros and cons of hybrid

A hybrid configuration buys the organization time to test, plan and implement the changes and provides a fallback option if PQC is unsupported. However, as the National Cyber Security Centre (NCSC) warns, a hybrid scheme should only ever be an interim measure. This is because combining PQC with a traditional public key cryptographic algorithm will afford no more security than a single PQC algorithm once a cryptographically-relevant quantum computer exists. 

The end goal always has been, and should remain, full migration to solutions that allow encryption algorithms to be swapped interchangeably and that support emerging quantum-resilient encryption. This creates something of a dilemma today, because teams have to weigh the pros and cons of a hybrid approach against a straight migration to PQC. Considerations highlighted by the NCSC include interoperability, implementation security, protocol constraints, complexity, the cost of maintaining a more complex system, and the prospect of, in effect, having to do the migration twice.

To minimize these downsides, it’s essential that the organization adopts a state of ‘crypto-agility’: the ability to swap systems over to PQC quickly as standards evolve. This should be incorporated into a quantum-readiness plan (QRP) that the business puts in place to transition to PQC. The QRP should be methodical, but it needn’t be complex – a three-phase approach will suffice.
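
One way to picture crypto-agility at the code level is to keep the choice of algorithm out of application logic entirely, as in the hypothetical Python sketch below: callers ask a small registry for whatever signing algorithm is currently configured, so moving to a PQC scheme becomes a configuration change plus a new provider rather than edits at every call site. All names here are illustrative and not drawn from any particular product.

```python
# Hypothetical crypto-agility pattern: application code depends on an abstract
# Signer interface and the concrete algorithm is chosen by configuration, so
# swapping a classical scheme for a PQC scheme means registering a new
# provider and changing one setting, not editing every call site.
import hashlib
import hmac
from abc import ABC, abstractmethod


class Signer(ABC):
    """Algorithm-neutral signing interface used by application code."""

    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...


_REGISTRY: dict[str, type[Signer]] = {}


def register(name: str):
    """Make a Signer implementation selectable by its configured name."""
    def _wrap(cls: type[Signer]) -> type[Signer]:
        _REGISTRY[name] = cls
        return cls
    return _wrap


def get_signer(algorithm: str) -> Signer:
    """Resolve the configured algorithm, failing fast if it is unknown."""
    try:
        return _REGISTRY[algorithm]()
    except KeyError as exc:
        raise ValueError(f"no signer registered for {algorithm!r}") from exc


@register("hmac-demo")
class HmacDemoSigner(Signer):
    """Stand-in so the sketch runs; a real deployment would register an ECDSA
    or RSA provider today and an ML-DSA provider when ready to switch."""

    _KEY = b"demo-key"  # placeholder secret for the demo only

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._KEY, message, hashlib.sha256).digest()

    def verify(self, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), signature)


# Application code reads the algorithm name from configuration, not constants.
configured_algorithm = "hmac-demo"
signer = get_signer(configured_algorithm)
tag = signer.sign(b"release-manifest")
assert signer.verify(b"release-manifest", tag)
```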

Step one is to create an inventory and risk map identifying where cryptography is used and which assets are most vulnerable to HNDL attacks. Step two is to redesign the infrastructure so that algorithms and keys can be changed without full re-platforming, making PQC a long-term architectural consideration.
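
As a hedged illustration of the inventory step, the sketch below walks a directory of PEM certificates with a recent version of the widely used cryptography package and records the properties a quantum-readiness assessment cares about – signature algorithm, key type and expiry. The path and field choices are placeholders; a real inventory would also cover keystores, HSMs, clouds and third-party systems.

```python
# Illustrative first pass at a cryptographic inventory: walk a directory of
# PEM certificates and record the properties a quantum-readiness assessment
# cares about (algorithm, key type/size, expiry). The path is a placeholder;
# assumes a recent version of the "cryptography" package.
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

CERT_DIR = Path("./certs")  # placeholder location

inventory = []
for pem_path in CERT_DIR.glob("**/*.pem"):
    cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
    public_key = cert.public_key()
    if isinstance(public_key, rsa.RSAPublicKey):
        key_info = f"RSA-{public_key.key_size}"
    elif isinstance(public_key, ec.EllipticCurvePublicKey):
        key_info = f"EC-{public_key.curve.name}"
    else:
        key_info = type(public_key).__name__
    inventory.append({
        "file": str(pem_path),
        "subject": cert.subject.rfc4514_string(),
        # _name is the OID's human-readable label, e.g. sha256WithRSAEncryption.
        "signature_algorithm": cert.signature_algorithm_oid._name,
        "key": key_info,
        "not_after": cert.not_valid_after_utc.isoformat(),
    })

# Every entry here is a quantum-vulnerable asset until proven otherwise.
for item in inventory:
    print(item)
```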

The final phase is to embed quantum readiness into policies and governance so that crypto-agility becomes part of procurement, vendor qualification processes and hardware lifecycle management procedures. This avoids the problem of accumulating additional technological debt by adding systems that can’t be adapted for PQC. 

With respect to digital signatures, we’re already seeing regulators exert pressure on organizations to rotate their certificates more often and to increase the strength of ciphers. This is driving a renewed emphasis on certificate lifecycle management, particularly as the realization dawns that those who get ahead of the curve will be able to swap out their certificates with minimal impact and reduce the risk of fake certificates.
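
As a small, assumed-policy illustration, the check below flags certificates whose total validity period exceeds a lifetime maximum – the kind of rule a certificate lifecycle management tool enforces as rotation windows shrink. The 398-day figure is an assumption for the sketch, not a statement of any specific regulation.

```python
# Illustrative lifetime check: flag certificates whose validity period exceeds
# a policy maximum. The 398-day figure is an assumed policy value; tighten it
# as rotation requirements shrink. Uses a recent "cryptography" package.
from datetime import timedelta

from cryptography import x509

MAX_VALIDITY = timedelta(days=398)  # assumed policy maximum


def exceeds_lifetime_policy(pem_bytes: bytes) -> bool:
    """Return True if the certificate's validity window is longer than policy allows."""
    cert = x509.load_pem_x509_certificate(pem_bytes)
    validity = cert.not_valid_after_utc - cert.not_valid_before_utc
    return validity > MAX_VALIDITY
```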

Preparing for the eventuality

If there are only four years left to lay the groundwork for quantum, it’s vital that organizations prepare for the eventuality. In the US, some sectors are already compelled to do so: federal contracts require a “demonstrable” PQC transition plan from vendors and suppliers that sell to the US government and military by December 2025. All 16 critical infrastructure sectors – spanning financial services, pharma, transport and more – are also expected to be aligned and compliant with PQC requirements by 2030, and this will have a trickle-down effect, with smaller organizations “inheriting” PQC requirements from their technology suppliers.

It therefore makes sense to look at how the business can begin to adapt efficiently through the use of automation, which can, for instance, shorten the time taken to implement a QRP and enable keys and algorithms to be changed more quickly and with less disruption.

Automation also makes the mapping element easier, because it can be used to figure out where cryptography lives. Automated discovery can unearth the thousands, if not millions, of certificates and keys spread across clouds, on-premises infrastructure, devices and third-party applications, and create a living inventory of cryptographic assets that stays up to date even as the environment changes.
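
As a hedged sketch of what discovery can look like at the network edge, the snippet below connects to a short list of endpoints, retrieves each server’s leaf certificate with Python’s standard ssl module, and parses out the fields an inventory would track. The hostnames are placeholders; enterprise tooling does this continuously and at far greater breadth.

```python
# Illustrative endpoint discovery: fetch each server's leaf certificate over
# TLS and record the fields an up-to-date cryptographic inventory tracks.
# Hostnames are placeholders; real discovery tooling runs continuously across
# clouds, on-prem networks, devices and third-party services.
import ssl

from cryptography import x509

ENDPOINTS = [("example.com", 443), ("example.org", 443)]  # placeholders

for host, port in ENDPOINTS:
    # get_server_certificate returns the leaf certificate as a PEM string.
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    print(
        f"{host}:{port}",
        cert.signature_algorithm_oid._name,   # OID's human-readable label
        type(cert.public_key()).__name__,     # key type, e.g. RSAPublicKey
        cert.not_valid_after_utc.date(),      # expiry
    )
```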

Automated lifecycle management of keys and certificates provides similar advantages: it enables teams to rotate certificates at scale and roll out new algorithms quickly, so that vulnerable, outdated or noncompliant certificates can be replaced in a timely manner. Automation and orchestration can also improve governance by translating policies into effective controls – scanning for noncompliant certificates, auto-remediating misconfigurations and blocking deployments that violate standards before they go live.
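
To show how a policy can become a control, the hypothetical pre-deployment gate below fails when a certificate’s public-key algorithm is not on an approved list, its RSA key is too small, or it expires within the renewal window. The policy values and the command-line wiring are assumptions for the sketch, not drawn from any particular toolchain.

```python
# Hypothetical policy-as-code gate for a deployment pipeline: fail the build
# if a certificate's public-key algorithm is not approved, its RSA key is too
# small, or it expires within the renewal window. Policy values are assumed.
import sys
from datetime import datetime, timedelta, timezone

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

APPROVED_KEY_TYPES = (rsa.RSAPublicKey, ec.EllipticCurvePublicKey)  # extend with PQC types as they land
MIN_RSA_BITS = 3072
MIN_REMAINING = timedelta(days=30)


def check_certificate(pem_bytes: bytes) -> list[str]:
    """Return a list of policy violations for one PEM certificate."""
    cert = x509.load_pem_x509_certificate(pem_bytes)
    violations = []
    key = cert.public_key()
    if not isinstance(key, APPROVED_KEY_TYPES):
        violations.append(f"unapproved key type: {type(key).__name__}")
    if isinstance(key, rsa.RSAPublicKey) and key.key_size < MIN_RSA_BITS:
        violations.append(f"RSA key too small: {key.key_size} bits")
    remaining = cert.not_valid_after_utc - datetime.now(timezone.utc)
    if remaining < MIN_REMAINING:
        violations.append(f"expires in {remaining.days} days")
    return violations


if __name__ == "__main__":
    problems = check_certificate(open(sys.argv[1], "rb").read())
    if problems:
        print("\n".join(problems))
        sys.exit(1)  # block the deployment
    print("certificate complies with policy")
```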

Yet, while crypto-agility can be readily achieved using automated processes, there are worrying signs that organizations are not preparing. A survey by ISACA found that only 5% of technology and cybersecurity professionals say their organization has a QRP, and more recently IBM found that executives are still failing to treat quantum as a planning priority.

In its ‘The Enterprise in 2030’ report, IBM states that expectations and readiness are out of kilter, risking a failure to adapt in time. It further stresses that quantum-safe security is not a competitive advantage but a baseline capability that organizations must develop alongside their broader technology strategies.

The inertia we’re seeing today, however, means those treating quantum as tomorrow’s problem are taking a huge and unnecessary risk.

 
