I spend my days designing and building systems that talk to each other across cloud regions, enterprise tenants, and trust boundaries. Lots of encrypted channels. Lots of assumptions baked into those channels that the maths underneath them is, essentially, unsolvable. But that guarantee is being eroded as we build better and better quantum computers. The attack isn't here yet. But the data being harvested for that attack might already be in someone's hands. As a staff engineer I design systems not just for today or tomorrow but for the next 5 to 10 years. The danger of quantum-powered cryptanalysis is a problem within that timeline, and anyone building sensitive systems should be preparing to address it.

The challenge I foresee today is the effort of migrating to post-quantum cryptography (PQC). Cryptography isn't a discrete component you swap out; it's a load-bearing assumption baked into virtually every layer of a modern system. TLS configurations, certificate authorities, hardware security modules, key management services, third-party SDKs, and vendor APIs all carry cryptographic dependencies that most organisations have not fully mapped. Before the migration conversation can even begin, there's a more fundamental challenge: do you actually know where your cryptographic surface is? For most enterprise architectures, the honest answer is "not really".

The unknown is where the real cost lives. The hard problem is organisational: migration requires coordinated effort across teams that rarely share a common dependency map, and in systems where cryptographic agility was never a design principle, retrofitting will be expensive.

As a platform security engineer, I sit at the other end of this story: where cryptography stops being an abstract guarantee and becomes an operational system of certificates, keys, rotations, dependencies, and failure modes. Quantum computing is no longer science fiction, but it is not the cryptographic apocalypse either. Today, it sits in a more uncomfortable place: commercially real, scientifically accelerating, and not yet capable of breaking modern public-key cryptography at scale. My honest view on post-quantum cryptography is that the industry keeps treating it as a future problem, when the hardest parts of the migration are already sitting in production today. The quantum break may not be here yet, but the migration problem is. PQC will not become urgent because of a single dramatic moment. It will become urgent because long-lived data, certificates, keys, libraries, protocols, and compliance requirements all have to be moved before the deadline is obvious. Harvest-now-decrypt-later means the risk is already attached to what we ship today. The question is not whether algorithms will change, but whether our systems are ready to change with them.

Introduction

The emergence of quantum computing poses unprecedented challenges to cyber security. Quantum computers, leveraging algorithms such as Shor's and Grover's, present an unparalleled potential to disrupt the established security foundations of classical cryptographic methods.


Quantum Computing Basics

Quantum computing uses the principles of quantum mechanics to perform computations using quantum bits (qubits). Unlike classical bits, qubits can exist simultaneously in multiple states, enabling quantum computers to process complex information much more efficiently. Quantum superposition and entanglement are the key features that allow these machines to solve specific problems exponentially faster than classical computers.

Quantum superposition is the ability of a quantum system, like a quantum bit, to exist in multiple states at the same time until it's measured. This is a fundamental concept in quantum mechanics and is a core property that gives quantum computers their power.

  • Classical Computers: Bits can only be 0 or 1. A group of 3 bits can represent one of eight possible combinations at any given moment.
  • Quantum Computers: Qubits can exist as 0, 1, or a combination of both simultaneously. A group of 3 qubits in superposition can exist in all eight possible combinations at the same time.

The crucial part is that this "multi-state" existence is maintained only until the qubit is measured or observed. At that point, the superposition collapses, and the qubit randomly "chooses" one of the classical states (0 or 1).
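The collapse on measurement can be made concrete with a few lines of plain Python. This is a minimal simulation sketch (not a real quantum computation): we hold the two amplitudes of an equal superposition and sample measurement outcomes, each occurring with probability equal to the squared amplitude.

```python
import random

# Amplitudes of the equal superposition (|0> + |1>) / sqrt(2).
# On measurement, outcome k occurs with probability |amplitude_k|^2.
amplitudes = [2 ** -0.5, 2 ** -0.5]

def measure(amps):
    """Collapse the state: return 0 or 1, weighted by squared amplitudes."""
    p0 = abs(amps[0]) ** 2
    return 0 if random.random() < p0 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(amplitudes)] += 1

print(counts)  # roughly a 50/50 split between 0 and 1
```

Each individual measurement is random, but the statistics over many runs reveal the underlying amplitudes — which is exactly how results are read out of real quantum hardware.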

Superposition enables quantum parallelism. Because qubits can hold and process multiple states at once, a quantum computer can perform a huge number of calculations simultaneously. This allows it to explore many different possibilities at the same time, giving it a massive advantage over classical computers for specific, complex problems, particularly in the security space:

  • Searching huge, unsorted databases (Grover's algorithm)
  • Factoring very large numbers (Shor's algorithm)

Quantum entanglement is a phenomenon where two or more qubits become linked in such a way that the state of one qubit is instantly correlated with the state of the other, regardless of the distance between them. This is often described as "spooky action at a distance," a term coined by Albert Einstein.

  • Superposition: Allows a single qubit to exist in multiple states at once.
  • Entanglement: Links the states of multiple qubits together.

Entanglement is a key resource for quantum algorithms. By entangling qubits, a quantum computer can create a highly correlated system that allows for complex, parallel computations. It is used to:

  • Speed up calculations beyond what superposition alone can achieve.
  • Implement quantum teleportation.
  • Enable quantum cryptography.

Quantum Algorithms vs. Classical Algorithms

  • Underlying Principles: Classical algorithms are based on classical physics and Boolean logic. Quantum algorithms are based on quantum mechanics.
  • Unit of Information: Classical algorithms operate on bits (0 or 1). Quantum algorithms use qubits (superposition of 0 and 1).
  • Computation Style: Classical algorithms process information sequentially. Quantum algorithms can explore multiple solutions simultaneously (quantum parallelism).
  • Logic Gates: Classical algorithms use AND, OR, NOT. Quantum algorithms use Hadamard, CNOT, etc.
  • Outcome: Classical algorithms are deterministic. Quantum algorithms are probabilistic.

Quantum algorithms are designed for specific, highly complex problems where the exponential power of quantum mechanics provides a significant speedup:

  • Shor's Algorithm: Finds the prime factors of a large number exponentially faster than classical algorithms (major implications for cryptography).
  • Grover's Algorithm: Searches an unsorted database quadratically faster than a classical search.
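To see how Shor's algorithm turns period-finding into factoring, here is a toy sketch in which the quantum step is replaced by classical brute force. The quantum speedup lies entirely in finding the period r; the post-processing below is the classical part of the real algorithm.

```python
from math import gcd

def find_period(a, N):
    """Order of a mod N, found by brute force. This is the step a quantum
    computer does exponentially faster via the Quantum Fourier Transform."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical post-processing of Shor's algorithm for a chosen base a."""
    if gcd(a, N) != 1:
        return gcd(a, N)       # lucky guess: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None            # odd period: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None            # trivial square root: retry with a different a
    return gcd(x - 1, N)

print(shor_factor(15, 7))  # period of 7 mod 15 is 4, yielding the factor 3
```

For N = 15 and a = 7 the period is 4, so gcd(7^2 - 1, 15) = gcd(48, 15) = 3 pops out a nontrivial factor. At cryptographic sizes the brute-force period search is hopeless classically, which is exactly the gap the quantum computer closes.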

Quantum Programming vs Classical Programming

Classical Programming

  • Uses languages like Python, C++, or Java.
  • Based on Boolean logic, bits (0 or 1).
  • Deterministic and sequential.

Quantum Programming

  • Uses qubits (0, 1, or superposition).
  • Probabilistic outcomes.
  • Parallelism via quantum gates (Hadamard, CNOT, etc.).

Q# is a programming language developed by Microsoft specifically for quantum computing. It's designed to be used with a classical host language like Python or C#. The classical program handles control flow, while Q# defines the quantum algorithm.

Quantum Impact on Cybersecurity

The vast majority of online security relies on asymmetric encryption (public-key cryptography like RSA and ECC). The security of these systems is based on the difficulty of reversing certain calculations (e.g., factoring large numbers).

A sufficiently powerful quantum computer could use Shor's algorithm to solve this problem exponentially faster, rendering these encryption methods effectively useless. The risk is that malicious actors could "harvest now, decrypt later" — collecting encrypted data today to decrypt it once quantum computers are available.

Shor's Algorithm efficiently finds the prime factors of a large composite number. It uses quantum period-finding and the Quantum Fourier Transform to extract the period, then classical post-processing to compute the factors.

  • Why it's a Big Deal: Modern encryption like RSA is based on the assumption that factoring large numbers is infeasible. Shor's algorithm can do this in polynomial time, threatening the security of RSA and similar methods.

Grover's Algorithm provides a quadratic speedup for searching an unstructured database or list. It uses superposition, a quantum oracle, amplitude amplification, and repeated measurement to find the target item.

  • Attacking symmetric-key cryptography (e.g., AES)
  • Preimage attacks on hash functions

Example:

  • Classical brute-force: 2^n operations for n-bit hash
  • Quantum (Grover's): 2^(n/2) operations
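The halving of the exponent is easy to put into concrete numbers with a quick back-of-the-envelope calculation:

```python
# Effective brute-force cost before and after Grover's quadratic speedup.
for n in (128, 256):
    quantum = 2 ** (n // 2)          # Grover: square root of the search space
    print(f"{n}-bit key: classical 2^{n}, Grover 2^{n // 2} "
          f"(~{quantum:.1e} operations)")

# AES-256 under Grover still costs 2^128 work, which is why doubling
# symmetric key sizes is considered an adequate quantum mitigation.
assert 2 ** (256 // 2) == 2 ** 128
```

A 2^64 search is within reach of determined attackers; 2^128 is not — hence the practical advice to prefer AES-256 and longer hash outputs rather than replace symmetric primitives outright.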

Post-Quantum Cryptography (PQC)

PQC is focused on developing new algorithms secure against attacks from both classical and quantum computers. The urgency is due to the "harvest now, decrypt later" threat.

PQC algorithms are based on mathematical problems thought to be difficult for both classical and quantum computers:

  • Lattice-based cryptography
  • Hash-based cryptography
  • Code-based cryptography

NIST has selected several algorithms to become official standards, including CRYSTALS-Kyber (key encapsulation, standardised as ML-KEM) and CRYSTALS-Dilithium (digital signatures, standardised as ML-DSA).

In January 2025, the U.S. federal government published new guidance to strengthen cybersecurity innovation, with a strong focus on post-quantum readiness — directing federal agencies to accelerate adoption of NIST's post-quantum cryptography standards, invest in quantum-resistant research, inventory vulnerable cryptographic assets, and build workforce capability, all underpinned by public-private partnerships. This reinforces the urgency for organisations to begin migration planning now.


A Platform Security Engineer's Outlook on PQC

PQC largely focuses on replacing asymmetric/public-key algorithms such as Diffie-Hellman, RSA, and Elliptic Curve Cryptography (ECC). Symmetric-key algorithms, although still affected, hold up better at larger key lengths: Grover's algorithm halves their effective security level, which still leaves 128 bits of security for an AES-256 key. Shor's algorithm, by contrast, would make RSA and ECC insecure at any practical size, so simply increasing key lengths is not a viable long-term mitigation. NIST has published three standardised quantum-resistant algorithms:

  • Module-Lattice-Based Key Encapsulation Mechanism (ML-KEM)
  • Module-Lattice-Based Digital Signature Standard (ML-DSA)
  • Stateless Hash-Based Digital Signature Standard (SLH-DSA)

Classical public-key cryptography rests on the shoulders of integer factorization and discrete logarithms. However, as Shor's algorithm shows, a quantum computer can crack that math exponentially faster.

And that's why cryptography shifted its focus from algebra into a field that still has the power to instill fear in everyone, from high-schoolers to even quantum computers… Geometry. And lattice geometry, at that.

A lattice is a grid of points in an n-dimensional space. Many lattice-based cryptographic protocols relate to two fundamental hard problems on lattices: the Shortest Vector Problem (SVP) and the Closest Vector Problem (CVP). Lattice-based signatures replace number theory with problems on structured lattices that are believed to be hard for both classical and quantum attackers. Modern cryptography rests on two major pillars:

  • Key establishment (to enable encryption) — Classical systems used RSA key transport or (EC)DHE to establish a shared secret that later becomes symmetric session keys. ML-KEM replaces these classical public-key key-establishment mechanisms with a post-quantum alternative.
  • Digital signatures (authentication and integrity) — Classical systems used RSA or ECDSA to sign certificates, handshakes, and messages. ML-DSA replaces these classical public-key signature schemes with a post-quantum alternative.
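To make the key-establishment pillar concrete, here is the classical pattern that ML-KEM replaces: a finite-field Diffie-Hellman exchange, sketched with deliberately toy-sized parameters (real deployments used 2048-bit groups or elliptic curves).

```python
import secrets

# Toy finite-field Diffie-Hellman. Illustrative only: the prime here is
# 2^64 - 59, far too small for real use.
p = 0xFFFFFFFFFFFFFFC5
g = 5

a = secrets.randbelow(p - 2) + 1     # Alice's private exponent
b = secrets.randbelow(p - 2) + 1     # Bob's private exponent

A = pow(g, a, p)                     # public values, exchanged in the clear
B = pow(g, b, p)

shared_alice = pow(B, a, p)          # both sides derive the same secret...
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob    # ...which seeds the symmetric session keys

# Shor's algorithm solves the discrete log (recovering a from A), which is
# why this entire pattern needs a post-quantum replacement such as ML-KEM.
```

A KEM changes the shape of the exchange slightly (one side encapsulates a secret against the other's public key rather than both contributing exponents), but the role in the protocol — producing a shared secret that becomes symmetric session keys — is the same.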

ML-KEM is based on Module-LWE. ML-DSA is based on module-lattice hardness, typically described via Module-LWE and Module-SIS, both short-vector-style problems. The next few lines take a short (but fun) detour to understand LWE and SIS, and how they collaborate to make lattice cryptography powerful.

Learning With Errors (LWE) involves recovering a vector from a set of noisy linear equations: finding s given the public matrix A and public result b in the equation A·s + e ≡ b (mod q), where e is a small, unknown error term.

If not for the errors (who would have thought we would be thanking errors one day), recovering s would be easy using Gaussian elimination. With the errors present, Gaussian elimination fails: it takes linear combinations of n equations, which amplifies the error terms and corrupts the information. LWE is computationally equivalent to a variation of the Closest Vector Problem (CVP) on a specific type of lattice. Short Integer Solution (SIS) involves finding a small, non-zero solution to a linear system: for a public matrix A, find a short, non-zero vector y such that A·y ≡ 0 (mod q), with the size of y below a specified bound. This corresponds to finding short vectors in a lattice, which makes it an instance of the Shortest Vector Problem (SVP).
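To make LWE concrete, here is a toy Regev-style encryption of a single bit. The parameters are illustrative and wildly insecure at this size; real schemes such as ML-KEM work over module lattices with much larger dimensions.

```python
import random

n, m, q = 8, 16, 97                     # toy parameters; real schemes are far larger

def keygen():
    """Secret vector s; public (A, b) where b = A·s + e mod q with small e."""
    s = [random.randrange(q) for _ in range(n)]
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]        # small error terms
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)

def encrypt(pub, bit):
    """Sum a random subset of the noisy equations; hide the bit in the high half."""
    A, b = pub
    rows = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in rows) % q for j in range(n)]
    v = (sum(b[i] for i in rows) + bit * (q // 2)) % q
    return u, v

def decrypt(s, ct):
    """Strip u·s; what remains is the accumulated small error (+ q/2 if bit = 1)."""
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if min(d, q - d) > q // 4 else 0                # closer to q/2 than to 0?

s, pub = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pub, bit)) == bit
print("round-trip OK")
```

The errors never fully cancel, but they stay small enough (at most m in magnitude here, well under q/4) that the decryptor can tell "near 0" from "near q/2" — while an attacker without s faces exactly the noisy-linear-system problem described above.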

At the start of the article, we described the operational challenges of migrating to post-quantum cryptography; some technical challenges include:

  • Standards Alignment: Migration strategies and digital signature algorithms must align with emerging IETF standards and NIST.
  • Backward Compatibility: Support must continue for both classical and post-quantum algorithms.
  • Performance Overhead: The performance, size and computation overhead has to be compared against classical algorithms.
  • Infrastructure support: Support remains uneven; public CA issuance, HSM/KMS-backed key custody, and widely deployed TLS/X.509 validation stacks often lag behind PQC.
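On the performance and size point, the published parameter sizes give a feel for the overhead. The byte counts below are the ML-KEM-768 and ML-DSA-65 figures from the FIPS 203/204 parameter sets alongside typical classical encodings; treat them as approximate for planning purposes.

```python
# Rough wire-size comparison: classical vs post-quantum primitives.
# PQC byte counts are the FIPS 203 / 204 parameter-set sizes for
# ML-KEM-768 and ML-DSA-65; classical figures are typical encodings.
sizes = {
    "X25519 public key":     32,
    "ML-KEM-768 encaps key": 1184,
    "ML-KEM-768 ciphertext": 1088,
    "Ed25519 signature":     64,
    "RSA-2048 signature":    256,
    "ML-DSA-65 signature":   3309,
}
for name, size in sizes.items():
    print(f"{name:<24}{size:>6} bytes")

# A handshake that swaps ECDHE + Ed25519 for ML-KEM-768 + ML-DSA-65
# carries kilobytes more per connection: the overhead bullet in numbers.
print("KEM public-key growth:", sizes["ML-KEM-768 encaps key"] / sizes["X25519 public key"])
```

The raw compute cost of the lattice schemes is competitive, so for most systems the dominant overhead is these larger keys, ciphertexts, and signatures flowing through handshakes, certificate chains, and constrained links.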

So where does that leave us?

The migration to PQC will touch nearly every surface of our infrastructure. As engineers, we don't get to wait with bated breath for quantum computers to arrive. We have to migrate before they do. That means planning hybrid deployments, measuring overhead, aligning with standards, and steering our infrastructure slowly, but steadily, towards ML-KEM, ML-DSA, and whatever follows next.

A quantum winter is coming. And our keys must be forged before the first fall of snow.
