Published on andrewbaker.ninja | Enterprise Architecture & Banking Technology
There is a quiet revolution happening in physics laboratories around the world, and most of the people who should be worried about it are not paying attention yet. That is about to change. Quantum computing is advancing faster than anyone predicted five years ago, and when it matures, it will shatter the encryption that protects virtually everything we hold dear in our digital lives: bank transactions, medical records, state secrets, and the messages you send to your family.
This is not science fiction. It is an engineering problem with a hard deadline, and the deadline is closer than you think.
1. Let’s Start at the Beginning: What Is Encryption, Really?
Before we can understand the quantum threat, we need a clear picture of what encryption is and why it works.
Imagine you want to send a secret message to a friend. You agree on a secret code beforehand, say, shift every letter three positions forward in the alphabet, so “A” becomes “D” and “B” becomes “E”. Anyone who intercepts the message sees gibberish. Only your friend, who knows the shift rule, can decode it. That is the essence of encryption.
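The shift-three rule above can be sketched in a few lines of Python. This is a toy, not real encryption, but it captures the idea that only someone who knows the rule can reverse it:

```python
# A minimal Caesar shift cipher, illustrating the shift-three rule above.
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

encoded = caesar("ATTACK AT DAWN", 3)
print(encoded)              # DWWDFN DW GDZQ -- gibberish to an interceptor
print(caesar(encoded, -3))  # ATTACK AT DAWN -- the friend shifts back
```

Decoding is just applying the negative shift, which is exactly why this scheme is weak: there are only 25 possible shifts to try. Modern encryption replaces the shift rule with mathematics that cannot be brute forced.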
Modern encryption works on the same principle but uses mathematics instead of alphabet shifts. Specifically, it relies on mathematical problems that are trivially easy in one direction but astronomically hard to reverse. The classic example is multiplication. Take two large prime numbers, each around 300 digits long, and multiply them together. Any computer can do that multiplication in a fraction of a second. But if I hand you only the result and ask you to find the original two prime numbers, even the most powerful computers on Earth today would take longer than the age of the universe to work it out.
That difficulty is the foundation of most encryption you encounter every day.
2. The Algorithms We Rely On Right Now
The encryption landscape today rests on a relatively small number of foundational algorithms. Understanding them at a high level matters, because each has a different vulnerability profile against quantum attacks.
RSA (named after its inventors Rivest, Shamir, and Adleman) is the workhorse of public key cryptography. When your browser shows a padlock icon and establishes a secure HTTPS connection, RSA is almost certainly involved. It protects the handshake that sets up the encrypted tunnel. RSA’s security rests entirely on that multiplication problem described above, the difficulty of factoring large numbers.
Elliptic Curve Cryptography (ECC) is a more modern and efficient cousin of RSA. It provides the same level of security with much shorter key lengths, making it preferred in environments where computing power is constrained: mobile devices, payment terminals, and IoT sensors. ECC underpins much of the TLS encryption used in banking APIs and mobile applications today. Its security rests on a related mathematical problem called the discrete logarithm problem on elliptic curves.
AES (Advanced Encryption Standard) is a symmetric cipher, meaning both parties use the same key. It is used to encrypt the actual data once RSA or ECC has established a secure channel. AES protects data at rest: encrypted hard drives, database columns, archived files. It is widely considered robust and is used by governments and militaries worldwide.
SHA (Secure Hash Algorithm) is not an encryption algorithm in the traditional sense but a hashing function. It converts any input into a fixed length fingerprint. Banks use SHA to verify data integrity: if even a single byte of a transaction record changes, the hash changes completely. SHA also underpins digital signatures, which prove that a document has not been tampered with and that it came from a verified source.
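That fingerprint behaviour is easy to see with Python's standard hashlib module (the transaction record here is an invented example):

```python
import hashlib

# One altered digit in a transaction record produces a completely
# different SHA-256 fingerprint -- the basis of integrity checking.
record = b"PAY 100.00 FROM ACC-001 TO ACC-002"
tampered = b"PAY 900.00 FROM ACC-001 TO ACC-002"  # one digit changed

h1 = hashlib.sha256(record).hexdigest()
h2 = hashlib.sha256(tampered).hexdigest()
print(h1)
print(h2)
# The two 64-character digests bear no resemblance to each other:
print(sum(a != b for a, b in zip(h1, h2)), "of 64 hex digits differ")
```

This avalanche effect is deliberate: a hash that changed only slightly for a slight change in input would leak information about the input.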
The TLS protocol (Transport Layer Security), which you encounter every time you see “https” in your browser, combines these algorithms. RSA or ECC negotiates a shared secret, AES encrypts the actual data flowing back and forth, and SHA verifies integrity. It is an elegant system that has served us well for decades.
3. Enter the Quantum Computer
A classical computer, the one in your laptop, your phone, the servers running your bank, processes information as bits. Each bit is either a 0 or a 1. Every calculation is a sequence of operations on these binary values.
A quantum computer uses quantum bits, or qubits. And here is where physics gets strange. A qubit can be a 0, a 1, or, thanks to a quantum property called superposition, effectively both at the same time. Furthermore, qubits can be entangled, meaning the state of one qubit is instantly correlated with the state of another, regardless of physical distance. These properties allow a quantum computer to explore enormous numbers of possible solutions simultaneously rather than one at a time.
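Superposition can be illustrated with a purely classical toy simulation of a single qubit: the state is just a pair of amplitudes (real numbers here for simplicity), and measurement probabilities are their squared magnitudes. This is a pedagogical sketch, nothing like real quantum hardware:

```python
import math

# A one-qubit toy: the state is two amplitudes, one for measuring 0 and
# one for measuring 1; the probability of each outcome is the amplitude
# squared. A definite 0 has amplitudes [1, 0].
state = [1.0, 0.0]

def hadamard(s):
    """The Hadamard gate: puts a definite state into equal superposition."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[1]), h * (s[0] - s[1])]

state = hadamard(state)
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~0.5 each: the qubit is effectively both 0 and 1
```

With n such qubits the state vector has 2**n amplitudes, which is exactly why classical machines cannot simulate large quantum computers, and why quantum computers can explore so many possibilities at once.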
For most problems, this does not help much. But for certain specific mathematical problems, quantum computers are not just faster; they are exponentially faster, in ways that completely break the difficulty assumptions that encryption relies on.
In 1994, a mathematician named Peter Shor published an algorithm, now called Shor’s Algorithm, that runs on a quantum computer and can factor large numbers exponentially faster than any classical computer. When a sufficiently powerful quantum computer running Shor’s Algorithm exists, RSA and ECC are broken. Not weakened. Broken. What currently takes longer than the age of the universe takes hours.
A second relevant algorithm, Grover’s Algorithm, provides a quadratic speedup for searching through unstructured data. This halves the effective key length of symmetric algorithms like AES. AES-128 becomes roughly as secure as a 64-bit key, which is crackable. AES-256 becomes roughly equivalent to AES-128, still acceptable for now, but the margin has shrunk significantly.
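The arithmetic behind that halving is simple to state (the trial counts below are idealised orders of magnitude, not precise attack costs):

```python
# Grover's speedup in numbers: a classical brute-force search of an
# n-bit key space needs on the order of 2**n trials; Grover's algorithm
# needs on the order of 2**(n/2). The effective strength is halved.
for bits in (128, 256):
    classical_trials = 2 ** bits
    grover_trials = 2 ** (bits // 2)
    print(f"AES-{bits}: classical ~2^{bits} trials, "
          f"Grover ~2^{bits // 2} trials "
          f"(effective strength {bits // 2} bits)")
```

This is why AES-128 drops to 64-bit equivalent strength (crackable) while AES-256 drops to 128-bit equivalent strength (still acceptable).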
4. The “Harvest Now, Decrypt Later” Problem
Here is the part that should genuinely alarm every security professional and every executive responsible for sensitive data.
Quantum computers powerful enough to break RSA and ECC do not exist today. The current state of the art, represented by systems from IBM, Google, and others, offers hundreds to a few thousand qubits, but these machines are error prone and nowhere near the scale needed to run Shor’s Algorithm on real encryption keys. Most credible estimates put that capability somewhere between five and fifteen years away.
So why does this matter today?
Because sophisticated adversaries, nation states in particular, are almost certainly already collecting encrypted data they cannot currently read. They are storing it, waiting. When quantum capability arrives, they will decrypt years of harvested communications and data. This is not speculation. It is a rational strategy, and it costs almost nothing to execute given how cheap data storage has become.
Consider what that means in practice. A message encrypted and transmitted today that remains sensitive in ten years, say, a diplomatic cable, a long term business strategy, or a patient’s medical history, is already compromised in principle. The lock has been photographed. The key just has not been cut yet.
For banking, this has profound implications. Long term financial records, customer identification data, credit histories, and interbank settlement data could all be sitting in harvested caches waiting for quantum decryption.
5. Post Quantum Cryptography: The Response
The good news is that the mathematical and cryptographic community has known about this threat for decades and has been working on solutions. These solutions go by the name Post Quantum Cryptography (PQC), or sometimes Quantum Resistant Cryptography.
The approach is straightforward in concept: replace the mathematical problems that quantum computers can solve easily with different mathematical problems that quantum computers cannot. Three main families of problems have proven promising.
Lattice based cryptography relies on the difficulty of finding short vectors in high dimensional geometric structures called lattices. Imagine a crystal with billions of dimensions: finding a specific point within it is computationally intractable for both classical and quantum computers. Lattice problems have been studied for decades and have strong theoretical underpinnings. The leading PQC algorithms, CRYSTALS-Kyber for key encapsulation and CRYSTALS-Dilithium for digital signatures, are lattice based.
Hash based cryptography builds security on the same SHA hashing functions already in widespread use. SPHINCS+ is the primary hash based signature scheme. Its security assumptions are more conservative and better understood than newer approaches, which makes it attractive for high assurance applications.
Code based cryptography is based on the difficulty of decoding certain types of error correcting codes. This is one of the oldest areas of post quantum research, with the McEliece cryptosystem dating to 1978.
6. The NIST Standardisation Process
The United States National Institute of Standards and Technology (NIST) recognised the urgency of this problem in 2016 and launched a multi year global competition to evaluate and standardise post quantum algorithms. Cryptographers from around the world submitted candidates, and the process involved years of public scrutiny, attempted attacks, and mathematical analysis.
In August 2024, NIST published its first set of finalised PQC standards. These are not experimental proposals; they are production ready specifications intended for immediate adoption.
The three initial standards are ML-KEM (based on CRYSTALS-Kyber, used for key encapsulation, establishing shared secrets), ML-DSA (based on CRYSTALS-Dilithium, used for digital signatures), and SLH-DSA (based on SPHINCS+, a hash based signature alternative). A fourth standard, FN-DSA (based on Falcon, another lattice based scheme optimised for smaller signature sizes), is expected to be finalised shortly.
These standards represent the global consensus on what quantum resistant cryptography looks like for the next generation of secure systems.
7. What This Means for Your Technology Stack
This is where things get very concrete and very expensive. The encryption algorithms described above are not isolated modules sitting in one place. They are woven into virtually every layer of modern technology infrastructure, and ripping them out and replacing them is a massive undertaking.
7.1 Data in Flight
Every TLS connection uses RSA or ECC for its handshake. That covers your web applications, your APIs, your service to service communication inside microservice architectures, your database connections, your message brokers, your load balancers, and your VPNs. All of it needs to be upgraded to support hybrid key exchange, a transitional approach that combines a classical algorithm with a post quantum one, providing protection even if one is compromised.
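The core idea behind hybrid key exchange can be sketched as follows. In real TLS, the hybrid groups combine the classical and post-quantum key shares inside the TLS key schedule; this simplified sketch uses random placeholder secrets and a single SHA-256 derivation, but the security argument is the same: an attacker must break both inputs to recover the session key.

```python
import hashlib
import secrets

# Sketch of hybrid key exchange: the session key is derived from BOTH a
# classical shared secret (e.g. from ECDH) and a post-quantum one (e.g.
# from ML-KEM). If either algorithm holds, the session key is safe.
classical_secret = secrets.token_bytes(32)  # stand-in for an ECDH output
pq_secret = secrets.token_bytes(32)         # stand-in for an ML-KEM output

def derive_session_key(classical: bytes, post_quantum: bytes) -> bytes:
    """Combine both secrets through a hash-based KDF (greatly simplified)."""
    return hashlib.sha256(b"hybrid-v1" + classical + post_quantum).digest()

session_key = derive_session_key(classical_secret, pq_secret)
print(session_key.hex())
```

A quantum attacker who recovers the ECDH secret still faces the lattice problem behind the post-quantum share, and a surprise weakness in the new lattice scheme still leaves the classical share intact.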
Modern versions of TLS (1.3) and the underlying libraries, OpenSSL, BoringSSL, and similar, are already adding support for post quantum key exchange. But every system that terminates TLS needs to be upgraded: web servers, API gateways, CDN edge nodes, load balancers, network appliances, HSMs (Hardware Security Modules), and more. Many of these have long hardware refresh cycles and embedded firmware that is difficult to update.
7.2 Data at Rest
AES-256 remains acceptable against quantum attacks: Grover’s Algorithm halves its effective strength, but 256 bits halved is still 128-bit equivalent strength, which is currently considered secure. The immediate priority for data at rest is therefore ensuring you are using AES-256 everywhere, not AES-128. Many legacy systems still use AES-128 or, worse, older algorithms like 3DES, which need to be remediated regardless of quantum concerns.
However, the key management infrastructure protecting your AES keys is another matter entirely. Those keys are typically encrypted or exchanged using RSA or ECC. If your key management system, whether that is a cloud KMS service, an on premises HSM cluster, or a custom solution, uses classical public key cryptography to protect AES keys, the chain of trust is broken at the key management layer even if the data encryption itself is quantum resistant. Key management infrastructure needs to be upgraded to use post quantum algorithms for key wrapping and key exchange.
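That chain-of-trust point can be made concrete with a toy model of a key hierarchy. The algorithm names and key labels below are illustrative; the walk up the wrapping chain is the real lesson: one quantum-vulnerable link exposes everything beneath it.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative labels only: algorithms a quantum adversary could break.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDH-P256"}

@dataclass
class Key:
    name: str
    algorithm: str
    wrapped_by: Optional["Key"] = None  # the key that protects this one

def chain_is_quantum_safe(key: Key) -> bool:
    """Walk up the wrapping chain; one vulnerable link breaks the chain."""
    while key is not None:
        if key.algorithm in QUANTUM_VULNERABLE:
            return False
        key = key.wrapped_by
    return True

kek = Key("tenant-kek", "RSA-2048")                    # classical wrapping
dek = Key("db-column-dek", "AES-256", wrapped_by=kek)  # quantum-safe cipher
print(chain_is_quantum_safe(dek))  # False: the KEK is the weak link
```

The AES-256 data key itself is fine; the RSA-wrapped key encryption key above it is what a harvest-now adversary would eventually break.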
7.3 Digital Certificates and PKI
Public Key Infrastructure (PKI) is the system of trust that underpins digital certificates, the mechanism that allows your browser to verify it is talking to your real bank and not an impersonator. Every certificate in use today is signed using RSA or ECC. Certificate authorities, certificate revocation mechanisms, OCSP responders, and the trust stores built into every operating system and browser all need to be migrated to post quantum signature schemes.
This is complicated by the fact that certificates have expiry dates measured in months to a few years, so the migration can be staged, but the root certificates at the top of the trust hierarchy are long lived and need early attention. Browser vendors and operating system providers are already working on this, but enterprise PKI environments, which often include private certificate authorities for internal services, need their own migration plans.
7.4 Secure Shell (SSH)
SSH is the protocol used to securely administer servers and network infrastructure. It uses RSA, ECC, and related algorithms for both host key authentication and user authentication. Every SSH server and client, which means virtually every Linux server, network device, and cloud instance, will need updated key types and algorithm preferences. The OpenSSH project has already added experimental support for post quantum key exchange, but enterprise environments need planned migration paths.
7.5 Code Signing and Software Supply Chain
Software companies sign their releases digitally so that operating systems and update mechanisms can verify that the software you are installing is genuine and has not been tampered with. These signatures use, you guessed it, RSA or ECC. A quantum capable adversary could forge signatures on malicious software. Migration to post quantum signature schemes for code signing is critical for long term software supply chain security.
7.6 Hardware Security Modules
HSMs are specialised hardware devices designed to perform cryptographic operations and store keys securely. They are the backbone of payment processing, certificate authorities, and high assurance key management. HSMs have long lifecycles, five to ten years is common, and many current generation devices have limited or no support for post quantum algorithms. Organisations need to inventory their HSMs and plan replacements or firmware upgrades accordingly. This is not cheap, and procurement lead times for specialised hardware can be long.
7.7 Internet of Things and Embedded Systems
Perhaps the most difficult part of the migration is embedded systems and IoT devices. Payment terminals, ATMs, smart meters, industrial control systems, and connected devices of every description run firmware with hardcoded cryptographic algorithms. Many cannot be updated remotely. Some cannot be updated at all. For the banking sector specifically, the number of deployed payment terminals and ATMs globally is enormous, and the logistics and cost of replacing or updating them is staggering.
8. The Banking Sector: A Special Case
Banks sit at the intersection of almost every dimension of this problem. They hold extraordinarily sensitive data about their customers, including financial histories, identity documents, and behavioural patterns, and they are governed by strict regulatory frameworks that mandate specific security controls. They operate complex ecosystems involving core banking systems that are decades old, modern digital banking platforms, real time payment rails, card networks, and a vast web of third party integrations.
The interbank settlement systems, the infrastructure through which banks settle obligations with each other, are critical national infrastructure. In South Africa, systems like SAMOS (the South African Multiple Option Settlement system) and the various payment clearing mechanisms operated by BankservAfrica represent the plumbing of the financial system. The cryptographic protections on these systems need to be quantum resistant before quantum threats materialise.
SWIFT, the global interbank messaging network, has already published guidance on post quantum migration timelines and is working on updates to its protocols. Card schemes including Visa and Mastercard are engaged in similar efforts. The PCI-DSS standard, which governs payment card security, will inevitably incorporate post quantum requirements in future versions.
Regulatory bodies globally are beginning to take notice. The Financial Stability Board has flagged quantum computing as a systemic risk. Central banks and prudential regulators are starting to ask questions about quantum readiness in their supervisory processes. Boards and executives who are not yet thinking about this should be.
9. Crypto Agility: The Architectural Principle That Changes Everything
One of the most important lessons from the post quantum migration is not specific to quantum at all. It is about a concept called crypto agility: designing systems so that cryptographic algorithms can be swapped out without fundamental architectural change.
Most systems built over the past twenty years hardcode specific algorithms deep in their implementations. Changing the algorithm means changing the code, testing the change, and deploying it: a significant engineering effort multiplied across every system in the estate. If the entire industry had adopted crypto agile architectures from the beginning, the quantum migration would be an operational challenge rather than an existential one.
Going forward, every new system should be built with crypto agility as a first class requirement. Algorithm selection should be a configuration concern, not a code concern. Cryptographic operations should be encapsulated behind well defined interfaces that can be backed by different implementations. Key management systems should be designed to support multiple algorithm types simultaneously.
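The shape of a crypto agile design can be sketched as follows. The registry names are illustrative, and the HMAC-based signers are stand-ins for real signature schemes; the point is the structure: callers depend on an interface, and the algorithm is chosen by configuration, not by code.

```python
import hashlib
import hmac
from abc import ABC, abstractmethod

# Callers see only this interface; implementations can be swapped freely.
class Signer(ABC):
    @abstractmethod
    def sign(self, key: bytes, message: bytes) -> bytes: ...
    @abstractmethod
    def verify(self, key: bytes, message: bytes, sig: bytes) -> bool: ...

class HmacSha256Signer(Signer):
    def sign(self, key, message):
        return hmac.new(key, message, hashlib.sha256).digest()
    def verify(self, key, message, sig):
        return hmac.compare_digest(self.sign(key, message), sig)

class HmacSha3Signer(Signer):  # a "new algorithm" added without code churn
    def sign(self, key, message):
        return hmac.new(key, message, hashlib.sha3_256).digest()
    def verify(self, key, message, sig):
        return hmac.compare_digest(self.sign(key, message), sig)

REGISTRY = {
    "hmac-sha256": HmacSha256Signer(),
    "hmac-sha3-256": HmacSha3Signer(),
}

def get_signer(config: dict) -> Signer:
    """Algorithm selection is a configuration concern, not a code concern."""
    return REGISTRY[config["signature_algorithm"]]

signer = get_signer({"signature_algorithm": "hmac-sha3-256"})
sig = signer.sign(b"key", b"message")
print(signer.verify(b"key", b"message", sig))  # True
```

Migrating to a post quantum scheme in this design means registering a new implementation and changing one configuration value, rather than hunting hardcoded algorithm calls across the estate.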
10. What Should You Be Doing Right Now?
The migration to post quantum cryptography is not a project that can be started when quantum computers become a near term reality. By then it will be too late. The harvest now, decrypt later threat means the window for protecting long lived sensitive data has already partially closed.
A practical roadmap looks something like this.
Start with a cryptographic inventory. You cannot protect what you cannot see. Every system, every data store, every API endpoint, every certificate needs to be catalogued with the algorithms it uses. This is tedious work, but it is foundational. Many organisations are surprised to discover how much classical cryptography is buried in unexpected places: legacy batch processes, backup systems, monitoring agents, and logging pipelines.
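Even a minimal inventory pays off if it is structured. The sketch below (field names and example entries are invented) shows the kind of shape that makes prioritisation a simple query rather than a spreadsheet hunt:

```python
from dataclasses import dataclass

# A minimal cryptographic inventory record: where each algorithm is used
# and whether it is on the quantum-vulnerable list.
@dataclass
class CryptoAsset:
    system: str
    location: str        # e.g. "TLS handshake", "key wrapping", "code signing"
    algorithm: str
    quantum_vulnerable: bool

inventory = [
    CryptoAsset("payments-api", "TLS handshake", "ECDHE-P256", True),
    CryptoAsset("payments-api", "bulk encryption", "AES-256-GCM", False),
    CryptoAsset("batch-archiver", "file encryption", "3DES", True),
]

# First migration targets: everything flagged quantum-vulnerable.
targets = [a for a in inventory if a.quantum_vulnerable]
for a in targets:
    print(f"{a.system}: {a.location} uses {a.algorithm}")
```

In practice the catalogue would be populated by scanning certificates, TLS configurations, and code, but even this flat structure lets you rank remediation work by data sensitivity and algorithm risk.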
Assess the sensitivity and longevity of your data. Not all data needs the same level of urgency. Data that will be public in five years and is not sensitive today is a lower priority. Data that must remain confidential for twenty years (long term contracts, personal identification records, health records) needs to be protected now with quantum resistant methods, or at minimum with hybrid approaches that add a post quantum layer on top of classical encryption.
Begin hybrid deployments for data in flight. Major cloud providers and CDN vendors already support hybrid key exchange in TLS. Enabling this configuration for internet facing services is a relatively low risk first step that provides immediate protection against harvest now, decrypt later attacks.
Plan your PKI migration. Identify your certificate authorities, understand your certificate inventory, and develop a migration plan for moving to post quantum signing algorithms. This is a long runway project given the dependencies on browser and OS trust stores, but the planning needs to start now.
Engage your hardware vendors. Ask your HSM vendors, network appliance vendors, and embedded system suppliers about their post quantum roadmaps. If they do not have credible answers, that should factor into your procurement decisions.
Build crypto agility into new systems. Every greenfield project should be designed from the outset to support algorithm agility. This is the easiest time to get it right.
Train your teams. Post quantum cryptography involves concepts that are unfamiliar to most engineers and architects. Building internal capability now pays dividends throughout the migration.
11. The Horizon
Quantum computing and post quantum cryptography are one of those rare convergences where the threat and the defence are both genuinely new. The mathematics is settled: we know what is broken, and we know what the replacements are. What remains is the enormous operational challenge of migrating the world’s technology infrastructure.
The organisations that treat this as an urgent priority today will be in a strong position as quantum capability advances. Those that wait for the threat to become immediate will face a chaotic scramble to protect data that is already potentially compromised.
We are not at the end of the encryption era. We are at a transition point, and the post quantum era is already beginning. The NIST standards are published. The algorithms are ready. The only question is how quickly we can deploy them.
The padlock on your digital life is being changed. The question for every organisation is whether they will do it on their own terms and timeline, or be forced to do it in a panic when the quantum threat arrives.
Andrew Baker is Chief Information Officer at Capitec Bank. He writes about enterprise architecture, cloud technology, and the future of banking at andrewbaker.ninja.