This month in Canada, every federal department was required to produce a plan for migrating to post-quantum cryptography, backed by annual reporting and firm deadlines: high-priority systems by 2031, and a full transition by 2035.
That may sound like policy alignment with global timelines. It’s not. What Canada has effectively done is force every agency to answer a much harder question: Where is your cryptography actually embedded? And, can you change it?
There’s still a tendency to frame post-quantum cryptography (PQC) as a straightforward upgrade: swap RSA and ECC for Kyber and Dilithium (standardized by NIST as ML-KEM and ML-DSA), update libraries, and move on. That’s neat in theory and misleading in practice.
Cryptography isn’t centralized. It’s scattered across systems, hardcoded into applications, buried in legacy protocols, tied into identity infrastructure, and often controlled by vendors. In many organizations, no one has a complete picture of where keys are generated, how encryption is enforced, or which systems quietly depend on outdated assumptions.
Canada’s mandate brings that reality to the surface. Before anything can be upgraded, it must be found. And that discovery process is where most of the real work lies today.
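What does discovery look like in practice? A crude but illustrative first pass is to scan configuration and source files for indicators of embedded cryptographic dependencies. The sketch below is hypothetical (the pattern names, file names, and categories are invented for illustration); a real inventory would also need binary analysis, certificate and key-store enumeration, and protocol-level scanning.

```python
import re

# Patterns that often signal embedded cryptographic dependencies.
# Illustrative only -- grep-style matching is a starting point, not
# a complete cryptographic inventory.
CRYPTO_PATTERNS = {
    "rsa_key": re.compile(r"\bRSA\b|BEGIN RSA PRIVATE KEY", re.IGNORECASE),
    "ecc_key": re.compile(r"\bECDSA\b|\bsecp256r1\b|\bP-256\b"),
    "legacy_hash": re.compile(r"\b(md5|sha1)\b", re.IGNORECASE),
}

def scan_text(name, text):
    """Return (file name, category) hits for crypto indicators in one file."""
    return [(name, category) for category, pattern in CRYPTO_PATTERNS.items()
            if pattern.search(text)]

# Example: two hypothetical config snippets.
findings = (scan_text("auth.cfg", "signature = RSA-2048; digest = sha1")
            + scan_text("tls.cfg", "curve = P-256"))
for finding in findings:
    print(finding)
```

Even a toy scan like this tends to surface the article's point: cryptographic assumptions turn up in places nobody was tracking.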
The “harvest now, decrypt later” threat has created urgency for PQC adoption. Adversaries are already collecting encrypted data with the expectation that it will be readable once quantum capability matures.
But there’s a quieter issue: for many systems, the migration timeline is already longer than the quantum timeline.
Think about long-lived infrastructure: embedded systems, industrial environments, and government platforms that were never designed for change. Add to that the complexity of PKI ecosystems, certificate chains, vendor dependencies, and performance constraints. Even with standardized algorithms, deploying them safely across production systems is not trivial.
Canada’s deadlines acknowledge this reality. The push to start now isn’t about algorithm readiness; it’s about system readiness.
The most important part of Canada’s move isn’t the 2035 endpoint. It’s the requirement that new digital procurements are PQC-ready and crypto-agile. That’s where the real acceleration happens.
Engineering-led migrations are slow and uneven. Procurement-driven change is systemic. When requirements shift at the point of purchase, vendors adapt, architectures evolve, and new systems are built with flexibility in mind. Over time, that matters more than any single upgrade cycle. It’s how large-scale transitions actually take hold.
Most current approaches to PQC fall into familiar patterns. Some organizations are adopting hybrid cryptography, layering classical and post-quantum algorithms together. Others are waiting for standards and implementations to fully mature before acting. Both approaches have merit. Neither fully addresses the underlying issue.
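The core idea of the hybrid approach can be shown in a few lines: derive the session key from both a classical shared secret and a post-quantum one, so the result stays safe as long as either input does. The sketch below is a simplified HKDF-style combiner using only the standard library; the salt and context labels are invented for illustration, and a production design would follow an established hybrid KEM construction rather than this minimal version.

```python
import hmac
import hashlib

def combine_secrets(classical_ss: bytes, pq_ss: bytes, context: bytes) -> bytes:
    """Derive a session key from two independent shared secrets.

    The output remains secret as long as EITHER input secret does --
    the property that motivates hybrid deployments.
    """
    ikm = classical_ss + pq_ss  # concatenate both shared secrets
    # HKDF-extract-like step (salt is an illustrative fixed label)
    prk = hmac.new(b"hybrid-kdf-salt", ikm, hashlib.sha256).digest()
    # HKDF-expand-like step, bound to a context string
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()
```

The point of the construction is defense in depth: a break of the classical exchange (by a quantum adversary) or of the post-quantum one (by cryptanalysis of a young algorithm) does not, on its own, expose the session key.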
They assume cryptography will remain embedded in systems and periodically swapped out. That assumption leads to repeated cycles of disruption; every time algorithms change, systems must change with them.
Canada’s roadmap hints at a different requirement: not just stronger algorithms, but the ability to adapt continuously. That’s where crypto-agility comes in. Crypto-agility is often described as the ability to swap algorithms. In reality, it’s much more demanding.
It means decoupling cryptography from the systems that depend on it. It means being able to update security policies without rewriting applications. It means treating encryption as something that can evolve independently, not something baked into infrastructure.
Without that separation, every future transition (quantum or otherwise) becomes another multi-year engineering effort. With it, change becomes manageable.
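The decoupling described above can be sketched as a simple pattern: application code depends on a named policy, and the policy maps to an implementation in one place. All names here are hypothetical, and simple hash functions stand in for real signature or encryption schemes; the point is the indirection, not the algorithms.

```python
import hashlib
from typing import Callable, Dict

# Hypothetical registry: algorithm choices live behind a name, so a
# policy update (one dict entry) swaps implementations without
# touching application code. Hashes stand in for real schemes.
SIGNERS: Dict[str, Callable[[bytes], bytes]] = {
    "classical-demo": lambda msg: hashlib.sha256(msg).digest(),
    "pq-demo": lambda msg: hashlib.sha3_256(msg).digest(),
}

ACTIVE_POLICY = "classical-demo"  # one line of config, not an app rewrite

def protect(message: bytes) -> bytes:
    """Application code depends only on the active policy,
    never on a specific algorithm."""
    return SIGNERS[ACTIVE_POLICY](message)
```

A system built this way can migrate by changing `ACTIVE_POLICY` (or adding a registry entry) rather than rewriting every caller, which is the difference between a configuration change and a multi-year engineering effort.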
This is where the industry starts to diverge.
One path is to retrofit existing infrastructure: update libraries, replace certificates, and work through systems one by one. It’s necessary in many cases, but it’s slow, expensive, and difficult to scale. The other path is to rethink how cryptography is delivered altogether.
Arqit’s approach sits firmly in that second category. Rather than relying on public-key exchange mechanisms that need to be upgraded over time, it focuses on delivering symmetric encryption keys at scale, managed centrally and applied dynamically. The goal is not just to be quantum-safe today, but to remove long-term dependency on vulnerable key exchange models entirely.
In practical terms, that means security can evolve without requiring every underlying system to change. It shifts the burden away from infrastructure and into a layer that can be updated independently.
Canada’s regulations are a signal that the timeline is no longer theoretical. Planning is required now. Procurement is changing now. Progress will be measured now.
The organizations that respond effectively won’t treat this as a compliance exercise. They’ll use it as an opportunity to understand their systems more deeply, to design for flexibility, and to reduce long-term risk, not just from quantum threats, but from the inevitability of future cryptographic change.
The industry has spent years asking which algorithms will win. That’s not the right question anymore. The real question is whether your systems can survive continuous cryptographic change.
Canada has made it clear that this isn’t a future problem. It’s a present one. And for many organizations, the hardest part isn’t what comes next. It’s understanding what they’ve already built.
28 April 2026