Global Quantum Software Research Agenda
(2026–2035)
Quantum Software Alliance
February 7, 2026
Abstract
Quantum computing is transitioning from laboratory prototypes to systems integrated with high-performance computing (HPC) and AI. Practical utility depends on hardware scaling and software capability. Hardware must increase in physical scale and fidelity. Software must translate these physical resources into logical performance. This requires a software stack that integrates compilers, error correction, and rigorous verification. Simultaneously, algorithmic discovery and optimization are necessary to identify impactful applications. These algorithms must be co-designed with the software stack to ensure the most efficient use of available hardware. This Agenda defines a scientific plan to advance these capabilities over the next decade (2026–2035). We aim to translate research into sector-level impact through auditable pilots. The Quantum Software Alliance maintains this Agenda as a neutral, community-led resource, updated annually. It reflects collective research priorities and does not reproduce any single institutional or national programme plan.
Executive Summary
Quantum computing is entering a decisive moment. The field has progressed from theoretical exploration to the systematic construction of processors that grow in scale and capability. While hardware roadmaps [1] provide clear milestones for physical scaling, software research remains fragmented. Unlike classical software engineering, the field lacks a coordinated infrastructure to capitalize on breakthroughs in high-performance computing (HPC) and artificial intelligence (AI). Currently, no shared trajectory exists to synchronize quantum software readiness with hardware delivery.
To bridge this gap, we must establish a software roadmap based on rigorous resource estimation. This framework connects fundamental research with current hardware realities, data, and end-users while remaining forward-looking; it embraces new methods and ensures that long-term algorithmic innovation is supported alongside the development of practical tools for today’s processors [2]. This approach integrates algorithm discovery, error mitigation, error correction, and optimal compilation to define the specific hardware performance bounds, such as qubit counts, gate fidelities, connectivity, and latencies, required to identify and solve the full set of impactful use-cases [3]. The frontier has shifted from proof-of-principle demonstrations toward the co-development of tools that show practical utility on real devices at the scale of hundreds of qubits. This transition is critical as society, industry, and government now demand useful quantum applications [4]. By defining these technical requirements, software sets the targets that hardware must meet to deliver a stack that is rigorously verified, tested, benchmarked, and trusted, and that provides certifiable utility.
The Quantum Software Alliance (QSA) will act as the global alliance for quantum software developers and users. We commit to a future in which quantum software is practical, collaborative, and integrated with AI and HPC through open standards and shared benchmarks. This Agenda defines core Capabilities for the software stack and shows how they translate into Sector Programmes. Each capability includes measurable outcomes and a year-on-year progression. The Agenda places error correction and fault-tolerant operation within the software remit, including code selection, syndrome scheduling, and decoder integration. To ensure results are testable on real hardware, the QSA will maintain a living artefact repository including intermediate representation (IR) conformance tests and reference workloads. The QSA maintains this Agenda as a neutral, community-led document, updated annually to reflect the evolving evidence base.
Purpose and Scope
This Agenda is the community reference for what quantum software capabilities to build, how to measure progress credibly, and where to apply those capabilities so that research and development produces reliable value. It responds to the current phase of quantum computing, in which practical utility depends on software that is resource-realistic, verifiable, and engineered for hybrid operation with high-performance computing (HPC). It recognises that the economic opportunity is concentrated in the software and application layer but can only be captured through a coordinated international effort. To operationalise this, the QSA will facilitate a global open platform that federates existing community repositories into a neutral, one-stop index of code, datasets, benchmarks, standards drafts, and testbed access procedures. Rather than duplicating existing infrastructure, the Alliance provides a unified discovery layer where registration and conformance tracking are managed by topic-area task forces. This index points to a living registry of reference workloads and verification-ready pipelines, providing a collaboration framework that connects end-users and research teams for sector pilots.
The Agenda operationalises the software roadmap by mapping a ten-year progression of capabilities (2026–2035). It sets technical readiness standards by defining the milestones, including resource estimation targets and verification-ready outputs, required to synchronize software readiness with hardware scaling. These metrics, such as conformance status and cross-testbed replication, allow the community to evaluate whether research has reached the maturity needed for deployment. Scope spans the full software stack needed to convert hardware progress into dependable applications:
- Algorithms and Methods: Complexity results and discovery for credible advantage.
- Languages and Compilers: Intermediate representations with formal semantics and conformance tests.
- Verification and Benchmarking: Protocols that tie device metrics to application-level performance.
- Systems and Architectures: Error-correction frameworks, hybrid HPC runtimes, and distributed quantum networking.
- Cross-cutting Enablers: Resource-estimation models, open-science infrastructure, and workforce development.
The intended audience includes policymakers, funders, national laboratories, HPC facilities, standards bodies, and the academic and industrial software communities. To ensure long-term coordination, the Alliance operates under a General Assembly and Steering Committee. Geographically balanced task forces curate the index and its references, ensuring that technical advances remain portable, comparable, and verifiable across diverse regions and hardware modalities.
Vision and Guiding Principles
Quantum computing is entering a decisive phase in which progress depends on building the structures that turn discovery into dependable capability. The Alliance’s vision is not to chase premature end-user solutions, but to operate an engine for discovery in which ideas become rigorous frameworks. These frameworks are integrated into algorithmic and programming environments, evaluated through verification and benchmarking, and connected to high-performance computing, cloud systems, and AI technologies [5]. In this model, co-development built on a foundation of core algorithmic research is the norm: system and compiler design reflect device constraints, decoders and runtime control are part of the software remit, and verification informs the definition of future hardware performance targets. This approach moves the community beyond proof-of-principle demonstrations toward practical software that runs on real devices with credible resource models and testable claims.
The guiding principle is systematic rather than opportunistic discovery. New algorithms and protocols are developed with transparent scaling analysis and open resource‑estimation models that map design choices to logical depth, qubit counts, runtimes, and energy budgets. Translation is demand‑driven yet readiness‑aware: application candidates are framed against the best classical baselines, evaluated on cross‑testbed pipelines, and progressed only when verification criteria are met. Adaptivity is explicit because hardware progress is non‑linear; error‑correction milestones, connectivity, and control advances can accelerate or resequence feasibility, so priorities are revisited as evidence accumulates. These practices align with a verification‑first workflow in which benchmarking is derived from formal properties, assurance links device metrics to application performance, and reproducibility is achieved through open artefacts and conformance tests.
The Alliance will serve as a neutral, global platform that connects researchers, testbeds, standards bodies, and end-users. Its role is to curate open intermediate representations, compilers, datasets, benchmarks, decoders, and orchestration software, and to maintain a source‑of‑truth catalogue of artefacts and teams so collaboration is straightforward across regions and modalities. Governance ensures a lean process for annual updates of the Agenda and artefact releases, with geographically balanced leadership and task forces that can act quickly when the evidence base shifts. This open platform mandate and cadence follow directly from the QSA Charter.
This vision fixes the structure of the Agenda. The following chapters specify the software Capabilities the community should build, the Cross-cutting Enablers that make them portable, and the Application roadmaps that convert these capabilities into sector value. Each capability is accompanied by measurable outcomes across near- and long-term horizons, establishing a ten-year trajectory to align policy, funding, and procurement with credible milestones.
Software Capabilities
C1: Algorithms and Methods
Algorithms determine whether quantum processors deliver practical value and are the foundational capability upon which the entire software stack depends. This capability defines the trajectory of the Agenda: to identify where quantum advantage arises and to set the resource budgets that hardware must meet. If algorithmic discovery fails to demonstrate a verifiable advantage over classical methods, the remaining software layers lose their utility. Progress requires foundational research into new algorithmic principles alongside the continued investigation of quantum-specific methods that can open new application domains.
Near term focus. Design algorithms for problems that matter to end-users in industry and academia and that can yield a robust advantage on relatively small and noisy devices. Promising families include modelling of physical, material and chemical systems, approximate optimisation, and machine learning tasks amenable to hybrid execution. Development covers design, implementation, and validation on hardware at the available scale.
Longer term focus. As devices mature, priority lines include quantum simulation and scientific computing, specialized data analysis and machine learning approaches, amplitude amplification and estimation with related Monte Carlo accelerators, optimisation methods and problem relaxations, search problems, and linear-system and differential-equation solvers. It will be important that quantum advantages are robust and do not concern only highly fine-tuned instances. Complexity and lower bound studies remain essential to identify genuine advantage and to set resource budgets for early fault-tolerant routes.
Methodology. End-to-end baselines against the best classical methods and open resource estimation tools that map algorithmic choices to logical qubits, depth, gate counts, wall clock time, and energy, including the classical computing needed for control and error decoding.
Verification oriented development integrates measurable acceptance criteria and assurance pipelines so that correctness and performance are testable on real hardware.
Deliverables. Reference implementations with explicit resource models; domain specific standardized subroutines; interoperable toolchains that connect algorithm design to compilers, runtimes, and data management; reproducible results with classical baselines engineered for hybrid HPC orchestration.
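To make the resource-estimation methodology concrete, the sketch below maps a hypothetical algorithm profile to logical qubits, physical qubits, runtime, and energy under a simple surface-code machine model. All parameters (code distance, cycle time, power draw, factory overhead) are illustrative assumptions rather than measured values; a community tool would replace them with calibrated data.

```python
"""Minimal resource-estimation sketch (illustrative only).

Maps a few algorithmic choices to logical-qubit counts, runtime, and energy,
in the spirit of the open estimation tools called for in C1. The machine
parameters below are hypothetical placeholders, not measured values.
"""
from dataclasses import dataclass


@dataclass
class AlgorithmProfile:
    logical_qubits: int   # qubits required by the algorithm
    logical_depth: int    # sequential logical time steps
    t_count: int          # total T gates (magic-state consumption)


@dataclass
class MachineModel:
    code_distance: int            # assumed surface-code distance
    cycle_time_s: float           # one syndrome-extraction round (assumed)
    power_w: float                # cryostat + control + decoding power (assumed)
    factory_qubit_overhead: int   # extra logical qubits for magic-state factories


def estimate(alg: AlgorithmProfile, machine: MachineModel) -> dict:
    # One logical time step is roughly code_distance rounds of syndrome extraction.
    logical_step_s = machine.code_distance * machine.cycle_time_s
    runtime_s = alg.logical_depth * logical_step_s
    total_logical_qubits = alg.logical_qubits + machine.factory_qubit_overhead
    # Physical qubits per logical qubit is roughly 2 * d^2 for a surface-code patch.
    physical_qubits = total_logical_qubits * 2 * machine.code_distance ** 2
    energy_kwh = machine.power_w * runtime_s / 3.6e6
    return {
        "logical_qubits": total_logical_qubits,
        "physical_qubits": physical_qubits,
        "runtime_s": runtime_s,
        "energy_kwh": energy_kwh,
        "t_count": alg.t_count,
    }


if __name__ == "__main__":
    report = estimate(
        AlgorithmProfile(logical_qubits=100, logical_depth=10**6, t_count=10**8),
        MachineModel(code_distance=21, cycle_time_s=1e-6, power_w=2.5e4,
                     factory_qubit_overhead=50),
    )
    print(report)
```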
C2: Languages, Intermediate Representations, and Compilers
Intermediate representations with formal semantics must bridge high level programs to hardware aware back ends. Quantum compilers translate high level algorithms or model specifications into runnable code on specific devices; they optimise to reduce resource overhead; they perform architecture aware transformations including routing, synthesis, scheduling, and the introduction and management of error correction [6]; and they verify, to the greatest extent possible, that a program implements the intended algorithm before execution.
Progress requires both hardware agnostic techniques at higher levels of abstraction and hardware sensitive techniques that account for topology, control constraints, calibration data, and error models. Compilation is also a vehicle for assurance: the toolchain should surface proof obligations, enable equivalence checking, and produce artefacts that link algorithmic choices to logical resource estimates and device level constraints so that claims about correctness and performance are testable. Open and interoperable IR ecosystems with conformance tests are necessary to ensure portability and fair comparison and to prevent lock in.
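As a minimal illustration of the equivalence-checking role described above, the sketch below checks that a "compiled" two-qubit gate sequence implements the same unitary as its reference program up to a global phase. The gate set and program are hypothetical; a real IR conformance suite would run such checks over a corpus of programs and scale beyond brute-force unitary comparison.

```python
"""Minimal IR conformance check (illustrative sketch).

Verifies that a compiled gate sequence implements the same unitary as the
reference sequence, up to a global phase, for a tiny two-qubit program.
"""
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CZ = np.diag([1, 1, 1, -1]).astype(complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)


def unitary(sequence):
    """Multiply two-qubit gate matrices in execution order."""
    u = np.eye(4, dtype=complex)
    for gate in sequence:
        u = gate @ u
    return u


def equivalent_up_to_phase(u, v, tol=1e-9):
    """True if u = e^{i phi} v for some global phase phi."""
    # Compare via the normalised overlap |tr(u^dagger v)| / dim.
    overlap = abs(np.trace(u.conj().T @ v)) / u.shape[0]
    return abs(overlap - 1.0) < tol


# Reference program: a single CNOT (control = qubit 0).
reference = [CNOT]
# "Compiled" program for a CZ-native backend: H on target, CZ, H on target.
compiled = [np.kron(I, H), CZ, np.kron(I, H)]

assert equivalent_up_to_phase(unitary(reference), unitary(compiled))
print("conformance check passed")
```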
C3: Verification, Assurance, and Benchmarking
Verification and testing are design drivers. Classical simulation does not scale to certify large quantum devices, so trustworthy performance requires new theoretical ideas and engineering practices. Future architectures should be shaped by their testability as well as by throughput, with verification hooks exposed at every layer of the stack and integrated with error correction and decoding.
Assurance is layered. At the specification level, program logics, type systems, and property checking establish correctness where possible and surface proof obligations that can be carried through compilation to the device [7]. At the protocol level, cryptographic and interactive verification techniques allow small trusted components to bootstrap trust in larger systems and enable verifiable delegation. Benchmarking is verification inspired: formal tests are compiled into executable workloads that produce platform agnostic capability certificates and link hardware metrics and logical error rates to application level performance. Device characterisation and diagnosis on the hardware level using tools such as randomized benchmarking feed compilers, decoders, and schedulers.
Deliverables. A unified assurance pipeline from specification to hardware; verification inspired benchmark suites with open artefacts; IR conformance tests; and design for testability guidelines exercised across multiple platforms and networked settings.
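One possible shape for the capability certificates mentioned above is sketched below: a single auditable record that ties device metrics, application-level results, and the acceptance criteria they were tested against, sealed with a content hash. The schema and field names are illustrative assumptions, not a QSA standard.

```python
"""Sketch of a platform-agnostic capability-certificate record (illustrative).

Packages benchmark outcomes, device metrics, and the acceptance criteria they
were tested against into one auditable artefact. Field names are assumptions.
"""
import hashlib
import json
from datetime import datetime, timezone


def issue_certificate(device_id, workload_id, device_metrics, results, criteria):
    # Acceptance criteria are interpreted here as upper bounds on error metrics.
    passed = all(results[name] <= bound for name, bound in criteria.items())
    record = {
        "device": device_id,
        "workload": workload_id,
        "issued": datetime.now(timezone.utc).isoformat(),
        "device_metrics": device_metrics,    # e.g. median two-qubit gate error
        "results": results,                  # application-level observables
        "acceptance_criteria": criteria,     # upper bounds, in domain units
        "passed": passed,
    }
    # Content hash so the artefact can be referenced and audited later.
    record["sha256"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record


cert = issue_certificate(
    device_id="testbed-A",
    workload_id="reference-workload-v1",
    device_metrics={"two_qubit_error": 3e-3, "t1_us": 120.0},
    results={"tvd_vs_ideal": 0.03},
    criteria={"tvd_vs_ideal": 0.05},
)
print(json.dumps(cert, indent=2))
```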
C4: Architectures, Fault Tolerance, and Hybrid Quantum–HPC Systems
This capability defines the systems layer that makes quantum processors usable at scale. It spans architectural choices, error correction and fault-tolerant operation, classical co-processing for decoding and control, and hybrid orchestration with high-performance computing. The goal is predictable performance on real hardware under explicit constraints of noise, latency, and energy.
Architectures and programming models. Provide models that map high level descriptions to physical constraints, including compilation, routing, scheduling, and mapping to limited control surfaces. Plan for modular systems built from interconnected parts with partitioning, placement, and resource aware communication, and with verification hooks for testability.
Error correction and fault-tolerant computing. Select codes, schedule syndrome extraction, integrate decoders that translate syndrome information into corrective operations within runtime paths, and link logical error rates to application level performance [8]. Near term variants tailored to small qubit counts reduce overheads and prepare subsystems for subsequent scaling.
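The decoding step assigned to the software stack can be illustrated with a toy example. The sketch below decodes the bit-flip syndrome of a three-qubit repetition code with a lookup table; production systems use surface codes with matching or belief-propagation decoders under strict latency budgets, so this shows only the interface.

```python
"""Minimal decoder sketch for a 3-qubit repetition code (illustrative).

Shows the software step C4 assigns to the runtime: turning syndrome bits into
a corrective operation. A toy lookup decoder, not a production algorithm.
"""

# Syndrome = (parity of qubits 0,1 ; parity of qubits 1,2) for bit-flip errors.
SYNDROME_TO_CORRECTION = {
    (0, 0): None,   # no error detected
    (1, 0): 0,      # flip qubit 0
    (1, 1): 1,      # flip qubit 1
    (0, 1): 2,      # flip qubit 2
}


def decode(syndrome):
    """Return the qubit index to apply an X correction to, or None."""
    return SYNDROME_TO_CORRECTION[tuple(syndrome)]


def apply_correction(data, qubit):
    if qubit is not None:
        data[qubit] ^= 1
    return data


# Example: logical |0> encoded as [0, 0, 0], bit-flip error on qubit 1.
data = [0, 1, 0]
syndrome = (data[0] ^ data[1], data[1] ^ data[2])
print("syndrome:", syndrome)
print("corrected:", apply_correction(data, decode(syndrome)))
```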
Hybrid orchestration with HPC. Treat quantum as a first class accelerator [9]. Schedulers, workflow engines, and data management coordinate CPU, GPU, and QPU resources. Runtimes support real-time decoding paths and expose device agnostic job semantics. Orchestration integrates error aware simulation for pre- and post-processing, keeps energy and latency budgets explicit, and enables federation across facilities. Programming models allow quantum accelerated subroutines to be embedded in established HPC codes with audit trails for reproducibility.
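The orchestration pattern described above can be summarised as a workflow whose stages declare the resource they need and a latency budget, with an audit trail recorded per stage. The sketch below is a stand-in for a real scheduler; stage names, budgets, and payloads are hypothetical.

```python
"""Hybrid workflow sketch with explicit latency budgets (illustrative).

Each stage declares the resource it needs (CPU, GPU, QPU) and a latency
budget; the runtime records whether each stage stayed within budget so the
run remains auditable. Stage implementations are placeholders.
"""
import time


def run_workflow(stages):
    audit_trail = []
    for name, resource, budget_s, task in stages:
        start = time.perf_counter()
        result = task()
        elapsed = time.perf_counter() - start
        audit_trail.append({
            "stage": name,
            "resource": resource,
            "elapsed_s": round(elapsed, 6),
            "within_budget": elapsed <= budget_s,
            "result": result,
        })
    return audit_trail


stages = [
    ("pre-process",  "CPU", 1.00, lambda: "hamiltonian prepared"),
    ("circuit-exec", "QPU", 0.50, lambda: {"counts": {"00": 480, "11": 520}}),
    ("decode",       "GPU", 0.01, lambda: "corrections applied"),
    ("post-process", "CPU", 1.00, lambda: {"energy": -1.137}),
]
for record in run_workflow(stages):
    print(record)
```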
Deliverables. Reference architecture and programming model for hybrid operation; open compiler pipeline targeting code families and logical instruction sets; decoder implementations with documented accuracy, latency, and energy profiles; runtime support for real-time decoding and device-agnostic job execution; verification-ready artefacts that link hardware metrics and logical error rates to application level outcomes.
C5: Distributed or Networked Quantum Computing and Security
At scale, useful workloads will involve multi-node execution, remote access, and secure delegation. A global quantum network architecture will distribute entanglement over terrestrial and space links and allow distributed quantum computation coordinated by long-range classical communication [10]. Such networks enable security properties that are not achievable with classical communication alone, including information theoretic guarantees for tasks such as uncloneability, certified deletion, and device or position verification. Networked algorithms must be designed to respect signalling constraints implied by relativity while using entanglement, teleportation, and error correction to reduce communication where possible. The software stack should therefore treat computation and communication as one design problem and optimise responsiveness to data generated in real-time across the network.
Software support is needed for routing and scheduling across heterogeneous nodes, authenticated access and privacy-preserving orchestration, and verifiable delegation protocols that make remote services certifiable [11]. Proofs of quantumness provide an operational path to demonstrate that cloud services are using quantum resources and can anchor service level agreements. A network operating layer should expose explicit latency and bandwidth budgets, manage entanglement distribution and buffering, and record evidence required for compliance and audit. Conformance tests and capability certificates should extend naturally from single node settings to multi-node execution so that performance claims remain portable and comparable.
Large general-purpose quantum computers will break the current widely deployed public-key cryptography. Migration to quantum-safe alternatives is therefore a mandatory, long-duration systems task. Initial standards have been published and their adoption will take years [12]. The underlying assumptions are largely lattice based and hash based, and increased academic work is required at the interface of quantum algorithms and the relevant mathematics, including revisiting classical proof techniques that do not carry over to quantum attackers. Software deliverables for this migration include parameter-selection tools, reference implementations integrated with mainstream libraries, and crypto agile architectures that allow algorithms and parameters to be updated without service disruption.
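One way to realise the crypto-agile architecture called for above is a policy indirection layer: services request keys by policy name, and the mapping from policy to algorithm can be rotated centrally without touching application code. The sketch below uses standardised algorithm names (e.g. ML-KEM-768 from FIPS 203) purely as labels; the key generation shown is a non-cryptographic placeholder.

```python
"""Crypto-agility sketch (illustrative, non-cryptographic).

Applications request a key-encapsulation mechanism through a named policy,
so algorithms and parameters can be rotated without service disruption.
The registered names refer to standardised algorithms, but the key material
here is a placeholder for structure only.
"""
import os


class KemProvider:
    def __init__(self, name, keygen):
        self.name = name
        self.keygen = keygen


# Placeholder key generators; a real deployment binds to vetted libraries.
REGISTRY = {
    "ML-KEM-768":  KemProvider("ML-KEM-768",  lambda: os.urandom(32)),
    "ML-KEM-1024": KemProvider("ML-KEM-1024", lambda: os.urandom(32)),
}

# Policy indirection: services reference a policy, not an algorithm.
POLICY = {"default-kem": "ML-KEM-768"}


def new_key(policy_name):
    provider = REGISTRY[POLICY[policy_name]]
    return provider.name, provider.keygen()


algorithm, key = new_key("default-kem")
print("issued key under", algorithm, "-", len(key), "bytes")

# Rotation: update the policy table, not the calling services.
POLICY["default-kem"] = "ML-KEM-1024"
print("policy now points to", POLICY["default-kem"])
```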
Quantum communication devices for key distribution and random-number generation already exist, but wider uptake depends on certification profiles, interoperable standards, and well-defined interfaces to post-quantum cryptographic schemes. The software stack should provide network components that combine these quantum primitives with post-quantum cryptography in a coherent, auditable system. Discovering where quantum information techniques provide genuine security advantages remains an open research frontier, especially for tasks with quantum inputs or outputs and for large-scale networks where light speed signalling constraints matter. Progress will be fastest when physicists, engineers, security experts, and cryptographers work to common specifications that tie algorithmic choices to measurable end-to-end outcomes.
Cross Cutting Enablers
Standards and Interoperability
The current quantum computing landscape is highly fragmented due to vendor-specific abstractions, vertically integrated toolchains, and proprietary programming models. This duplication of effort and lack of standardized benchmarks impede growth. Standards and interoperability are thus critical to building a cohesive global quantum-software ecosystem. Specifically, open, consensus-driven specifications are required to govern language and intermediate representations (IRs), compilers, runtime and job APIs, benchmark schemas, and crypto agility [13]. Crucially, mandatory conformance testing must be fully integrated across all modalities and hardware providers. Over the next decade, global actors will need to align around open, consensus-driven specifications that allow quantum software to be written once and executed across diverse backends. Key elements of this effort will include: common quantum-software abstractions and IRs; interoperable runtime and job-submission APIs; benchmarking metrics and verification frameworks; and security, cryptography, and governance standards.
The Quantum Software Alliance (QSA) must serve as a neutral convener, bridging academia, industry, national laboratories, and international standards organizations. By fostering pre-competitive collaboration, curating best practices, mapping standardisation gaps, and designing roadmaps, QSA will coordinate global efforts to ensure the quantum software ecosystem develops with core principles of openness, portability, and security.
Resource Estimation and Cost Models
Accurate and transparent resource estimation will be essential for evaluating quantum algorithms, planning system architectures, and guiding national investments over the next decade. As quantum computation progresses toward fault-tolerance, stakeholders require transparent, community-driven tools that map high-level algorithmic designs to detailed costs, including logical qubit counts, error-correction overheads, runtime under noise, and energy consumption. These tools will create a shared, comparable language between algorithm designers, compiler engineers, and hardware providers, directly informing procurement and policy decisions [14].
Simultaneously, cost models must expand beyond technical resources to include operational considerations such as energy consumption, cooling requirements, latency in hybrid quantum-classical workflows, and total cost of ownership for quantum infrastructure. Standardizing these estimation frameworks will improve clarity for policymakers, procurement officers, and funding agencies, allowing them to assess feasibility, prioritize research directions, and allocate resources effectively. By fostering interoperable tools, shared datasets, and reference workflows, the community can build a consistent methodology for projecting the real-world impact and scalability of quantum computation.
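A back-of-envelope operational cost model in the spirit of this subsection is sketched below, combining energy, cooling overhead, and amortised capital into a per-workload total cost of ownership. Every number (power draw, cooling fraction, electricity price, system cost, amortisation period) is an illustrative assumption, not survey data.

```python
"""Back-of-envelope operational cost model (illustrative).

Extends a technical resource estimate with operational terms: energy,
cooling, and amortised capital. All prices and overheads are placeholders.
"""


def workload_cost(runtime_hours,
                  system_power_kw=25.0,         # QPU + control electronics (assumed)
                  cooling_overhead=0.4,         # extra energy fraction for cryogenics (assumed)
                  electricity_per_kwh=0.20,     # currency units per kWh (assumed)
                  capital_cost=2.0e7,           # system price (assumed)
                  amortisation_hours=5 * 8760): # 5-year straight-line amortisation
    energy_kwh = runtime_hours * system_power_kw * (1 + cooling_overhead)
    energy_cost = energy_kwh * electricity_per_kwh
    capital_share = capital_cost * runtime_hours / amortisation_hours
    return {
        "energy_kwh": energy_kwh,
        "energy_cost": round(energy_cost, 2),
        "capital_share": round(capital_share, 2),
        "total": round(energy_cost + capital_share, 2),
    }


print(workload_cost(runtime_hours=12.0))
```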
Open Science Infrastructure and Workforce
Expanding open science infrastructure will be vital to ensuring transparency, reproducibility, and equitable participation in quantum software research. Sustained investment in open-source stacks, reference implementations, and standardized datasets enables validation and benchmarking. Furthermore, interoperable cross-testbed harnesses—which allow the same workflow to execute on diverse hardware backends—will accelerate rigorous experimental comparisons and best-practice development. Long-term funding models are needed to strengthen these shared resources, foster community-driven governance, and ensure continuity as platforms evolve.
In parallel, the global quantum software workforce must grow in both size and breadth. Quantum software engineering increasingly depends on expertise in classical domains such as compilers, programming languages, formal verification, distributed systems, cloud orchestration, and cybersecurity. Education and training programs should therefore integrate these disciplines alongside quantum information fundamentals, offering students and professionals accessible pathways into the field. Collaborative initiatives—such as international summer schools, coordinated curriculum frameworks, shared teaching materials, and executive-level training programs—will help harmonize competencies across regions and strengthen global capacity.
QSA can provide guidance for developing and growing robust open science infrastructure and a well-prepared workforce as foundations for sustainable progress in quantum software, enabling reproducible, secure, and globally-inclusive innovation.
Community and End-User Engagement
Quantum software should be grounded in real needs. Sustained community and end-user engagement is essential to delivering measurable value. Currently, many potential users—from scientists to industry experts—lack clear pathways to articulate requirements or evaluate capabilities. Building mechanisms for ongoing dialogue helps align research priorities with real-world demand, preventing misallocation of effort and accelerating adoption where quantum advantage is plausible. Community and end-user engagement includes shared datasets, open benchmarks, procurement pilots structured with audit trails and reproducibility requirements, and participation in standards development. These enable diverse stakeholders to explore applications and contribute domain expertise and allow end-users to shape specifications, tooling requirements, and interoperability expectations.
QSA can serve as a neutral reference for cultivating and curating these forms of engagement, fostering an inclusive, application-driven ecosystem that ensures global investments address genuine societal and industrial needs.
Ethics and Responsible Communication
Ethical and responsible communication are critical to maintaining public trust, guiding informed policy, and preventing unrealistic expectations. It is essential to clearly distinguish between demonstrated results, plausible advances, and speculative projections. Transparent reporting of assumptions, uncertainties, noise models, and methodological limitations is required to ensure decision makers and end-users can accurately interpret claims and make evidence-based choices. A commitment to verification-first practices—including reproducible benchmarks, documented pipelines, and open artefacts—strengthens accountability. Publishing reference implementations, datasets, and experimental logs enables the community to validate findings and refine methodologies collaboratively. This openness reduces the risk of overstated capabilities and accelerates technical progress by making empirical foundations widely accessible. Responsible communication must also address broader ethical dimensions, including equitable access to knowledge, the societal implications of quantum computation, and minimizing hype-driven distortions in public discourse. Coordinated guidelines, community norms, and training in science communication can help researchers, companies, and policymakers articulate realistic expectations while highlighting genuine opportunities.
By institutionalizing these practices, QSA can promote integrity, foster trust, and ensure that the field’s development is grounded in rigor, transparency, and societal responsibility.
Applications and Sector Roadmaps
Capabilities translate into sector impact through application driven validation loops. Representative problem kernels are selected, classical baselines are established, algorithms and toolchains are designed with explicit resource models [15], hybrid execution is orchestrated, and outcomes are verified with benchmarking pipelines. Sectors follow a common structure to enable comparability.
Health and Life Sciences
Quantum software targets molecular modelling and discovery workflows, including ground and excited state energies, reaction pathways, and ligand–receptor interactions. Near-term emphasis is on hybrid simulation and embedding strategies that keep classical pre- and post-processing explicit and move only the computationally expensive quantum tasks to hardware. To evaluate performance, we identify representative kernels, the fundamental computational subroutines that capture the core mathematical challenges of the sector. These include ab-initio and data-assisted quantum computations of binding energies, adiabatically assisted variational eigensolvers for strongly correlated fragments, and quantum-enhanced Monte Carlo for rare-event sampling. Compiler support must coordinate basis choices, mappings, and error-aware synthesis, while exposing resource models that report qubits, depth, and wall-clock time. Verification uses domain-specific units, such as chemical accuracy with uncertainty budgets, and results must be replicated across testbeds against mandatory classical baselines. Privacy-preserving workflows and quantum-safe data pipelines are gating requirements for any deployment in clinical settings.
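The acceptance test in domain units described above can be expressed very compactly: a quantum estimate is accepted only if its deviation from the trusted reference, plus its reported uncertainty budget, stays within chemical accuracy. The energies in the sketch below are hypothetical.

```python
"""Acceptance check in domain units for a chemistry kernel (illustrative).

A quantum estimate of an energy is accepted only if its error versus the
trusted reference stays within chemical accuracy (~1.6 mHa) after adding the
statistical uncertainty budget. The numbers are hypothetical.
"""
CHEMICAL_ACCURACY_HARTREE = 1.6e-3


def accept(estimate, uncertainty, reference, threshold=CHEMICAL_ACCURACY_HARTREE):
    # Worst-case error: observed deviation plus the reported uncertainty budget.
    worst_case_error = abs(estimate - reference) + uncertainty
    return worst_case_error <= threshold


# Hypothetical ground-state energies in Hartree.
print(accept(estimate=-1.1367, uncertainty=3e-4, reference=-1.1373))  # True
print(accept(estimate=-1.1340, uncertainty=3e-4, reference=-1.1373))  # False
```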
Materials and Advanced Manufacturing
For catalysis, battery systems, and strongly correlated matter, progress depends on hybrid strategies that combine disentangling transforms and embedding with compiler support for materials primitives and error‑aware synthesis [16]. Domain‑driven workflows couple dynamical mean‑field methods and other advanced classical solvers with quantum subroutines for spectral features and barrier heights. Verification‑inspired benchmarking anchors claims: spectral lines, reaction barriers, and correlation functions are compared to trusted references, results are replicated across testbeds, and resource estimates are reported with transparent assumptions and energy budgets. Early fault‑tolerant routes based on linear‑algebra primitives and the quantum singular value transformation (QSVT) can be exercised on a small set of community‑curated materials problems with open artefacts and explicit logical error targets.
Energy and Climate Systems
Energy systems combine optimisation, simulation, and data-driven modelling. Target workloads include unit-commitment variants (the scheduling of power generation units to meet grid demand), optimal power-flow under network constraints, short-term forecasting for renewables, and discovery of carbon-capture materials. Software priorities are robust encodings for network-constrained optimisations, hybrid orchestration that meets latency and bandwidth budgets for grid operations, and assurance pipelines suitable for safety-critical deployments. Benchmarks report optimisation-gap reductions and forecast-skill deltas at fixed reliability thresholds, all against strong classical baselines, with explicit reporting of wall-clock, energy, and memory costs. Materials-focused sub-problems reuse the simulation kernels from the materials sector, enabling cross-sector reproducibility.
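A minimal reporting harness for the optimisation-gap metric named above is sketched below: runs are filtered by a fixed feasibility (reliability) threshold before gaps relative to the best known classical solution are reported. Costs and thresholds are placeholders.

```python
"""Optimisation-gap reporting sketch for energy-sector benchmarks (illustrative).

Computes the relative gap to the best known classical solution and reports it
only for runs meeting a fixed reliability (feasibility) threshold.
"""


def optimisation_gap(candidate_cost, best_known_cost):
    return (candidate_cost - best_known_cost) / abs(best_known_cost)


def report(runs, best_known_cost, feasibility_threshold=0.95):
    # Keep only runs whose constraint-satisfaction rate meets the threshold.
    feasible = [r for r in runs if r["feasibility"] >= feasibility_threshold]
    if not feasible:
        return {"feasible_runs": 0}
    gaps = [optimisation_gap(r["cost"], best_known_cost) for r in feasible]
    return {
        "feasible_runs": len(feasible),
        "best_gap": min(gaps),
        "median_gap": sorted(gaps)[len(gaps) // 2],
    }


runs = [
    {"cost": 1020.0, "feasibility": 0.99},   # hybrid quantum-classical run
    {"cost": 1005.0, "feasibility": 0.97},
    {"cost":  990.0, "feasibility": 0.80},   # infeasible: excluded from report
]
print(report(runs, best_known_cost=1000.0))
```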
Finance
Canonical kernels are risk aggregation (including VaR and CVaR), anomaly and fraud detection, and pricing of path‑dependent derivatives. Near‑term approaches embed quantum subroutines inside classical pipelines: amplitude‑estimation variants for Monte Carlo components, quantum optimisation primitives for portfolio allocation, and sampling‑based routines for rare‑event analysis. Compilers must handle arithmetic intensity and memory access patterns, expose resource and energy models, and support audit trails appropriate for regulated environments. Acceptance criteria are framed in task‑level metrics, include model‑risk considerations, and are always evaluated against tuned classical baselines running on modern accelerators. Migration to quantum‑safe cryptography on the classical perimeter is a gating requirement for integration with financial infrastructure.
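The asymptotic argument behind the amplitude-estimation kernels listed above is a quadratic reduction in sample complexity: classical Monte Carlo error shrinks roughly as one over the square root of the number of samples, whereas amplitude estimation shrinks roughly as one over the number of oracle queries. The sketch below compares the two scalings; constant factors, circuit depth, and error-correction overheads are deliberately ignored here and dominate in practice.

```python
"""Sample-complexity comparison for Monte Carlo pricing (illustrative).

Classical Monte Carlo error scales as ~1/sqrt(N) samples; amplitude-estimation
variants scale as ~1/N oracle queries. Constants are placeholders.
"""
import math


def classical_samples(target_error, variance=1.0):
    # Smallest N such that sqrt(variance / N) <= target_error.
    return math.ceil(variance / target_error ** 2)


def amplitude_estimation_queries(target_error, constant=math.pi):
    # Queries scale as ~constant / target_error (constant is a placeholder).
    return math.ceil(constant / target_error)


for eps in (1e-2, 1e-3, 1e-4):
    print(f"error {eps:g}: classical ~{classical_samples(eps):,} samples, "
          f"AE ~{amplitude_estimation_queries(eps):,} queries")
```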
Cybersecurity and Public Digital Infrastructure
Security evolves along two coordinated axes. First, the classical perimeter must transition to quantum‑safe standards. This is a long‑horizon systems engineering task that needs parameter‑selection tools, reference implementations integrated with mainstream libraries, and crypto‑agile architectures that allow algorithms and parameters to be updated without service disruption. Sector pilots should exercise end‑to‑end upgrade paths in representative infrastructures such as identity management, secure messaging, data‑at‑rest, and key management for cloud services. Second, quantum‑native capabilities are emerging. Quantum key distribution and quantum random‑number generation can provide security properties that are classically unattainable, but broader uptake depends on certification and interoperable standards; software deliverables include conformance tests and deployment profiles. Protocols for verifiable delegation, certified deletion, proofs of quantumness, and task‑specific privacy are candidates for early networked demonstrations, with acceptance criteria that include explicit threat models, quantified soundness and completeness, and independently auditable logs. As quantum computing becomes distributed, the network operating layer must enforce authenticity and privacy across heterogeneous nodes, schedule tasks with explicit latency and bandwidth constraints, and record evidence required for compliance. Co‑design of computation and communication ensures that resource estimation and verification account for both local processing and network costs. Progress requires close collaboration between cryptographers, quantum information theorists, systems engineers, and standards bodies, with strong connections between post‑quantum cryptography and quantum‑communication communities.
Manufacturing and Supply Chains
Routing and scheduling under uncertainty, in‑line quality control, and secure traceability are core challenges. Quantum‑accelerated optimisation and learning routines are integrated into digital‑twin workflows where classical and quantum resources are orchestrated under explicit latency and energy budgets. Runtime systems broker resources across vendors and modalities while preserving auditability. Verification ensures comparability of results, robustness to data‑shift, and reproducibility across testbeds. Acceptance criteria include throughput improvements, cycle‑time reductions, and scrap‑rate reductions under fixed reliability thresholds, with cryptographic signing and quantum‑safe pipelines used for provenance in regulated environments.
Science and Engineering
Quantum simulation remains a leading application area. The immediate goal is to model quantum‑mechanical systems that challenge classical methods across chemistry [17], materials, condensed-matter physics [18], and high‑energy physics [19], and to do so at realistic noise and scale [20]. Priority kernels include many‑body dynamics, lattice models, spectroscopy, and compact electronic‑structure tasks that admit hybrid execution. Implementations combine error‑aware compilation with explicit classical pre‑ and post‑processing and hybridisation with tensor networks, dynamical mean‑field theory, and other advanced solvers. Workflows must state data movement, memory, and latency costs, and expose interfaces that allow quantum subroutines to be used as libraries inside existing HPC codes. Suitability for near‑term and early fault‑tolerant platforms is assessed before execution through transparent reporting of qubits, depth, logical error rates where relevant, wall‑clock time, and energy budgets. Verification captures both physical accuracy and computational performance. Benchmarks compare spectral lines, correlation functions, and thermodynamic quantities to trusted references and are replicated across testbeds with acceptance thresholds defined in domain units and explicit uncertainty budgets. As devices mature, simulation methods based on linear‑algebra primitives, QSVT, and amplitude estimation provide routes to higher precision; these should be exercised on community‑curated problem sets with open resource estimates and artefacts.
AI and Data Ecosystems
Quantum machine learning sits at the intersection of artificial intelligence and quantum computing [21,22]. Algorithmic work indicates potential acceleration in linear‑algebra subroutines and new families of quantum-native approaches to recommendation, classification, regression, and generative modelling. The sector goal is to turn such ideas into verified end‑to‑end workloads that operate under realistic constraints of data movement, memory, latency, and energy. Practical progress targets hybrid models that embed quantum subroutines inside classical training and inference loops, for example quantum‑enhanced sampling, kernel methods, and structured stochastic components in generative models. Development prioritises problems where input data is low dimensional but complex, compressible, generated in situ by physical models, or prepared by feature pipelines that keep I/O costs explicit. All claims are evaluated against strong classical baselines on modern accelerators with acceptance criteria framed in task‑level metrics and reported at fixed energy and latency budgets. Where large quantum memories are unavailable, emphasis is on regimes that operate on compact encodings, synthetic data, or features produced by classical preprocessing. As systems mature, quantum methods for linear systems, gradient estimation, and amplitude estimation will support learning pipelines. To ensure utility in regulated environments, these methods must integrate uncertainty quantification, interpretability, and privacy maintenance as core principles. Software deliverables for this sector must include compiler-generated instruction counts, logical error rates, and verification hooks to ensure that quantum-enhanced AI remains transparent, safe, and verifiable.
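As a toy illustration of the hybrid loops described above, the sketch below embeds a classically simulated single-qubit fidelity kernel inside a classical kernel ridge regression. The feature map, data, and ridge parameter are hypothetical; on hardware the kernel entries would be estimated on a QPU, and the whole pipeline would be compared against a tuned classical baseline at fixed energy and latency budgets.

```python
"""Toy hybrid quantum-kernel pipeline (illustrative sketch).

Embeds a (classically simulated) single-qubit feature map inside a classical
kernel ridge regression. The feature map and data are hypothetical.
"""
import numpy as np


def feature_state(x):
    # Single-qubit feature map: |phi(x)> = RY(x)|0>.
    return np.array([np.cos(x / 2), np.sin(x / 2)])


def quantum_kernel(a, b):
    # Fidelity kernel |<phi(a)|phi(b)>|^2, estimated on hardware in practice.
    return abs(feature_state(a) @ feature_state(b)) ** 2


rng = np.random.default_rng(0)
x_train = rng.uniform(0, np.pi, 20)
y_train = np.sin(x_train)                     # toy regression target

# Classical training loop around the quantum kernel (kernel ridge regression).
K = np.array([[quantum_kernel(a, b) for b in x_train] for a in x_train])
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(x_train)), y_train)

x_test = 1.0
k_test = np.array([quantum_kernel(x_test, b) for b in x_train])
print("prediction:", k_test @ alpha, "target:", np.sin(x_test))
```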
Milestones and Measurable Outcomes
- Within 1 to 3 years: publish IR specifications with conformance tests, release verification inspired benchmark suites with open artefacts, operate reference hybrid schedulers with digital twin style test rigs in HPC facilities, and provide quantum safe reference implementations.
- Within 3 to 6 years: support early fault-tolerant subsystems with logical level compilation and decoder co-design validated by program level protocols, make production HPC integration routine with device agnostic partitioning, and report sector pilots with reproducible gains under certified conditions.
- Within 6 to 10 years: deliver end-to-end fault-tolerant workflows for at least two sector kernels with distributed execution across heterogeneous nodes.
Throughout the decade, capability certificates and open artefacts turn isolated demonstrations into cumulative and comparable progress.
Policy and Funding Actions
Treat software as strategic infrastructure. Fund the five capabilities with explicit verification deliverables and open artefacts. Support standardisation for language and IR, runtime APIs, benchmark formats, and quantum safe conformance. Use procurement to require verification-ready outputs and cross-platform comparability. Invest in hybrid infrastructure that couples HPC facilities with quantum testbeds under common orchestration [23]. Sustain programmes that maintain community tools and datasets. Sector pilots should be small and auditable with classical baselines and clear acceptance thresholds.
Implementation and Maintenance
QSA will maintain this Agenda as a living document. A General Assembly elects a geographically balanced Steering Committee that oversees topic area task forces. Annual updates are ratified in open meetings. Artefacts such as benchmarks, IR tests, and datasets are curated in public repositories. QSA will engage with governments and international organisations to align efforts and will provide expert guidance on priorities that support long-term impact.
References
[1] Quantum Flagship. Strategic Research and Industry Agenda (SRIA 2030): Roadmap and Quantum Ambitions over this Decade. Technical report, European Commission, Feb. 2024. https://qt.eu/about-us/strategic-research-agenda-sria. The official roadmap for the EU’s quantum technology strategy, detailing the transition to industrial scale software.
[2] J. Eisert and J. Preskill. Mind the gaps: the fraught road to quantum advantage, 2025. arXiv: 2510.19928 [quant-ph]. https://arxiv.org/abs/2510.19928.
[3] R. Babbush, R. King, S. Boixo, W. Huggins, T. Khattar, G. H. Low, J. R. McClean, T. O’Brien, and N. C. Rubin. The grand challenge of quantum applications, 2025. arXiv: 2511.09124 [quant-ph]. https://arxiv.org/abs/2511.09124.
[4] European Quantum Industry Consortium (QuIC). Strategic Industry Roadmap 2025: A Shared Vision for Europe’s Quantum Future. Technical report, QuIC, Apr. 2025. https://www.euroquic.org/strategic-industry-roadmap-sir. The industrial roadmap focusing on the software supply chain and hybrid quantum-HPC integration.
[5] A. D. Carleton, M. Klein, E. Harper, et al. Architecting the Future of Software Engineering: A National Agenda for Software Engineering Research & Development. Technical report, Carnegie Mellon University Software Engineering Institute (SEI), Nov. 2021. https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=741193. Establishes quantum software as a formal engineering discipline with specific architectural paradigms.
[6] F. J. Cardama, J. Vázquez-Pérez, T. F. Pena, et al. Quantum compilation process: a survey. The Journal of Supercomputing, 2024. https://link.springer.com/article/10.1007/s11227-024-06123-x. Detailed survey of the compilation stack from high-level languages to pulse control (Capability C2).
[7] M. Paltenghi and M. Pradel. A survey on testing and analysis of quantum software. arXiv preprint arXiv:2410.00650, Oct. 2024. https://arxiv.org/abs/2410.00650. A comprehensive review of verification and testing methods for Capability C3.
[8] A. de Marti i Olius, P. Fuentes, R. Orús, P. M. Crespo, and J. Etxezarreta Martinez. Decoding algorithms for surface codes. Quantum, 8:1498, Oct. 2024. https://quantum-journal.org/papers/q-2024-10-10-1498. State-of-the-art review of decoding algorithms relevant to Capability C4.
[9] V. Bartsch, G. Colin de Verdière, J.-P. Nominé, et al. Quantum for HPC: White Paper. Technical report, ETP4HPC (European Technology Platform for High-Performance Computing), 2021. https://www.etp4hpc.eu/white-papers.html. Foundational reference for the orchestration of hybrid Quantum-HPC systems.
[10] D. Barral, F. J. Cardama, G. Díaz, et al. Review of distributed quantum computing: from single QPU to high performance quantum computing. arXiv preprint arXiv:2404.01265, Apr. 2024. https://arxiv.org/abs/2404.01265. The definitive survey for Capability C5. It comprehensively reviews the entanglement distribution, blind delegation, and network layers described in this Agenda.
[11] Secretariat of Science, Technology and Innovation Policy. Strategy of Quantum Future Industry Development. Technical report, Cabinet Office, Government of Japan, Apr. 2023. https://www8.cao.go.jp/cstp/english/quantum/quantum_index.html. Japan’s national strategy for integrating quantum software into the ’Society 5.0’ industrial initiative.
[12] National Institute of Standards and Technology (NIST). FIPS 203: Module-Lattice-Based Key-Encapsulation Mechanism Standard. Technical report, U.S. Department of Commerce, Aug. 2024. https://csrc.nist.gov/pubs/fips/203/final. The finalized Post-Quantum Cryptography standard, essential for the cybersecurity sector roadmap.
[13] IEEE Standards Association. International Roadmap for Devices and Systems (IRDS) 2024 Edition. Technical report, IEEE, 2024. https://irds.ieee.org. Global industry standards for quantum computing metrics and volumetric benchmarking.
[14] CSIRO Futures. Growing Australia’s Quantum Technology Industry (Updated Economic Modelling). Technical report, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Oct. 2022. https://www.csiro.au/en/work-with-us/services/consultancy-strategic-advice-services/csiro-futures/futures-reports/quantum. Provides economic benchmarks for the value of quantum software, control engineering, and error correction.
[15] A. M. Dalzell, S. McArdle, M. Berta, P. Bienias, C.-F. Chen, A. Gilyén, et al. Quantum algorithms: a survey of applications and end-to-end complexities. arXiv preprint arXiv:2310.03011, Oct. 2023. https://arxiv.org/abs/2310.03011. The ’Gold Standard’ reference for the Applications and Sector Roadmaps: it provides the rigorous end-to-end resource estimation for chemistry, finance, and materials called for there.
[16] S. E. Herman et al. Quantum computing for chemical and materials sciences. Nature Reviews Chemistry, 7:692–709, 2023. https://www.nature.com/articles/s41570-023-00517-9. Supports the Health and Life Sciences and Materials roadmaps. It explicitly discusses the transition from VQE to the ’early fault-tolerant’ embedding methods prioritised in this Agenda.
[17] S. McArdle, S. Endo, A. Aspuru-Guzik, S. C. Benjamin, and X. Yuan. Quantum computational chemistry. Rev. Mod. Phys., 92:015003, 1, Mar. 2020. doi: 10.1103/RevModPhys.92.015003. https://link.aps.org/doi/10.1103/RevModPhys.92.015003.
[18] Y. Alexeev, M. Amsler, M. A. Barroca, S. Bassini, T. Battelle, D. Camps, D. Casanova, Y. J. Choi, F. T. Chong, C. Chung, C. Codella, A. D. Córcoles, J. Cruise, A. Di Meglio, I. Duran, T. Eckl, S. Economou, S. Eidenbenz, B. Elmegreen, C. Fare, I. Faro, C. S. Fernández, R. N. B. Ferreira, K. Fuji, B. Fuller, L. Gagliardi, G. Galli, J. R. Glick, I. Gobbi, P. Gokhale, S. de la Puente Gonzalez, J. Greiner, B. Gropp, M. Grossi, E. Gull, B. Healy, M. R. Hermes, B. Huang, T. S. Humble, N. Ito, A. F. Izmaylov, A. Javadi Abhari, D. Jennewein, S. Jha, L. Jiang, B. Jones, W. A. de Jong, P. Jurcevic, W. Kirby, S. Kister, M. Kitagawa, J. Klassen, K. Klymko, K. Koh, M. Kondo, D. M. Kürkçüoglu, K. Kurowski, T. Laino, R. Landfield, M. Leininger, V. Leyton-Ortega, A. Li, M. Lin, J. Liu, N. Lorente, A. Luckow, S. Martiel, F. Martin-Fernandez, M. Martonosi, C. Marvinney, A. C. Medina, D. Merten, A. Mezzacapo, K. Michielsen, A. Mitra, T. Mittal, K. Moon, J. Moore, S. Mostame, M. Motta, Y.-H. Na, Y. Nam, P. Narang, Y.-y. Ohnishi, D. Ottaviani, M. Otten, S. Pakin, V. R. Pascuzzi, E. Pednault, T. Piontek, J. Pitera, P. Rall, G. S. Ravi, N. Robertson, M. A. Rossi, P. Rydlichowski, H. Ryu, G. Samsonidze, M. Sato, N. Saurabh, V. Sharma, K. Sharma, S. Shin, G. Slessman, M. Steiner, I. Sitdikov, I.-S. Suh, E. D. Switzer, W. Tang, J. Thompson, S. Todo, M. C. Tran, D. Trenev, C. Trott, H.-H. Tseng, N. M. Tubman, E. Tureci, D. G. Valiñas, S. Vallecorsa, C. Wever, K. Wojciechowski, X. Wu, S. Yoo, N. Yoshioka, V. W.-z. Yu, S. Yunoki, S. Zhuk, and D. Zubarev. Quantum-centric supercomputing for materials science: a perspective on challenges and future directions. Future Generation Computer Systems, 160:666–710, 2024. https://doi.org/10.1016/j.future.2024.04.060.
[19] A. Di Meglio, K. Jansen, I. Tavernelli, C. Alexandrou, S. Arunachalam, C. W. Bauer, K. Borras, S. Carrazza, A. Crippa, V. Croft, R. de Putter, A. Delgado, V. Dunjko, D. J. Egger, E. Fernández-Combarro, E. Fuchs, L. Funcke, D. González-Cuadra, M. Grossi, J. C. Halimeh, Z. Holmes, S. Kühn, D. Lacroix, R. Lewis, D. Lucchesi, M. L. Martinez, F. Meloni, A. Mezzacapo, S. Montangero, L. Nagano, V. R. Pascuzzi, V. Radescu, E. R. Ortega, A. Roggero, J. Schuhmacher, J. Seixas, P. Silvi, P. Spentzouris, F. Tacchino, K. Temme, K. Terashi, J. Tura, C. Tüysüz, S. Vallecorsa, U.-J. Wiese, S. Yoo, and J. Zhang. Quantum computing for high-energy physics: state of the art and challenges. PRX Quantum, 5:037001, 3, Aug. 2024. doi: 10.1103/PRXQuantum.5.037001. https://link.aps.org/doi/10.1103/PRXQuantum.5.037001.
[20] Office of Science, U.S. Department of Energy. Basic Research Needs in Quantum Computing and Networking. Technical report, Advanced Scientific Computing Research (ASCR), July 2023. https://www.osti.gov/biblio/2001045. Defines priority research directions for the US software stack, including middleware and networked quantum computing.
[21] G. Acampora, A. Ambainis, N. Ares, L. Banchi, P. Bhardwaj, D. Binosi, G. A. D. Briggs, T. Calarco, V. Dunjko, J. Eisert, O. Ezratty, P. Erker, F. Fedele, E. Gil-Fuster, M. Gärttner, M. Granath, M. Heyl, I. Kerenidis, M. Klusch, A. F. Kockum, R. Kueng, M. Krenn, J. Lässig, A. Macaluso, S. Maniscalco, F. Marquardt, K. Michielsen, G. Muñoz-Gil, D. Müssig, H. P. Nautrup, S. A. Neubauer, E. van Nieuwenburg, R. Orus, J. Schmiedmayer, M. Schmitt, P. Slusallek, F. Vicentini, C. Weitenberg, and F. K. Wilhelm. Quantum computing and artificial intelligence: status and perspectives, 2025. arXiv: 2505.23860 [quant-ph]. https://arxiv.org/abs/2505.23860.
[22] Quantum Economic Development Consortium (QED-C). Quantum Computing and Artificial Intelligence Use Cases. Technical report, QED-C, Mar. 2025. https://quantumconsortium.org/reports. Explores the convergence of AI and Quantum (QC-for-AI and AI-for-QC) for hybrid applications.
[23] Ministry of Science and ICT (MSIT). Quantum Science and Technology Strategy of Korea. Technical report, Government of the Republic of Korea, June 2023. https://www.msit.go.kr/eng/bbs/view.do?sCode=eng&mId=4&mPid=2&pageIndex=&bbsSeqNo=42&nttSeqNo=837. South Korea’s strategy targeting a 1,000-qubit system and a supporting software ecosystem by the early 2030s.