《1 Introduction》

1 Introduction

In recent years, developed countries and high-tech companies have paid significant attention to quantum computation, established long-term development plans, and invested substantial resources to boost the development of the technology. Owing to a series of significant advances in the field, quantum computation has begun to receive attention not only from academia but also from the general public. Quantum computation has become a diversified technical field involving cutting-edge fundamental research (e.g., in mathematics and physics), integration with many engineering disciplines, and highly engineered development of applied technologies, and its momentum of rapid development continues unabated. It is therefore extremely difficult to systematically explore the entire field of quantum computation, and nearly impossible to obtain a deep understanding of all its aspects. Based on the key points in the history of significant developments in quantum computation, this study comprehensively examines and summarizes the historical and current status of the corresponding developments. Moreover, future developments in the field are discussed and forecast, which is valuable and indeed necessary for better promoting the development of quantum computation in China.

Quantum computation, and the broader field of quantum information, represents an assembly of concepts and techniques concerning the nature and processing of information based on quantum mechanics. The formation of the ideas and concepts of quantum computation took a long time [1]. Physicists and mathematicians, driven by different motivations, conducted fundamental research that played a critical role in promoting the development of quantum computation. At least in the early stages of its development, the basic logic of this field was not closely tied to the obsolescence of Moore’s law or to improvements in computing power. Understanding this point enables us to view the history of quantum computation correctly and provides a deeper perspective on how to promote its future development. Richard Phillips Feynman, a theoretical physicist, and Yuri Ivanovich Manin, a mathematician, pointed out that classical computation cannot efficiently simulate a quantum system owing to quantum superposition and entanglement. Feynman further proposed that a controllable quantum computer could be used to efficiently simulate the quantum system under study. In 1994, the mathematician Peter Williston Shor proposed Shor’s quantum algorithm for the prime factorization of large numbers [2], the first quantum algorithm with a specific purpose and outstanding application value, posing a threat to one of the most widely used public-key cryptography schemes (the RSA public-key cryptosystem).

The early technical routes of quantum computation included nuclear magnetic resonance (NMR), superconducting quantum circuits, semiconductor quantum dots, trapped ions, and cold atoms. Researchers successfully implemented quantum bits (qubits) and their precise manipulation on these platforms. On some mature platforms (e.g., NMR), researchers even demonstrated small-scale quantum algorithms. However, many doubts remained in academia regarding the feasibility of quantum computation, especially whether the loss of quantum information caused by quantum decoherence could be overcome. A milestone in the development of quantum computation was the establishment of quantum error correction (QEC). Shor and Steane independently proposed the concept of QEC, whose basic principle is somewhat similar to that of classical error correction: both are based on redundant encoding. Based on QEC codes, Shor proposed a framework in which fault-tolerant quantum computations could be built in a noisy quantum system. In 1997, Kitaev found a relationship between QEC and topological states and pointed out that topologically protected degenerate states could be used as logical bits. The surface code proposed by Kitaev and Bravyi became a topologically protected qubit model. In 2001, Kitaev, Preskill, and others further pointed out that if the so-called fault-tolerant threshold, approximately 1%, could be reached, the surface code could be used to implement QEC, and the surface code thus became the mainstream approach to QEC. This series of efforts laid the theoretical foundation for the implementation of fault-tolerant quantum computation [1].
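The redundancy-encoding principle behind QEC can be illustrated with the simplest example, the three-qubit bit-flip code, simulated classically in plain Python (an illustrative sketch; the function names are ours, not from any quantum library):

```python
import random

# Statevector over 3 qubits: dict mapping basis string -> complex amplitude.
def encode(alpha, beta):
    # Logical encoding by redundancy: a|0> + b|1>  ->  a|000> + b|111>
    return {"000": alpha, "111": beta}

def apply_x(state, qubit):
    # Bit-flip (X) error on one qubit: move each amplitude to the flipped basis state.
    out = {}
    for basis, amp in state.items():
        flipped = list(basis)
        flipped[qubit] = "1" if basis[qubit] == "0" else "0"
        out["".join(flipped)] = amp
    return out

def syndrome(state):
    # Parities Z0Z1 and Z1Z2 are identical for every basis state in the
    # support after a single X error, so read them from any one of them.
    basis = next(iter(state))
    return (int(basis[0]) ^ int(basis[1]), int(basis[1]) ^ int(basis[2]))

def correct(state):
    # Syndrome -> which qubit to flip back (None means no error detected).
    lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    q = lookup[syndrome(state)]
    return state if q is None else apply_x(state, q)

# Encode, inject a random single bit-flip, then correct.
logical = encode(0.6, 0.8)
noisy = apply_x(logical, random.randrange(3))
recovered = correct(noisy)
assert recovered == logical
```

The syndrome identifies which qubit was flipped without measuring (and thus destroying) the encoded superposition, which is the sense in which QEC mirrors classical redundancy while respecting quantum mechanics.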

More than 40 years have passed since Feynman and others proposed the original idea of quantum computation, and the progress since then has been remarkable. However, it is undeniable that there is still no practical application of quantum computation. Regarding applications, it is generally believed in academia that quantum computation has its own limitations and cannot completely substitute for classical computation. In principle, classical computation can perform everything that can be achieved using quantum computation; the main difference between them is efficiency. It is reasonable to estimate that we will remain in the era of “noisy intermediate-scale quantum (NISQ)” computing for a very long period before universal fault-tolerant quantum computation is implemented [1,3]. In the NISQ era, quantum computation will mainly serve as a platform for fundamental R&D of more advanced quantum control techniques. This outlook may be far from the expectation of practical quantum computation held by many people, but it still has distinct significance. Consequently, in terms of quantum computation as currently understood, quantum simulation remains its main use. It can reasonably be speculated that quantum computation may find important applications in areas such as pharmaceutical development, new materials, and agriculture. However, we should be clear that these are still distant possibilities rather than technologies that have been realized or are about to be realized.

In summary, a clear view of the overall situation is required for the development of quantum computation: while the prospects are bright, there are real difficulties. On the one hand, an optimistic attitude should be maintained. For a scientific theory that changes our worldview at a fundamental level, its technical applications are bound to be revolutionary. On the other hand, the more groundbreaking a technology is, the more difficult it is to form specific expectations before putting it into practical use. There is a huge difference between quantum computation and existing information technologies; therefore, it is considerably difficult to predict the long-term impact of quantum computation technology. Progress in the field will not be a simple linear process, and linear extrapolation from existing technologies and progress should be avoided as much as possible when predicting long-term development.

This study consists of 10 sections. Section 1 briefly describes the fundamental concepts, origin of ideas, history, status, and trends of quantum computation. Section 2 discusses the theories, algorithms, and applications of quantum computation. Section 3 presents the software and control system architectures for quantum computation. Sections 4–8 discuss five representative technical routes of quantum computation (i.e., superconducting quantum computation, distributed superconducting quantum computation, photonic quantum computation, trapped-ion quantum computation, and silicon-based quantum computation); distributed superconducting quantum computation, a booming field, may be critical to the implementation of large-scale quantum computation in the future. Section 9 presents other technical routes of quantum computation (e.g., neutral-atom quantum computation, nitrogen-vacancy (NV) centers in diamond, NMR quantum computation, quantum computation with spin waves, and topological quantum computation). The last section offers suggestions for the development of quantum computation in China.

《2 Theories, algorithms, and applications of quantum computation》

2 Theories, algorithms, and applications of quantum computation

《2.1 Theories and algorithms》

2.1 Theories and algorithms

The development of quantum algorithms over the last 40 years can be roughly divided into four stages: (1) from 1985 to 1992, toy algorithms were developed to demonstrate the advantages of quantum computation; (2) from 1993 to 1994, algorithms with various advantages were developed to demonstrate their applicability to real-world problems; (3) since 1995, quantum algorithms have been developed to expand the application range of quantum computation; and (4) since 2013, quantum algorithms targeting NISQ systems have been developed to accelerate the application process.

Shortly after Feynman proposed simulating one quantum system with another, Deutsch mathematically expressed this idea in 1985 as a quantum Turing machine (QTM) [4] and proposed the first quantum algorithm, demonstrating the advantage of quantum over classical computation on a simple decision problem. Computers equivalent to the QTM are also known as quantum computers. Subsequently, Deutsch proposed a quantum circuit model that was more feasible in terms of physical implementation [5]. In 1993, Yao proved the equivalence of the circuit model to the QTM, and the circuit model became the standard model for building a universal quantum computer [6]. Alongside a quantum algorithm that finds a hidden binary string by calling the given oracle only once [7], Bernstein and Vazirani established quantum complexity theory, which proved that quantum computers can, in theory, be more efficient than classical computers in solving certain problems.
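The single-query advantage established by Bernstein and Vazirani can be made concrete with a small statevector simulation in plain Python (an illustrative sketch, not code from the original works; the function name is our own):

```python
from math import sqrt

def bernstein_vazirani(s, n):
    """Recover the hidden n-bit string s with a single oracle query."""
    N = 2 ** n
    # H^n on |0...0>: uniform superposition over all bitstrings.
    amp = [1 / sqrt(N)] * N
    # One oracle query as phase kickback: amp[x] *= (-1)^(s.x mod 2)
    amp = [a * (-1) ** bin(s & x).count("1") for x, a in enumerate(amp)]
    # Final H^n: out[y] = (1/sqrt(N)) * sum_x (-1)^(x.y) amp[x]
    out = [sum((-1) ** bin(x & y).count("1") * amp[x] for x in range(N)) / sqrt(N)
           for y in range(N)]
    # All amplitude interferes constructively on |s>; read it out.
    return max(range(N), key=lambda y: abs(out[y]))

print(bernstein_vazirani(0b1011, 4))  # -> 11, the hidden string
```

A classical algorithm must query the oracle n times (once per bit), whereas the interference pattern above extracts all n bits from one query.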

In 1994, Shor proposed the quantum Fourier transform (QFT) algorithm and a discrete logarithm algorithm, giving birth to the factorization algorithm [2]. Shor’s factorization was the first quantum algorithm with real-world applications and an exponential speedup over the best known classical algorithms, and it significantly stimulated the interest of academia in quantum computing. Kitaev et al. proposed a phase-estimation algorithm in 1995, which later became a key component of many quantum algorithms. In 1996, Grover proposed a search algorithm over an unordered database that obtains a square-root speedup [8]. In the same year, Lloyd, building on the work of Suzuki, Trotter, and others, proposed an algorithm for simulating local Hamiltonians [9] and established the basis for quantum chemistry simulation. In 2009, Harrow, Hassidim, and Lloyd proposed the HHL algorithm, which applies phase estimation to solving linear systems [10]. In 2011, Panella and Martinelli proposed quantum neural networks (QNNs) [11]. The HHL algorithm and QNNs point the way for the application of quantum computation in artificial intelligence (especially machine learning). According to Google, quantum machine learning is a key candidate for achieving quantum advantage in short-term real-world applications [12]. In 2013, a research team at Harvard University proposed the variational quantum eigensolver (VQE) algorithm [13], which enables quantum chemistry simulation on NISQ machines. In 2016, Farhi et al. proposed the quantum approximate optimization algorithm (QAOA) based on the VQE algorithm [14], which is used to solve combinatorial optimization problems.
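Grover’s square-root speedup is easy to visualize with a direct classical simulation of the oracle and diffusion steps (a plain-Python sketch for illustration only):

```python
from math import floor, pi, sqrt

def grover(n_items, marked):
    """Classically simulate Grover search over an unstructured list."""
    amp = [1 / sqrt(n_items)] * n_items          # uniform superposition
    iterations = floor(pi / 4 * sqrt(n_items))   # ~ (pi/4) * sqrt(N) rounds
    for _ in range(iterations):
        amp[marked] *= -1                        # oracle: flip marked phase
        mean = sum(amp) / n_items                # diffusion: inversion
        amp = [2 * mean - a for a in amp]        #   about the mean
    return amp

amp = grover(64, marked=17)
print(abs(amp[17]) ** 2)  # success probability close to 1
```

Each iteration rotates the state slightly toward the marked item, so only on the order of sqrt(N) oracle calls are needed, versus N/2 on average classically.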

The aforementioned algorithms significantly broaden the applications of quantum computers and demonstrate the potential of quantum computers in fields such as searching, quantum chemistry simulation, artificial intelligence, and machine learning. Devising algorithms that can be executed by NISQ processors to solve real-world problems is the main issue in quantum computing research.

《2.2 Potential application》

2.2 Potential application

Quantum chemistry simulation is an important potential application of quantum computation. The resources required to perform quantum chemistry simulations on classical computers grow exponentially with problem size. To address this issue, researchers have attempted to design more efficient quantum algorithms for quantum chemistry simulation (e.g., quantum phase estimation and the VQE algorithm are used to compute molecular ground states and their energies). The VQE algorithm requires relatively few resources and has a certain degree of noise immunity; thus, it will play a vital role in the NISQ era. Current research on quantum chemistry simulation focuses on simulating ever-larger molecular systems: molecules such as beryllium hydride and water have been simulated on actual quantum hardware, and the simulation of ethylene, hydrogen cyanide, and other molecules on classical simulators is a main objective. These achievements demonstrate the universality and feasibility of the VQE algorithm, but there is still a long way to go before quantum advantage is realized. As the hardware performance of quantum computation increases, algorithms must be continually improved (e.g., designing better variational ansatzes and using more suitable parameterization and optimization methods) to develop quantum chemistry simulation algorithms of practical value.
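The VQE workflow, a parameterized ansatz whose energy expectation is minimized by a classical optimizer, can be sketched for a toy one-qubit Hamiltonian H = X + Z (the Hamiltonian, ansatz, and scan-based optimizer are our illustrative choices; a real VQE evaluates the expectation on quantum hardware):

```python
from math import cos, sin, pi

# Toy one-qubit Hamiltonian H = X + Z; exact ground-state energy is -sqrt(2).
# Ansatz |psi(t)> = Ry(t)|0> = cos(t/2)|0> + sin(t/2)|1>, for which
# <Z> = cos(t) and <X> = sin(t), so the energy landscape is:
def energy(theta):
    return sin(theta) + cos(theta)

# Classical optimization loop (here simply a dense scan over the parameter).
thetas = [2 * pi * k / 1000 for k in range(1000)]
best = min(thetas, key=energy)
print(round(energy(best), 3))  # -> -1.414, i.e., close to -sqrt(2)
```

The quantum processor’s only job in VQE is to prepare the ansatz state and estimate expectation values; the shallow circuits this requires are what make the method suited to NISQ hardware.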

As a shallow variational algorithm, QAOA is used for the approximate solution of combinatorial optimization problems and is suitable for implementation on NISQ-era quantum hardware. As originally proposed, the fundamental idea of QAOA is a discretization of the quantum adiabatic evolution algorithm. Connections between QAOA and quantum walks were subsequently established, updating the understanding of QAOA. In addition, QAOA has been extended beyond combinatorial optimization, for example, to solving systems of linear equations and to building variational quantum search algorithms. Researchers have proposed a series of approaches to accelerate the classical optimization loop of QAOA, which fall into two categories: heuristic initialization and machine-learning-assisted optimization. Further research is needed to determine whether QAOA offers a potential quantum advantage and whether it is trainable.
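A minimal p = 1 QAOA instance for MaxCut on a triangle graph, simulated classically, shows the alternating cost and mixer layers and the classical parameter search (an illustrative sketch; the graph, grid resolution, and names are our assumptions):

```python
import cmath
from math import cos, sin, pi, sqrt

edges = [(0, 1), (1, 2), (0, 2)]  # triangle graph; maximum cut value is 2
n = 3
N = 2 ** n

def cut(x):
    # Number of edges cut by bitstring x (bit i = side chosen for vertex i).
    return sum((x >> i & 1) != (x >> j & 1) for i, j in edges)

def qaoa_expectation(gamma, beta):
    amp = [1 / sqrt(N)] * N                 # uniform superposition |+>^n
    # Cost layer: diagonal phases exp(-i * gamma * cut(x)).
    amp = [a * cmath.exp(-1j * gamma * cut(x)) for x, a in enumerate(amp)]
    # Mixer layer: exp(-i * beta * X) applied to every qubit.
    for q in range(n):
        new = list(amp)
        for x in range(N):
            if not x >> q & 1:
                a0, a1 = amp[x], amp[x | 1 << q]
                new[x] = cos(beta) * a0 - 1j * sin(beta) * a1
                new[x | 1 << q] = cos(beta) * a1 - 1j * sin(beta) * a0
        amp = new
    # Expected cut value of the output distribution.
    return sum(abs(a) ** 2 * cut(x) for x, a in enumerate(amp))

# Classical outer loop: a coarse grid search over the two angles.
grid = [pi * k / 32 for k in range(32)]
best = max(qaoa_expectation(g, b) for g in grid for b in grid)
print(best > 1.5)  # True: beats the 1.5 expected cut of random guessing
```

Deeper versions repeat the cost/mixer pair p times with 2p angles, recovering adiabatic evolution in the limit of large p, which is precisely the discretization idea described above.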

Quantum machine learning integrates machine learning and quantum systems, and research proceeds in two directions: the application of classical machine learning to quantum physical systems, and the design and implementation of machine learning algorithms based on QNNs. One challenge of classical machine learning is that both data volume and computation are approaching the limits of the classical computing paradigm, whereas the learning paradigm of quantum systems and quantum algorithms is completely different from that of classical computation. Therefore, implementing traditional machine learning on quantum systems offers the possibility of breaking through classical limits. Relevant research results include quantum principal component analysis, quantum convolutional neural networks, the implementation of deep neural networks in quantum optical systems, and quantum generative adversarial networks. Currently, the main problems are the lack of an effective theoretical explanation of how deep learning works and the need to further explore and exploit the potential of quantum computation.

《3 Quantum software and control architecture》

3 Quantum software and control architecture

Quantum software and control systems bridge quantum algorithms and the physical implementations of quantum systems and constitute a key layer in a quantum computation system. A quantum algorithm specifies the process for solving a particular problem on a quantum computer and must be described in a quantum programming language. After the algorithm is translated and optimized by the quantum compiler, a low-level program that can be executed on hardware, such as quantum assembly, binary executables, and waveform data, is generated. The quantum measurement and control system executes the low-level program and generates the corresponding signals in real time, applying quantum gates and measurements to the qubits. The organization of the quantum measurement and control system and the software-hardware interface is termed the quantum control architecture.

《3.1 Quantum software》

3.1 Quantum software

Studies of quantum programming languages originate from the quantum random access machine and the pseudocode for quantum algorithms proposed by Knill in 1996. Computer scientists have laid the theoretical foundations of quantum programming [15], such as the structure and semantics of quantum programs, formal verification of quantum programs, quantum recursion, uncomputation, and hybrid quantum-classical computation, by studying the characteristics of quantum algorithms and the requirements of their execution. Guided by quantum programming theory, quantum software engineering has developed steadily, and a series of quantum programming languages have been proposed. Almost every quantum language has a corresponding quantum compiler. Each language and its compiler are developed in a coordinated manner and gradually support features such as classical logic synthesis, quantum gate decomposition, automatic inversion, quantum uncomputation, hierarchical description of quantum circuits, hybrid quantum-classical programming, and the ability to represent basic quantum experiments.
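As an example of one such compiler feature, quantum gate decomposition, the following sketch decomposes a single-qubit unitary into the standard Rz–Ry–Rz (ZYZ) form and verifies the reconstruction (illustrative plain Python, not taken from any of the compilers discussed):

```python
import cmath
import math

def zyz_decompose(U):
    """Decompose a 2x2 unitary as exp(i*alpha) * Rz(b) * Ry(g) * Rz(d)."""
    (u00, u01), (u10, u11) = U
    det = u00 * u11 - u01 * u10
    alpha = cmath.phase(det) / 2
    # Strip the global phase so the remainder lies in SU(2).
    p = cmath.exp(-1j * alpha)
    v00, v10 = p * u00, p * u10
    g = 2 * math.atan2(abs(v10), abs(v00))
    # Rz(b) Ry(g) Rz(d) has entry [0][0] = exp(-i(b+d)/2) cos(g/2)
    #                    and entry [1][0] = exp( i(b-d)/2) sin(g/2).
    bpd = -2 * cmath.phase(v00) if abs(v00) > 1e-12 else 0.0
    bmd = 2 * cmath.phase(v10) if abs(v10) > 1e-12 else 0.0
    return alpha, (bpd + bmd) / 2, g, (bpd - bmd) / 2

def rebuild(alpha, b, g, d):
    # Reassemble exp(i*alpha) Rz(b) Ry(g) Rz(d) as a check.
    rz = lambda t: ((cmath.exp(-1j * t / 2), 0), (0, cmath.exp(1j * t / 2)))
    ry = lambda t: ((math.cos(t / 2), -math.sin(t / 2)),
                    (math.sin(t / 2), math.cos(t / 2)))
    def matmul(A, B):
        return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2))
                           for j in range(2)) for i in range(2))
    M = matmul(matmul(rz(b), ry(g)), rz(d))
    ph = cmath.exp(1j * alpha)
    return tuple(tuple(ph * e for e in row) for row in M)

s = 1 / math.sqrt(2)
H = ((s, s), (s, -s))  # Hadamard gate as the test unitary
R = rebuild(*zyz_decompose(H))
err = max(abs(R[i][j] - H[i][j]) for i in range(2) for j in range(2))
print(err < 1e-9)  # True: decomposition reproduces the gate
```

Compilers use decompositions of this kind to lower arbitrary gates onto the small native gate set (typically Rz/Ry rotations plus one entangling gate) that the hardware actually implements.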

Currently, quantum programming languages with relatively well-developed software ecosystems include Qiskit, Q#, PyQuil, and PennyLane. Among quantum compilers, the Qiskit transpiler benefits from the feedback of many developers; it supports not only the basic functions of quantum compilation (e.g., qubit mapping and scheduling and quantum gate decomposition) but also application-specific optimization (e.g., for quantum chemistry simulation). The QCOR language is distinctive in supporting hybrid quantum-classical programming, multiple front ends/back ends, and cross-language conversion. Quantum programming languages developed by Chinese groups include QPanda, Quingo [16], and isQ [17]. In the field of quantum compilers, the use of multi-level intermediate representation (MLIR) as the core infrastructure is in the ascendant, and both the Quingo and isQ compilers are built on MLIR. Attracting more users and developers, providing richer libraries and supporting tools, and building an ecosystem for quantum software are now the key points of competition among the various languages.

《3.2 Quantum control architecture》

3.2 Quantum control architecture

The quantum software-hardware interface aims to provide a comprehensive and flexible programming method to support the execution of quantum applications on quantum measurement and control systems. The organizational design of such systems focuses on how to apply control signals to qubits and receive measurement signals from them in a scalable manner, according to the executable generated by the quantum software, and on how to support real-time feedback when required.

Consider the quantum measurement and control system for superconducting qubits as an example. Existing superconducting quantum measurement and control systems can be roughly classified into two generations. The first generation is primarily composed of electronics that directly generate and receive analog microwave signals; such a system is easy to implement, but its scalability and programmability are limited because real-time feedback is not supported. In 2017, Delft University of Technology proposed a microarchitecture, QuMA [18], which can generate precisely timed control signals in real time and has a flexible programmable feedback capability and better scalability. Quantum measurement and control hardware systems based on custom-designed digital logic (particularly those using an instruction set) are known as the second generation of quantum measurement and control systems. Major international suppliers in this field (e.g., Zurich Instruments, Keysight Technologies, and Quantum Machines) have launched second-generation quantum measurement and control products, and domestic enterprises have started R&D of second-generation systems (e.g., QuAPE) [19].

At present, the main challenge for the quantum control system architecture is to implement programmable feedback control with extremely low feedback latency (hundreds of nanoseconds or less) while ensuring the scalability of the measurement and control system. Quantum software should be tightly coupled to the quantum control architecture, yet the two directions have largely been developed separately, and mismatched capabilities are a real problem. Coordinating the development of quantum software and quantum measurement and control systems to realize a seamless connection is thus another challenge in quantum computer engineering.

《4 Superconducting quantum computation》

4 Superconducting quantum computation

The advantages of superconducting quantum circuits are as follows: the device fabrication process is compatible with traditional semiconductor fabrication techniques, the design of superconducting qubits offers engineers great flexibility, and the control of superconducting qubits is compatible with mature microwave electronics. Superconducting qubits support two computational paradigms: the gate-based model and quantum annealing (QA). We discuss only the gate-based model in this study because there is no theoretical proof of quantum speed-up in the QA scheme.

A superconducting qubit is physically a nonlinear resonant circuit [20] formed by Josephson junctions and other superconducting components, with information encoded in different degrees of freedom, such as electric charge, phase, magnetic flux, or hybrids thereof. One popular type is the Transmon qubit and its variants, which are insensitive to fluctuations in environmental charge and have reproducibly long decoherence times. Other qubit types include the flux qubit, the Fluxonium, and the 0-π qubit.

There are multiple methods to couple superconducting qubits to external circuits for control and readout. Taking the Transmon as an example, state manipulation is implemented by capacitively coupling the qubit to external control circuitry. For readout, the qubit is coupled to a resonator, the frequency of which depends on the qubit state. For a coplanar waveguide transmission-line resonator, the dispersive readout method is used: the frequency of the resonator is measured to discern the qubit state. Controllable qubit–qubit interactions are required to build multi-qubit quantum circuits. In a planar superconducting circuit, qubits are capacitively or inductively coupled to each other; one of the major developments in recent years has been the use of a more scalable tunable-coupling scheme [21]. In addition, three-dimensional resonant cavities can be used to encode quantum information; however, the main difficulty there is realizing flexibly tunable coupling, which poses a challenge to scalability.
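The dispersive readout mentioned above rests on the standard dispersive approximation of the qubit–resonator Hamiltonian (textbook material, stated here in a two-level approximation for context):

```latex
H_{\mathrm{disp}}/\hbar \;\approx\;
  \omega_r\, a^\dagger a
  \;+\; \frac{\omega_q}{2}\,\sigma_z
  \;+\; \chi\, a^\dagger a\, \sigma_z,
\qquad
\chi \approx \frac{g^2}{\Delta},\quad
\Delta = \omega_q - \omega_r ,
```

where g is the qubit–resonator coupling strength and |Δ| ≫ g. The resonator frequency is pulled to ω_r ± χ depending on the qubit state, so probing the resonator near ω_r distinguishes the two qubit states without directly exchanging energy with the qubit.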

In 2019, a research team at Google announced the “Sycamore” superconducting quantum processor [22], containing a grid of 54 qubits and 88 tunable couplers, with the average fidelity of simultaneous two-qubit gates reaching 99.4%. On this processor, they demonstrated quantum supremacy on the random circuit sampling problem for the first time. In 2021, a research team from the University of Science and Technology of China improved on this experiment with a larger scale and deeper circuits on a 66-qubit superconducting quantum processor called “Zu Chongzhi” [23]. Recently, IBM unveiled a 127-qubit processor on its quantum cloud platform.

In addition to the above progress in integrating more qubits, other important performance metrics have also improved significantly. Recently, by replacing aluminum, the mainstream material for superconducting qubits, with tantalum, the decoherence time of planar Transmon qubits was increased to 300 μs [24]. The Beijing Academy of Quantum Information Sciences further improved this result to 500 μs [25]. Moreover, research teams at the Massachusetts Institute of Technology and IBM have made progress in two-qubit gate fidelity, approaching the 99.9% hallmark [26,27]. For the most critical step toward fault-tolerant quantum computation, progress has also been made in QEC (e.g., demonstrations of the surface code, one of the most efficient error-correcting codes suitable for superconducting qubits) [28,29].

Currently, there is intense competition in the field of superconducting quantum computation. Internationally renowned academic institutions include Yale University, the Massachusetts Institute of Technology, Princeton University, the University of Chicago, the University of California, Berkeley, and the National Institute of Standards and Technology in the United States, as well as ETH Zurich, Delft University of Technology, and Chalmers University of Technology in Europe. Many internationally renowned enterprises, such as Google, IBM, and Rigetti Computing, Inc., have been conducting relevant research and developing rapidly. In China, renowned organizations engaged in superconducting quantum computation include the University of Science and Technology of China, Nanjing University, Zhejiang University, Tsinghua University, the Beijing Academy of Quantum Information Sciences, the Institute of Physics of the Chinese Academy of Sciences, the Southern University of Science and Technology, Huawei Technologies Co., Ltd., Alibaba Group, Shenzhen Tencent Computer System Co., Ltd., and Hefei Origin Quantum Computation Technology Co., Ltd.

There are three major challenges for superconducting quantum processors in realizing practically useful computational tasks. (1) The planar structure limits the connectivity between qubits; with only nearest-neighbor couplings, a huge overhead is incurred in executing quantum algorithms. (2) Because of planar constraints, it becomes increasingly difficult to route all the control lines, which must be drawn from the chip edge to the center, to individual qubits without inducing crosstalk. Multilayer integration technologies can ameliorate the problem to some extent, but it remains challenging to solve the wiring and crosstalk problems for thousands or millions of qubits without compromising connectivity. (3) The decoherence time of superconducting qubits must be further improved through the optimization of materials, designs, processes, and testing environments, based on a better understanding of the microscopic origins of decoherence. Under optimistic projections, small-scale quantum applications are expected to be demonstrated within 3–5 years, and the R&D of QEC is making steady progress, laying a foundation for the implementation of fault-tolerant quantum computation in 10 years or more.

《5 Distributed superconducting quantum computation》

5 Distributed superconducting quantum computation

For integrated circuit chips, the level of integration can be improved and the power consumption decreased by shrinking the devices, but similar strategies are not applicable to superconducting quantum chips. This is because shrinking a superconducting qubit decreases its mode volume, and stronger decoherence occurs as the energy density increases and the electric field couples to defects at metal interfaces. Moreover, most of the area of current superconducting quantum chips is occupied by control lines, so the size cannot be reduced to improve the level of integration. By simple estimation, the typical size of a 2D Transmon is 0.5 mm, so tens of thousands of qubits could be integrated on a wafer with a diameter of 100 mm; when control-line fan-out and crosstalk are further considered, the number of qubits that can be integrated falls to several thousand. At present, the integration level of single chips (several hundred qubits) has not reached this limit; in the future, however, it is inevitable that qubits will be distributed across multiple chips and the processor scale expanded in a modular fashion to continue scaling up superconducting quantum processors.
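The back-of-the-envelope estimate above can be reproduced directly (the 0.5 mm qubit pitch and 100 mm wafer diameter are the figures quoted in the text; fan-out and crosstalk are ignored here):

```python
from math import pi

wafer_diameter_mm = 100.0  # wafer diameter from the text
qubit_pitch_mm = 0.5       # typical 2D Transmon footprint from the text

# Ideal packing: wafer area divided by the area of one qubit cell.
wafer_area = pi * (wafer_diameter_mm / 2) ** 2
qubits_per_wafer = wafer_area / qubit_pitch_mm ** 2
print(int(qubits_per_wafer))  # -> 31415, i.e., "tens of thousands"
```

Accounting for control-line fan-out and crosstalk reduces this ideal figure by roughly an order of magnitude, to the several-thousand level stated above.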

Distributed superconducting quantum computation requires reliable connections between chips, and common interconnect channels include NbTi superconducting coaxial cables and aluminum waveguides. Considering the coupling strength and channel length, the available modes of quantum state transfer (QST) between chips are as follows: transfer via one of the standing-wave modes of a multimode channel [30], and transfer by transmitting and receiving microwave flying photons [31]. The former scheme is relatively simple but is limited by the microwave wavelength and therefore cannot establish long-distance connections; some explorations have been made in constructing modular quantum processors over short distances. The latter scheme allows long-distance connections but requires shaping and capturing microwave photons. The relevant technologies have progressed on single chips, and cross-chip QST based on microwave flying photons has been completed. In 2020, Magnard et al. demonstrated QST across refrigeration units [32], and Zhong et al. demonstrated multi-qubit entangled-state transfer across chips [33].

Currently, distributed superconducting quantum computation is attracting increasing attention. In 2020, Google published a quantum computing roadmap and planned to adopt a multi-chip interconnection scheme to build a superconducting quantum processor containing one million qubits. Rigetti Computing, Inc. has also conducted research on integrating multiple chips using flip-chip technology. In 2021, a research team published a multi-chip connection scheme spanning 30 m. Distributed superconducting quantum computation will evolve toward longer channels connecting more qubits.

Distributed superconducting quantum computation also faces technical challenges. (1) The cross-chip channel connection loss is relatively large, arising from the connections between chips and microwave cables, the inserted microwave components, and the channels themselves; therefore, the quality of the connections and channels must be improved and intermediate elements minimized. (2) Regarding the fidelity of QST, the highest reported fidelity of inter-chip state transfer is 91.1% [33] without post-selection, which still falls short of the fidelity of two-qubit gates on single chips (reaching 99%). These challenges hinder the further scaling of distributed superconducting quantum computation, but they also provide an opportunity for latecomers to catch up.

《6 Photonic quantum computation》

6 Photonic quantum computation

The photonic quantum system has the advantages of resistance to decoherence, simple and accurate single-qubit manipulation, and the provision of distributed interfaces, and multiple degrees of freedom of a photon can be used for encoding; it is therefore an important quantum information processing system. Photonic quantum computation includes both specialized and universal quantum computation models. By encoding method, it can also be divided into discrete-variable and continuous-variable models (or a combination of the two). These paths are expected to implement universal quantum computation.

The core hardware of photonic quantum computation consists of a quantum light source, a photonic quantum circuit, and a single-photon detector. A quantum light source is used to prepare a specific initial state; common types include deterministic single-photon sources, squeezed-vacuum-state sources, and entangled-photon-pair sources. Semiconductor quantum dots, like atoms, emit single photons under laser excitation; therefore, they are an important route to a deterministic, scalable single-photon source. A research team at the University of Science and Technology of China first developed pulsed resonance excitation technology for quantum dots in 2013 and developed a polarized, deterministic single-photon source with high purity and high efficiency [34]. In 2018, the team implemented an entangled-photon-pair source with high indistinguishability and high excitation efficiency. In 2019, the team implemented a two-photon entanglement source characterized by high fidelity, efficiency, and indistinguishability. In 2020, researchers first implemented a light source characterized by high purity, high indistinguishability, and a heralding efficiency higher than 90%.

Early photonic quantum computation was based on linear optics in free space [35]. The experimental techniques were mature, and the loss of photons in crystals and free space was considerably low, but the scalability of this scheme was poor. The feasible path toward large-scale extension is to integrate the optical elements, such as quantum light sources, circuits, and detectors, onto a single waveguide chip [36]. This optical-chip scheme is characterized by high stability and good scalability; however, its current efficiency must be further improved. Overall, the relevant research is at an early stage.

For the measurement of photonic qubits, superconducting single-photon detectors are widely used. The National Institute of Standards and Technology, Delft University of Technology (the Netherlands), Shanghai Institute of Microsystem and Information Technology of the Chinese Academy of Sciences, and other organizations can produce superconducting nanowire single-photon detectors characterized by high detection efficiency (>90%) and high repetition frequency (>150 MHz).

The basic operations of photonic quantum computation (e.g., probabilistic controlled logic gates) and simple demonstrations and verifications of various quantum algorithms have been implemented. A research team from the University of Science and Technology of China developed the photonic quantum computer prototype “Jiuzhang” and its updated version, “Jiuzhang 2.0,” reaching a milestone in quantum supremacy [37,38]. In 2022, Xanadu Quantum Technologies Inc. implemented quantum supremacy verification using time-encoded Boson sampling [39].
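The computational hardness exploited in these Boson sampling demonstrations rests on the fact that each output probability is the squared modulus of a matrix permanent, which is #P-hard to compute classically. A minimal sketch of the collision-free case (Python/NumPy; the mode count and mode indices are arbitrary illustrative choices):

```python
import numpy as np

def permanent(M):
    """Matrix permanent via Ryser's formula, O(2^n * n^2)."""
    n = M.shape[0]
    total = 0j
    for subset in range(1, 2 ** n):          # all non-empty column subsets
        cols = [k for k in range(n) if (subset >> k) & 1]
        row_sums = M[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * np.prod(row_sums)
    return (-1) ** n * total

def output_probability(U, in_modes, out_modes):
    """Probability that single photons injected in `in_modes` of the
    interferometer U exit in `out_modes` (collision-free configuration)."""
    sub = U[np.ix_(out_modes, in_modes)]
    return abs(permanent(sub)) ** 2

# Haar-random 5-mode interferometer (QR of a complex Gaussian matrix)
rng = np.random.default_rng(0)
Z = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, _ = np.linalg.qr(Z)

p = output_probability(U, in_modes=[0, 1, 2], out_modes=[1, 3, 4])
print(f"P(photons in modes 0,1,2 -> modes 1,3,4) = {p:.4f}")
```

Ryser's formula is already exponential in the number of photons, which is precisely why sampling from a sufficiently large interferometer is believed to be classically intractable.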

Currently, the biggest challenge for photonic quantum computation circuits is the implementation of a deterministic two-qubit entangling gate; large-scale entangled-state preparation, circuit manipulation, and the R&D of high-efficiency detectors are also difficult problems. There are two ways to implement the two-qubit entangling gate: (1) implementation based on linear optics, and (2) introduction of nonlinearity into a linear-optical system. For the preparation of large-scale, scalable entangled states, it is expected that quantum dot spins can be used as a medium to prepare the emitted single photons in a large-scale entangled state. The recent development trends of photonic quantum computation are as follows: implementing noisy intermediate-scale quantum computation applications and the deterministic two-qubit entangling gate, solving the bottlenecks of universal photonic quantum computation, preparing large-scale entangled states, and implementing fault-tolerant quantum computation based on the GKP state.

《7 Trapped-ion quantum computation》

7 Trapped-ion quantum computation

The trapped-ion system is the first physical platform used to implement quantum computation [40]. The hyperfine or Zeeman levels of ions trapped in an RF electric field are used as qubits, and coherent manipulation of the qubits is performed using lasers or microwaves. The resonant frequency of ion qubits is determined only by the ion species and the external magnetic field; therefore, ion qubits have excellent uniformity compared with superconducting qubits, quantum dot qubits, and other artificial qubits. Multiple ions trapped in a potential well form a stable lattice structure under the action of Coulomb repulsion, and the harmonic oscillation of the entire lattice can be used as a medium to generate quantum entanglement between different ions in the trap. The trapped-ion system thus has full connectivity (i.e., there is a direct interaction between any qubits in the system) [41]. Ions trapped in different traps can also use their own emitted photons as media to implement remote entanglement.
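Because every ion pair couples to the shared motional bus, one entangling primitive serves all qubit pairs. A minimal sketch of such a primitive, the Mølmer–Sørensen gate, at the level of the resulting two-qubit unitary (Python/NumPy; the laser pulse shaping and motional dynamics are abstracted away):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
XX = np.kron(X, X)

def ms_gate(theta):
    """Molmer-Sorensen interaction exp(-i*theta/2 * X(x)X).  In hardware the
    coupling is mediated by a shared motional mode of the ion chain; here
    only the resulting qubit unitary is kept."""
    return np.cos(theta / 2) * np.eye(4) - 1j * np.sin(theta / 2) * XX

# A fully entangling gate (theta = pi/2) applied to |00>
psi = ms_gate(np.pi / 2) @ np.array([1, 0, 0, 0], dtype=complex)
# psi is the Bell-type state (|00> - i|11>)/sqrt(2)
print(np.round(psi, 3))
```

Since the closed form follows from (X⊗X)² = I, a single pulse of the right duration takes a product state directly to a maximally entangled state between any chosen ion pair.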

The early development of the trapped-ion system benefited from mature experimental techniques of atomic/molecular optics, without significant technical bottlenecks. For small-scale systems, the trapped-ion system has an hour-level coherence time and extremely high fidelity of quantum gates [42,43]. However, the trapped-ion system has difficulties in scaling up and in its stability (e.g., with increasing number of ions in the same trap, the ionic lattice becomes increasingly unstable, and the harmonic oscillation spectrum of the lattice tends to become dense, making it more difficult to use). Currently, the mainstream scheme for solving this problem is the “quantum charge-coupled device (QCCD) architecture,” in which electrodes on a surface trap are designed to define multiple trap areas, and each area contains only a few ions. The voltage of the electrodes is modulated to drive the ions to move and exchange between different areas [44].

The future trend of the QCCD architecture is to integrate control circuits, optical waveguides, and optical detectors onto the chip to realize the integration and miniaturization of the system. However, it is difficult to achieve this goal because it involves the collaboration of multiple technologies, such as chip fabrication, chip packaging, optical waveguide fabrication, and surface treatment. At present, there are no barriers in principle to building a medium- and small-scale system containing tens to hundreds of ions. However, the following technical problems must be solved: (1) the design and fabrication of an integrated chip trap system with optical waveguides, circuits, and detectors; (2) cryogenic trap fabrication and chip surface treatment to reduce the anomalous heating rate of the ion lattice, which has a negative impact on the fidelity of quantum gate operations; and (3) modularization and miniaturization of subsystems, such as vacuum, optics, and signal control, to reduce the coupling between subsystems and increase overall stability.

Research on ion traps is more active outside China, involving standardization and commercialization of the overall system, development of the QCCD architecture, design and fabrication of chip traps, and integration of optical waveguides into the chip trap. In China, research began relatively late and is now in the catching-up stage. There are research groups at Tsinghua University, National University of Defense Technology, Wuhan Institute of Physics and Mathematics of the Chinese Academy of Sciences, University of Science and Technology of China, Renmin University of China, Southern University of Science and Technology, and Huawei Technologies Co., Ltd.

《8 Silicon-based quantum computation》

8 Silicon-based quantum computation

Silicon quantum computation uses a single electron or hole trapped in a quantum dot as the qubit and implements qubit driving and coupling using electric pulses [45]. This technical route has the following advantages: most of the processes are compatible with the traditional metal-oxide-semiconductor (MOS) process, giving it the potential for scaling up and easing its transfer to the semiconductor industry at the commercialization stage [46]; the qubit coherence time is considerably long; the gate manipulation fidelity is high; and fully electrical manipulation is available. There are two modes of implementation for silicon-based quantum dots. (1) Voltages are applied to gate electrodes to trap a single electron or hole, and the electrodes are used to manipulate the quantum state. This mode can flexibly adjust inter-qubit coupling, but the gate electrodes are densely integrated; thus, multiplexing or floating electrodes are needed to realize large-scale qubit arrays [47,48]. (2) Scanning tunneling microscope (STM) lithography or ion beam implantation is used to embed dopant atoms in the silicon substrate as qubit carriers, and MOS electrodes or STM-lithographed electrodes are used to manipulate the electron spin of the dopant atoms. This mode has the advantages of high qubit indistinguishability, easy scalability, low electrode density, and long coherence time, although fabrication is difficult [49].

Recently, research on silicon quantum computation has progressed rapidly: multiple research teams have independently demonstrated 3–6-qubit operations [50], raised quantum gate fidelities above the fault-tolerant threshold, and realized strong coupling between an electron spin and a microwave photon in a superconducting cavity as well as long-range spin coupling via the microwave photon. Cryogenic complementary metal-oxide-semiconductor (Cryo-CMOS) quantum measurement and control chips developed recently are expected to solve medium- and large-scale readout and control problems when combined with silicon-based qubits. Motivated by the goal of combining silicon-based quantum chips with classical CMOS cryogenic chips, research teams have implemented silicon-based hot qubits that perform well at approximately 1.1–4 K [51–53]. As a result, the silicon-based platform is the only demonstrated system that can operate at a temperature of 4 K in a classical–quantum hybrid architecture compatible with large-scale integrated semiconductor technology.

The development challenges for silicon-based quantum computation are as follows: (1) the space occupied by the elements required to realize a single-qubit gate is relatively large; thus, the qubit driving scheme should be optimized; (2) the schemes and technologies for multi-qubit integration require further breakthroughs; (3) for single-atom quantum computation, the accuracy and success rate of atom placement should be improved to obtain a reliable single-atom array; and (4) as the technological level is further enhanced, the quality of the silicon-based substrate should be improved and the electrical noise of the dielectric layer reduced to increase the chip yield.

Silicon-based quantum computation will boom in the next few years; although it will take time to catch up with superconducting quantum computation, its advantages of large-scale integration and compatibility with the traditional semiconductor industry are becoming more apparent, making this research field very competitive and active. The world's leading research organizations in this field include Princeton University, University of Tokyo, Delft University of Technology, and University of New South Wales. Famous enterprises, such as Microsoft Corporation and Intel Corporation, began commercial development early. In China, publicly reported research organizations include the University of Science and Technology of China, Institute of Microelectronics of the Chinese Academy of Sciences, Institute of Physics of the Chinese Academy of Sciences, Beijing Academy of Quantum Information Sciences, and Southern University of Science and Technology.

《9 Other types of quantum computation systems》

9 Other types of quantum computation systems

《9.1 Neutral atom quantum computation》

9.1 Neutral atom quantum computation

Neutral atom quantum computation uses laser cooling and trapping techniques to build a neutral-atom array in optical traps. Quantum information is encoded in the internal states of single atoms, and microwave or optical transitions are used to implement single-qubit manipulation. Multi-qubit manipulation and quantum information processing can be implemented based on the Rydberg blockade effect or spin-exchange collisions. The advantages of the neutral atom system include the following: the qubit coherence time is long owing to weak coupling with environmental noise; the distance between adjacent atoms is on the order of microns, so that independent manipulation of single qubits can be easily implemented with low crosstalk; and the atomic spacing and the pattern of the atom array can be changed, indicating that the atom array has the potential for flexible connection and scaling up.
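The Rydberg blockade can be illustrated with a two-atom toy model: a laser drives each atom from the ground state |g⟩ toward the Rydberg state |r⟩, while a strong interaction V shifts the doubly excited state |rr⟩ out of resonance, so a collective pulse produces an entangled single excitation instead of two independent ones. A minimal sketch (Python/NumPy; Ω and V are illustrative values with V ≫ Ω, and ħ = 1):

```python
import numpy as np

def evolve(psi0, H, t):
    """Evolve a state under a time-independent Hamiltonian H (hbar = 1)
    via exact diagonalization."""
    vals, vecs = np.linalg.eigh(H)
    return vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi0))

omega, V = 1.0, 50.0               # Rabi frequency; blockade shift V >> omega
sx = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2)
# basis |gg>, |gr>, |rg>, |rr>; the laser couples g <-> r on both atoms
H = 0.5 * omega * (np.kron(sx, I2) + np.kron(I2, sx))
H[3, 3] += V                        # van der Waals shift on |rr>

psi0 = np.array([1, 0, 0, 0], dtype=complex)   # both atoms in |g>
t = np.pi / (np.sqrt(2) * omega)               # collective pi-pulse
psi = evolve(psi0, H, t)
print(f"P(rr) = {abs(psi[3])**2:.4f}")                     # strongly suppressed
print(f"P(one shared excitation) = "
      f"{abs(psi[1])**2 + abs(psi[2])**2:.4f}")            # close to 1
```

The collective π-pulse duration π/(√2·Ω) reflects the √2-enhanced Rabi frequency between |gg⟩ and the symmetric singly excited state, which is the working principle behind blockade-based entangling gates.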

In research on alkali metal atoms [54], the Rydberg blockade effect was utilized to implement two-qubit entanglement and a controlled-NOT gate for the first time in 2010. Arrays of approximately 50 single atoms were prepared in 2016, and the fidelity of single-qubit gates was increased to 99.6% (reaching the fault-tolerant quantum computation threshold). Since 2020, programmable quantum processors containing at least 200 qubits and demonstrations of quantum simulation have been implemented. In research on alkaline-earth metal atoms [55], the two-qubit entanglement fidelity has been increased to 99.5%.

The key technologies in this field include the preparation of quantum registers, the addressing of individual qubits, and the high-fidelity manipulation of multiple qubits. The following breakthroughs are expected in the short term: quantum registers containing thousands of qubits, multi-qubit manipulation with fidelity exceeding the fault-tolerant quantum computation threshold, quantum simulation of complex physical systems with hundreds of qubits, and demonstration of quantum supremacy.

《9.2 Quantum computation of nitrogen-vacancy (NV) centers in diamond》

9.2 Quantum computation of nitrogen-vacancy (NV) centers in diamond

Quantum computation with NV centers in diamond refers to the technical route that implements quantum information processing at room temperature using the electron spin of the NV center and nearby carbon-13 nuclear spins in diamond as solid-state qubits. The NV electron spin can be initialized and measured using laser pulses, and single-qubit quantum gates can be implemented using microwave pulses. The coherence time can reach milliseconds at room temperature.
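In the rotating frame, the microwave control of the NV electron spin amounts to Rabi rotations whose angle is set by the pulse area (Rabi frequency × duration). A minimal sketch (Python/NumPy; the 10 MHz Rabi frequency is an illustrative assumption, and the spin is treated as an ideal two-level system):

```python
import numpy as np

def microwave_pulse(rabi_freq, duration):
    """Single-qubit gate from a resonant microwave pulse (rotating frame):
    a rotation about x by angle theta = rabi_freq * duration."""
    theta = rabi_freq * duration
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sx

ket0 = np.array([1, 0], dtype=complex)
# Illustrative numbers: a 10 MHz Rabi frequency gives a pi pulse in 50 ns
U_pi = microwave_pulse(rabi_freq=2 * np.pi * 10e6, duration=50e-9)
print(np.abs(U_pi @ ket0) ** 2)  # population fully transferred to |1>
```

Halving the duration gives a π/2 pulse, the building block of Ramsey-type sequences used to measure the millisecond coherence times quoted above.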

After more than 20 years of development, the technologies of NV center quantum computing (e.g., the design and machining of diamond samples, nuclear spin detection, and multi-qubit manipulation) are relatively mature. However, many technical difficulties remain in implementing scalable quantum computation in this system: the integration of functional devices and arrays depends on high-efficiency, controllable NV center preparation; the noise introduced during micro–nano machining must be effectively suppressed; and accurate multi-qubit control technology should be developed to suppress crosstalk errors as the number of qubits increases. These are important links in realizing a multinode entangled network based on a solid-state system [56].

In the past decade, remarkable progress has been made by various domestic teams in this physical system toward universal quantum computers [57]; however, overall, there is still a gap compared with the world’s leading teams.

《9.3 Nuclear magnetic resonance quantum computation》

9.3 Nuclear magnetic resonance quantum computation

After approximately 80 years of development, NMR spectroscopy has prompted many applications in the fields of life sciences, physics, and chemistry, and the work of seven Nobel laureates is related to it. NMR quantum computation utilizes spin-1/2 nuclei as qubits and was the first experimental system to implement Shor's factorization algorithm and Grover's search algorithm [58]. Currently, it can manipulate up to 12 qubits [59].
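Grover's search algorithm mentioned above is small enough to simulate directly: on two qubits (N = 4), a single oracle call plus one diffusion step locates the marked item with certainty. A minimal statevector sketch (Python/NumPy; this simulates the algorithm abstractly, not the NMR pulse sequence that implemented it):

```python
import numpy as np

H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
H2 = np.kron(H1, H1)

def grover_2qubit(marked):
    """Grover search over N = 4 items: one oracle call plus one
    diffusion step finds the marked item with probability 1."""
    oracle = np.eye(4)
    oracle[marked, marked] = -1                        # phase-flip marked state
    diffusion = 2 * np.full((4, 4), 0.25) - np.eye(4)  # inversion about mean
    psi = H2 @ np.array([1.0, 0, 0, 0])                # uniform superposition
    psi = diffusion @ (oracle @ psi)
    return np.abs(psi) ** 2

print(grover_2qubit(marked=2))  # probability 1 on index 2
```

A classical search over four items needs 2.25 queries on average; the quantum version needs exactly one, which is the quadratic speedup the early NMR experiments demonstrated.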

In terms of academic research, many colleges and universities, such as the University of Waterloo, University of Stuttgart, Tsinghua University, University of Science and Technology of China, and Southern University of Science and Technology, have established active research teams and obtained highly influential results in quantum computation, quantum simulation, and quantum machine learning, such as quantum face recognition, quantum handwriting recognition, quantum many-body localization, and 12-qubit random quantum circuit simulation. In terms of industrialization, the joint team from Tsinghua University and the Southern University of Science and Technology launched the world’s first 4-qubit NMR quantum cloud in 2017. Shenzhen SpinQ Technology Co., Ltd. launched a miniaturized two-qubit NMR quantum computer in 2019.

Owing to its ensemble characteristics, NMR has difficulty scaling up, but it is still one of the few experimental systems that can manipulate more than 10 qubits. Therefore, it is deemed a good experimental platform for developing quantum control technology, exploring the frontiers of quantum machine learning, and advancing the quantum industry. Another promising direction of development is nuclear electric resonance, in which atomic nuclei are implanted into silicon-based materials and manipulated with electric fields, which may solve the problems of qubit frequency crowding and qubit crosstalk.

《9.4 Quantum computation with spin wave》

9.4 Quantum computation with spin wave

Spin-wave-based quantum computation is a promising new quantum-computation scheme. A spin wave refers to the collective precession mode of electron spins in magnetic materials, and its quantized quasi-particles are known as magnons. Each magnon carries a spin angular momentum of one reduced Planck constant. Magnons have long relaxation times and good controllability and can be used to encode quantum information [60].

The combination of magnonics and quantum information science forms a new field of quantum magnonics, covering conventional spintronics, magnonics, quantum optics, quantum computation, and quantum information science. Magnon-based qubits are theoretically confirmed, whereas integration with other quantum computation platforms needs to be realized to utilize magnons for quantum information processing. Magnon transport does not involve charge movement, and magnons can be transmitted by the magnetic insulator over long distances, significantly reducing the energy dissipation of quantum devices.

Currently, research on spin-wave quantum computation is in its early stages. There are two main methods of implementation: (1) combining spin waves with other quantum computation systems to develop a new hybrid quantum information processing technique and (2) directly using magnons to implement quantum computation. The first approach is the current mainstream, especially the combination of magnons and superconducting quantum circuits [61]. Enhancing the coupling among photons, magnons, and superconducting qubits in a microwave cavity through architectural design remains a challenging problem.

《9.5 Topological quantum computation》

9.5 Topological quantum computation

The biggest challenge encountered in the development of quantum computation techniques is handling the errors caused by inevitable decoherence. Compared with other approaches, topological quantum computation can fundamentally solve this problem [62]. Theoretically, only a few topological qubits (or even one) are required to build one logical qubit. Once topological qubits are implemented, the era of integrated logical qubits will begin, representing leap-forward progress in the development of quantum computation. Several research teams, represented by Microsoft, have focused on implementing topological qubits.

At present, systems for exploring topological quantum computation include strong spin–orbit coupling materials and s-wave superconductor hybrid systems, topological insulators and s-wave superconductor proximity systems, iron-based superconductors, and intrinsic topological superconductors. Key technologies in this direction include quantum material growth, preparation of topological quantum devices, and quantum transport measurements. Currently, scientific research plays a key role in possible breakthroughs in industrial applications.

Experimentally, the zero-bias conductance peak has been used as the smoking gun for determining whether a Majorana quantum state exists. To date, however, the consensus is that this evidence alone is not reliable. Implementing braiding operations that satisfy non-Abelian statistics is a core problem to be solved in this field. Overall, topological quantum computation is promising and requires further investigation.
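The non-Abelian statistics mentioned above can be made concrete at the operator level: exchanging Majorana modes γa and γb applies the unitary exp((π/4)γaγb), and different exchanges do not commute. A minimal sketch in a Jordan–Wigner representation of four Majorana operators (Python/NumPy; this illustrates the algebra only, not any physical device):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# Four Majorana operators on two fermionic modes (Jordan-Wigner form);
# four Majoranas are the minimum needed to encode one topological qubit
g1, g2 = np.kron(X, I2), np.kron(Y, I2)
g3, g4 = np.kron(Z, X), np.kron(Z, Y)

def braid(ga, gb):
    """Exchange of two Majoranas: B = exp(pi/4 * ga gb) = (I + ga gb)/sqrt(2),
    using (ga gb)^2 = -I for anticommuting Majoranas."""
    return (np.eye(4) + ga @ gb) / np.sqrt(2)

B12, B23 = braid(g1, g2), braid(g2, g3)
# Non-Abelian statistics: the order of exchanges matters ...
print(np.allclose(B12 @ B23, B23 @ B12))   # False
# ... and a double exchange is not the identity (unlike bosons or fermions)
print(np.allclose(B12 @ B12, np.eye(4)))   # False
```

It is exactly this order dependence of exchanges, rather than any fine-tuned pulse, that would carry the gate operations in a topological quantum computer.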

《10 Suggestions on the development of quantum computation technology in China》

10 Suggestions on the development of quantum computation technology in China

Quantum computation is a systems engineering endeavor that challenges the extreme scientific and technological capabilities of human beings, involving the integration of the ultimate technologies of almost all disciplines. Although the prospects for the field are bright, there are real difficulties. Because building a quantum computer is systems engineering, there is an urgent need to strengthen cooperation among transdisciplinary talents from different disciplines and fields to jointly tackle key problems and boost its development. Low-level redundant development, such as “following suit” and “small and all-inclusive” development, should be avoided, and the significant waste of resources possibly caused by internal vicious competition should be eliminated. Meanwhile, with a view to the medium- and long-term steady development of quantum computation, creative talent should be actively cultivated, fundamental research should be continuously strengthened, and the R&D of core and key technologies should be highlighted to further the development of China's quantum computation in a way that complies with the national conditions of China and creates value.

《10.1 Focus on the strategic planning and layout and maintain the view of overall situation》

10.1 Focus on the strategic planning and layout and maintain the view of overall situation

Recently, China has attached great importance to quantum science and technology, listing it among the national strategies for priority development, taking a series of measures to strengthen planning and layout, and establishing the National Laboratory of Quantum Information Science. These measures set the direction and create an environment for the medium- and long-term development of quantum science and technology in China. Given the range and depth of China's future R&D layout in quantum science and technology, it is recommended to establish a national professional committee to provide macro-level direction for the overall development of quantum science and technology in China and to offer relevant consulting services and policy suggestions.

The personnel responsible for quantum computation R&D and policymaking should maintain a broad perspective on the development of this field: although the prospects of quantum computation are bright, there are real difficulties. This perspective carries two implications, neither of which can be neglected: (1) without a clear understanding of the potential value of quantum computation, it is impossible to treat the relevant R&D and policymaking from a high vantage point, and it is easy to slip into trivialities; (2) without sufficient mental preparation for the difficulties of development, one is prone to blind optimism, leading to problems such as eagerness for quick success and instant benefits, and giving up easily.

Currently, the prevailing view in academia is that there are no fundamental difficulties in implementing scalable universal fault-tolerant quantum computation; therefore, the academic community is relatively optimistic about the long-term development of this field. However, it holds differing opinions on the roadmap toward quantum computers of practical value and takes a very cautious approach. In other words, the academic community maintains a view of the overall situation of the medium- and long-term development of quantum computation. In industry, some of the top international technology companies have conducted R&D in the field of quantum computation for over 30 years.

《10.2 Establishment of an innovative talent system to cultivate high-level teams》

10.2 Establishment of an innovative talent system to cultivate high-level teams

Talent is a core element in the development of high-tech businesses. It is necessary to increase the scale and quality of talent teams to ensure the long-term and healthy development of quantum computation in China. It would be advisable to continue introducing high-end talent, actively provide supporting services, help such talent function quickly in new environments, and expand their room for development. At the same time, talent must be cultivated over the long run to solve the talent shortage in the development of quantum computation in China. Quantum computation R&D is systems engineering with diverse demands for talent: early quantum computation R&D focused on fundamental research, whereas future R&D will be large-scale and practical, requiring an increasing number of professional engineers. To adapt to this trend of diversification, it is necessary to build a reasonable talent cultivation and assessment mechanism.

In terms of talent cultivation, attention should be paid to interdisciplinary talents who have both a foundation in mathematics and physics and a strong engineering and technical background, so as to build a “bridge” between traditional fundamental research and engineering application development. There are multiple ways to cultivate interdisciplinary talent; for example, a special discipline of quantum science and engineering can be established, and a reasonable industry–university–research system can be used to guide the flow of talent. A well-developed industry–university–research system helps form a positive interaction between academia and industry, providing sufficient opportunities for the cultivation, mobility, and development of various talents.

A reasonable talent assessment mechanism is key to ensuring the long-term and healthy development of quantum computation. In view of the diverse demands for talent in quantum computation, the corresponding assessment mechanism should remain flexible. Talents in fundamental research were responsible for quantum computation R&D at the early stage, and the corresponding assessment mechanism takes papers, projects, and funding as the main criteria; however, it is difficult under this mechanism to provide engineers and other talents with stable positions and promotion channels. Solving such problems also relies on a well-developed industry–university–research system and reasonable, flexible talent assessment standards.

《10.3 Strengthening the fundamental research to strive for more innovations and breakthroughs》

10.3 Strengthening the fundamental research to strive for more innovations and breakthroughs

From a historical perspective, the development of technical applications based on quantum mechanics remains at an early stage, and it is difficult to accurately predict which bottlenecks the current technical routes will encounter in the future. Similar cases occur frequently in the history of technology development (i.e., a scheme feasible in principle cannot be implemented in engineering and is replaced by other original breakthroughs). Therefore, strengthening fundamental research is of great significance to the R&D of quantum computation.

The distinctive feature of fundamental research is unpredictability: it is generally impossible to predict in advance the directions and issues in which a breakthrough will be made, to plan the schedule of breakthroughs, or to predict the specific future applications of a breakthrough. To create a good atmosphere for fundamental research, development plans and policy guidance that fail to comply with the laws of fundamental research should first be avoided; the more original and groundbreaking the research, the more likely it is to appear in unexpected directions. Second, an inflexible talent assessment mode should be avoided. The traditional talent assessment mechanism is based on mandatory, assessable indicators; it has played a role and will continue to play a role in China's science and technology enterprise. However, simplification and vulgarization should be avoided as much as possible when applying such an assessment system. Breakthroughs in fundamental research take researchers a long time to explore, and the directions involved are often not mainstream; therefore, it is extremely difficult to ensure a stable output. If excessive emphasis is placed on mandatory, assessable indicators, the pursuit of hot topics will be stimulated, and people will become eager for quick success and instant benefits. In extreme cases, it will even become a punishment for researchers who aspire to “sit on a cold bench” (i.e., hold an unglamorous post), with the result that “there is no bench left to sit on.”

It is recommended to make every effort to create a flexible and free environment for quantum computation research and to support researchers in following their own interests and tastes when selecting directions and subjects and in freely conducting academic and applied exploration. It would be better to hold an attitude toward fundamental research of “success will come when conditions are ripe” and “breakthroughs can be encountered but not demanded,” striving for more original breakthroughs in the long run.

《10.4 Strengthen the R&D of core technologies and equipment to ensure independence and control》

10.4 Strengthen the R&D of core technologies and equipment to ensure independence and control

China lags behind developed countries in the field of science and technology, and the gap is even larger in terms of the key equipment and technologies supporting development in this field. This status quo is determined by the history of socio-economic development in China and the current industrial structure, and it is also an inevitable stage of development. A deep understanding of this point will help build confidence in solving the problems of outdated key equipment and technologies.

Attention should be paid to the following points in solving the problems of outdated key equipment and technologies. (1) Essential planning and guidance should be strengthened to avoid being completely dominated by market forces and capital. As a strategic emerging technology, quantum computation has developed with the strategic support of the state. A good policy environment should be created, relevant technology promotion laws should be established, interdepartmental cooperation and specialized lifecycle management should be supported, and a trans-departmental collaborative management mechanism at the state level should be established. (2) Attention should be paid to the positioning and sustainability of R&D. A good industry–university–research system should be built to provide opportunities for converting relevant research results into products and application technologies. Legislation and standardization of quantum computation technology should be carried out, the utilization rate of intellectual property should be improved, and patent value should be maximized. (3) In the future, quantum computation R&D activities will generate demand for the development of key equipment and technologies, and every effort should be made to seize these valuable opportunities.

《Compliance with ethics guidelines》

Compliance with ethics guidelines

The authors declare that they have no conflict of interest or financial conflicts to disclose.