Quantum Computing and Computed Tomography: A Roadmap towards QuantumCT

Quantum computing (QC) is considered a rising star among computing technologies, with very promising possibilities for solving computing tasks even more complex than those tackled by today's supercomputers. However, direct use in everyday applications still lacks both large-scale quantum computers and suitable quantum algorithms, i.e. software. We are carrying out the first project on the road towards QC-enabled Computed Tomography (CT). In this paper we present the project after the first of its five years and share our first steps with the community. It is worth mentioning that the hardware for QC is still in a phase of development, which implies that most of the software research is still maturing towards productive use cases.


Introduction
With the broader accessibility of quantum computers, many companies are making their first approach into the world of QC to familiarize themselves with the technology and to combine it with their field of expertise [1]. The first questions that arise concern how a quantum computer can be used to solve or improve industrial problems by means of quantum algorithms. The challenge lies in finding such a problem and creating the algorithm, because the use of a quantum computer requires a different approach than classical programming. While a classical computer works with bits of the values 0 and 1, a quantum computer uses a quantum bit (qubit) which can be brought into arbitrary superpositions of these binary states. This superposition enables the quantum computer to simultaneously perform a single computation on a wide range of values, which is one of the advantages a quantum computer provides. Superposition of quantum states in particular allows the entanglement (a specific form of correlation) of multiple qubits, which represents the most distinguishing feature of QC and is central to its fundamental conceptual advantages. However, the difficulties lie in the final step, once one desires to retrieve the result of that computation. Upon individual qubit measurements, the qubits collapse into one of their two basis states, which are then mapped to the classical values 0 and 1. Which of the two values is measured depends on the quantum state of the system, which is in general a superposition of the basis states. The results of repeated measurements follow the corresponding probability distribution. The consequence is that a single quantum calculation will often not provide the desired value directly. The calculation has to be repeated multiple times (so-called shots), depending on the number of qubits, to provide a statistical distribution from which the desired value can be found.
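The role of repeated shots can be illustrated with a toy simulation in plain Python (no quantum hardware or SDK assumed; all values are illustrative): a single qubit is put into an equal superposition and measured over many shots, yielding an approximately even distribution of 0 and 1.

```python
import random

# Toy illustration (plain Python, no quantum SDK) of why repeated shots are
# needed: a qubit is a pair of complex amplitudes (a, b) for the basis states
# |0> and |1>, and each shot samples 0 or 1 with probability |a|^2 or |b|^2.

def hadamard(state):
    """Hadamard gate: creates an equal superposition from a basis state."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state, shots, seed=0):
    """Repeated measurement; outcomes follow the probability distribution."""
    a, _ = state
    p0 = abs(a) ** 2
    rng = random.Random(seed)
    counts = {"0": 0, "1": 0}
    for _ in range(shots):
        counts["0" if rng.random() < p0 else "1"] += 1
    return counts

superposed = hadamard((1.0, 0.0))         # |0>  ->  (|0> + |1>) / sqrt(2)
counts = measure(superposed, shots=1000)  # roughly half 0s and half 1s
print(counts)
```

A single shot returns just one bit; only the accumulated counts reveal the underlying probabilities.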
The time consumed by repeated quantum measurements can outweigh the advantage a quantum computer has over a classical computer. To achieve a performance advantage, a quantum algorithm is required that reduces the necessary number of shots by exploiting constructive and destructive interference in the computation, as demonstrated by the well-known Deutsch-Jozsa [2] and Grover [3] algorithms. We started a five-year project within the publicly funded BayQS consortium with the objective of finding out what potential a quantum computer has in the context of computed tomography.

Roadmap towards Quantum Computed Tomography (QCT)
In 2021, the Fraunhofer-Gesellschaft gained exclusive access to IBM Q System One, the European IBM quantum computer in Ehningen (near Stuttgart), with the aim of researching and developing new technological solutions in quantum computing. This state-of-the-art quantum computer utilizes 27 qubits and provides the German business and innovation landscape with several application-specific research and development opportunities. It does this by offering scientists the option of testing their own algorithms on the IBM quantum computer under European data protection and IP law [4]. We would like to invite the community to join us to analyse which of our technology problems can be solved efficiently on a quantum computer. Therefore, the work of the project presented here aims to be both pioneering scientific work in an application related field and a starting point for novel ideas in the community.
Developing QCT, or even checking its feasibility, is more complex than simply replacing classical computers with quantum computers. First, a set of computational tasks for CT with promising complexity is required.
In contrast to medical applications, industrial CT is used to inspect a very wide range of different components. In order to achieve high-quality images, many acquisition parameters must be found and optimized before each acquisition.

Figure 1: Idea of QC-enabled computed tomography.
In the BayQS project one task is to investigate whether and how we can use QC to automatically determine the most favorable configuration of a CT system. One approach is to synthetically generate possible points on a trajectory (e.g. by simulation) and to choose the best ones under given boundary conditions.
Planning procedures for complex sensor systems can be mapped to the Travelling Salesman Problem under certain conditions. It is already known that this problem can be solved faster with quantum annealing-based methods (see e.g. [5]).
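The travelling-salesman view of trajectory planning can be made concrete with a toy example (hypothetical candidate points, no CT-specific boundary conditions): the shortest visiting order over a handful of candidate trajectory positions is found by classical brute force.

```python
import itertools
import math

# Hedged toy of the planning idea: hypothetical candidate trajectory points
# in a plane; the shortest closed visiting order is found by brute force,
# i.e. the travelling-salesman formulation referred to in the text.

points = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.5, 1.5)]

def tour_length(order):
    """Total closed-tour length for a given visiting order of the points."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

best = min(itertools.permutations(range(len(points))), key=tour_length)
print(best, round(tour_length(best), 3))
```

Brute force is only viable for a handful of points; the factorial growth of the search space is precisely what motivates annealing-based approaches.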
Another task of BayQS is to explore how QC can be used for CT image reconstruction. One approach is to write the reconstruction problem as a mathematical optimization problem and solve it with already known QC techniques (e.g., quantum annealing [6], quantum approximate optimization algorithm (QAOA) [7]). This could be the first step towards QCT.
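The optimization view can be sketched at miniature scale (all values are toy assumptions, solved here by classical brute force): a 2x2 binary image is sought that minimizes the misfit to four ray sums, exactly the kind of binary objective that quantum annealing or QAOA targets.

```python
import itertools

# Hedged sketch of casting reconstruction as a binary optimization problem:
# a 2x2 binary image mu is sought that minimizes ||W*mu - p||^2 for four
# ray sums (two rows, two columns). Toy values, classical brute force.

W = [(1, 1, 0, 0),   # top-row ray
     (0, 0, 1, 1),   # bottom-row ray
     (1, 0, 1, 0),   # left-column ray
     (0, 1, 0, 1)]   # right-column ray
p = [1, 1, 1, 1]     # ray sums of the assumed image (1, 0, 0, 1)

def misfit(mu):
    """Squared data misfit ||W*mu - p||^2 for a candidate binary image mu."""
    return sum((sum(w * m for w, m in zip(row, mu)) - pj) ** 2
               for row, pj in zip(W, p))

best = min(itertools.product((0, 1), repeat=4), key=misfit)
print(best, misfit(best))
```

The brute force returns (0, 1, 1, 0), which fits the data exactly, just like the assumed image (1, 0, 0, 1): with so few rays the solution is not unique, mirroring the ambiguities of limited-data CT.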
We will show first results on how we can solve optimization problems related to CT reconstruction and acquisition planning, respectively.

Technical Framework
A program is run on a quantum computer by writing a classical software script on a classical computer. Inside this script the "quantum program" is compiled and sent to the quantum computer, which executes it. Finally, the script returns how long the quantum computation took, how many times the quantum program was executed (the number of shots) and the resulting distribution of measurement results. For the communication between a quantum computer and a classical computer, an interface is needed. This interface depends on the quantum computer system, in our case the IBM Q System One. This quantum computer is accessible from anywhere using the Python library Qiskit [8], which provides the required software abstraction. Qiskit is an open-source SDK developed for interacting with quantum computers at the level of pulses, circuits and application modules. With Qiskit one easily connects to the quantum computer and sends quantum programs to it. These programs are automatically queued, and the quantum computer executes one job after another. The quantum program itself is also written with Qiskit.
Writing a quantum program is more like designing a circuit. It specifies how many qubits and classical bits are used, their initial states and which operations are performed. Each operation on each qubit must be defined explicitly. These operations are called gates, analogous to classical computers where gates operate on bits. Quantum gates can, for example, flip qubits or entangle them. To obtain a result from the qubit system, the qubit states must be read out at the end of the program, the so-called measurement. At the moment of measurement, a qubit collapses to one of its basis states, 0 or 1. Measuring a qubit can lead to different results due to its quantum mechanical wave nature. Thus, a quantum program must generally be executed several times to obtain the result distribution.
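The gate operations described above can be mimicked with a few lines of plain Python (a toy state-vector sketch, not the Qiskit API): a Hadamard followed by a CNOT entangles two qubits into a Bell state, so the shot statistics contain only the correlated results "00" and "11".

```python
import random

# Toy two-qubit state-vector sketch in plain Python (not the Qiskit API):
# a Hadamard on the first qubit followed by a CNOT produces an entangled
# Bell state, so shot statistics show only the correlated outcomes 00 and 11.

s = 2 ** -0.5
state = {"00": 1.0, "10": 0.0, "01": 0.0, "11": 0.0}   # start in |00>

# Hadamard on the first qubit mixes the amplitudes of |0x> and |1x>.
state = {"00": s * (state["00"] + state["10"]),
         "10": s * (state["00"] - state["10"]),
         "01": s * (state["01"] + state["11"]),
         "11": s * (state["01"] - state["11"])}

# CNOT (first qubit controls the second): swap |10> and |11> amplitudes.
state["10"], state["11"] = state["11"], state["10"]

# Repeated measurement: sample basis states with probability |amplitude|^2.
rng = random.Random(1)
counts = {}
for _ in range(1000):
    r, acc = rng.random(), 0.0
    for basis, amp in state.items():
        acc += abs(amp) ** 2
        if r < acc:
            break
    counts[basis] = counts.get(basis, 0) + 1
print(counts)
```

No shot ever yields "01" or "10": the measurement outcomes of the two qubits are perfectly correlated, which is the entanglement the text describes.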
The quantum computer executing the quantum program must fulfil the requirements of the quantum program. Therefore, the number of qubits, the number of gates and the pattern of qubit entanglement in the circuit have to be considered. Due to the quantum computer's hardware topology, it is not always possible to entangle two arbitrary qubits directly. On some quantum computers, only specific pairs of qubits can be entangled by specific gates, which can have a great impact on the design of a quantum circuit. Furthermore, the quantum states of the qubits are not stable for long. Thus, the designed quantum circuit should be as short as possible and must finish before the prepared states decay. Otherwise, one obtains wrong results from decayed qubit states.
For testing quantum programs with only a few qubits, it is possible to run the quantum program on a simulator which runs on a classical system. There, one can read out the wave function of each qubit at any time, since the entire system is calculated by linear algebra. This can help to understand and check what the gate operations on qubits do.

Image processing on a QC: a challenge by itself
Image processing is, besides the hardware used for measurement, one of the most challenging parts of computed tomography. Usually, we deal with thousands of images with on the order of 10^6 to 10^7 pixels each (with sizes of 1024x1024 up to 4096x4096) that need to be loaded, post-corrected, filtered and processed by a reconstruction algorithm. The data is usually stored as 16-bit to 32-bit grayscale values in linear order in the volatile or non-volatile memory of specialized computers that use high-end graphics processing units to process the data in a highly parallel fashion.
On classical computers the position of each pixel in an image is defined by its position in memory, and reading out the gray values means performing a measurement of the potential state of the physically accessible memory [9]. Therefore (in terms of storage of these images) the pixels can be treated separately, since there is no interaction between them. For a typical 16-bit image with 1024x1024 pixels this means that we need 1024*1024*16 = 16,777,216 classical bits to store such an image.
Using a quantum computer for image processing offers completely new opportunities and challenges. The qubits of a quantum system can be used in a quantum mechanical superposition state giving the chance to encode an image in very few qubits.
The encoding of quantum images, i.e. the actual representation of the data on a quantum device, is itself a challenging part and can be done in various ways. The most common implementations for storing grayscale images are called the flexible representation of quantum images (FRQI) [10] and the novel enhanced quantum representation (NEQR) [11]. Both approaches share the same idea: the gray value and the position of a pixel are stored in different sets of qubits. But in contrast to a classical computer, where the gray value has to be stored in separate bits for each pixel, in these schemes one can use the same qubits to encode the information of all pixels by exploiting the superposition principle of quantum states.
In the case of FRQI we only need one qubit to encode the grayscale value, while for NEQR this number depends on the bit depth of the image. The number of qubits needed to encode the position of each pixel depends on the dimensions of the image; e.g., for an image with 1024x1024 (= 2^10 * 2^10 = 2^20) pixels, we need 20 qubits to encode the 2^20 = 1,048,576 different positions.
This example of storing an image already shows one powerful aspect of quantum computers: parallelism via the quantum mechanical superposition principle. Instead of storing the information of each pixel in different bits (as in classical computers), the same qubits can be used for all pixels on quantum computers. This drastically reduces the number of qubits needed. E.g., for storing a 16-bit image with 1024x1024 pixels on a quantum computer we only need 21 (FRQI) or 24 qubits (NEQR) [12].
But this reduction in the number of qubits also comes at a price. First of all, the transfer of an image to a quantum computer is not straightforward, but already requires a non-negligible number of quantum gates: given an image of 2^n x 2^n pixels and a grayscale range of [0, …, 2^q − 1], NEQR requires O(q·n·2^(2n)) gates and is therefore less computationally complex than FRQI with O(2^(4n)) [12]. Secondly, due to the nature of quantum mechanics, recovering the image from a quantum computer requires multiple measurements of the same state representing the image. Here the NEQR scheme only requires a fixed number of repetitions to recover the image, while the FRQI scheme can only provide an approximation of the real image, even with many repeated measurements. This means that the quantum image (and all subsequent image operations) has to be prepared and measured many times before the complete information can be extracted.
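The FRQI idea can be sketched numerically for a tiny image (toy values, pure Python): each normalized gray value g is mapped to an angle theta = g * pi / 2, and a single "color" qubit with amplitudes cos(theta) and sin(theta) is attached to the two position qubits of a 2x2 image, so 3 qubits (8 amplitudes) carry all four pixels at once.

```python
import math

# Hedged FRQI sketch with toy values: each normalized gray value g is mapped
# to an angle theta = g * pi / 2; one "color" qubit with amplitudes
# cos(theta), sin(theta) is attached to the two position qubits of a 2x2
# image, giving a properly normalized 3-qubit state.

image = [0.0, 0.25, 0.5, 1.0]            # assumed normalized gray values
norm = 1 / math.sqrt(len(image))         # uniform weight over positions
amps = []
for g in image:
    theta = g * math.pi / 2
    amps.append(norm * math.cos(theta))  # color qubit |0> at this position
    amps.append(norm * math.sin(theta))  # color qubit |1> at this position

total = sum(a * a for a in amps)         # probabilities must sum to 1
print(len(amps), round(total, 6))
```

Reading the gray value back requires estimating sin^2(theta) per position from repeated measurements, which is exactly the approximate-retrieval cost of FRQI mentioned above.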
Even though a lot of work has been done in the field of quantum image processing in recent years, it is still in its infancy, and so far its focus has mostly been on the translation of classical algorithms to their quantum counterparts [25]. It is still not clear if and how these quantum versions bring a performance increase, since there is no explicit study that compares the computational complexity of state-of-the-art classical and quantum image processing algorithms in detail [25]. In the future, novel quantum algorithms have to be developed in order to exploit the nature of quantum mechanics, which is not an easy task.
In summary, there are many challenges that need to be addressed before X-ray projections or even CT volumes can be handled on a quantum computer in terms of image processing. There are various concepts for loading and storing image information in qubits, even if the computational capabilities (number of qubits and achievable circuit depth) of current quantum computers are still too low for practical application. Since the information contained in the images is not as easily accessed as on a classical computer, the computation of typical CT applications like filtering and projection/summation remains a research task for the coming years. Additionally, the retrieval of the image information from a quantum computer is a time-expensive process.

CT Image reconstruction
Reconstructing three-dimensional images from measured X-ray projection data is a crucial step in computed tomography (CT). This step is usually time-consuming and, even today, limited by the computational performance of conventional hardware. Our objective is to explore the potential of QC when applied to the CT image reconstruction problem. Finding out whether or not there is a quantum advantage for CT image reconstruction requires designing a quantum algorithm for image reconstruction. We start this task by exploring the feasibility of transferring well-known conventional reconstruction algorithms to a quantum computer. Image reconstruction algorithms commonly employed today can be divided into three general classes: (1) algebraic formulations of the measurement, (2) inverse integral transforms, and (3) statistical methods; the latter have not been treated in the project so far and remain future work.
The first class of reconstruction methods is based on a discrete formulation of the image formation problem. A finite set of measured X-ray attenuation values p_j results from a given two-dimensional image µ_i (image pixels are sequentially listed in a vector). The corresponding physical measurement can be written as a matrix multiplication:

p = W · µ

where the (known) geometry of the measurement system is described by the weight matrix W = (w_ji). Mathematically, the inversion of the measurement equation can be formulated easily: µ = W^(-1) · p. At first glance this solution seems simple. Nevertheless, the numerical challenge arises from the fact that in real applications the dimensions of W are large, typically 1000^2 image pixels by 10^6 measured data points. In practice, a solution µ is approximated by an iterative algorithm. Consequently, an implementation of this method on a quantum computer must deal with the general task of solving a high-dimensional system of algebraic equations.
The second method to reconstruct a cross-sectional image is to solve the continuous integral equation defining the so-called Radon transform, which describes the physics of the X-ray measurement:

p_θ(s) = ∫∫ µ(x, y) δ(x cos θ + y sin θ − s) dx dy

The two-dimensional function µ at an arbitrary point (x, y) can be recovered from the Radon space p with a simple algorithm that can be derived from the inverse transform formula. This algorithm is usually called "filtered back-projection" and is structured in two steps. First, the measured projection data p for a given projection angle θ are filtered with a pre-defined convolution kernel h:

q_θ(s) = p_θ(s) * h(s)    (filtering)

In order to calculate µ at a particular position, the filtered projections are subsequently back-projected by integrating over a 180° angular range of projections:

µ(x, y) = ∫_0^π q_θ(x cos θ + y sin θ) dθ    (back-projection)

It should be noted that this integral, usually realized as a sum over the given set of projection angles, needs to be repeated for each pixel at position (x, y).
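The iterative solution of the algebraic formulation can be sketched classically with the Kaczmarz method, the core of the algebraic reconstruction technique (ART); the weight matrix and image below are hypothetical toy values, not a real CT geometry.

```python
# Hedged numeric sketch of the algebraic class: a toy system p = W · mu is
# solved with the Kaczmarz iteration (the core of the algebraic
# reconstruction technique, ART), which projects the current estimate onto
# one measurement equation at a time. Geometry and values are hypothetical.

W = [[1.0, 1.0, 0.0, 0.0],   # each row: weights of one ray over the 4 pixels
     [0.0, 0.0, 1.0, 1.0],
     [1.0, 0.0, 1.0, 0.0],
     [0.0, 1.0, 0.0, 1.0],
     [1.0, 0.0, 0.0, 1.0]]   # a diagonal ray makes the solution unique
true_mu = [2.0, 0.0, 1.0, 3.0]                      # assumed 2x2 image
p = [sum(w * m for w, m in zip(row, true_mu)) for row in W]

mu = [0.0, 0.0, 0.0, 0.0]
for _ in range(500):                                # sweeps over all rays
    for row, pj in zip(W, p):
        residual = pj - sum(w * m for w, m in zip(row, mu))
        step = residual / sum(w * w for w in row)
        mu = [m + step * w for w, m in zip(row, mu)]
print([round(m, 6) for m in mu])
```

The iterate converges to the assumed image; at realistic dimensions (10^6 rays, 1000^2 pixels) exactly this kind of high-dimensional linear-system solving is the candidate task for QC.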
There are many numerical speed-up techniques to realize this algorithm, but in essence the basic steps are the same. A straightforward approach to implement the filtered back-projection on a quantum computer makes use of the existing quantum Fourier transform (QFT) algorithm to run the "filter" part of the filtered back-projection [26]. Obviously, this is a hybrid approach. Its performance needs to be qualified carefully, since data has to be transferred to and from the quantum computer, while the back-projection is done locally on a classical computer.
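The filter step that the QFT would take over can be sketched classically (toy projection values, naive transforms): the projection is ramp-filtered, i.e. multiplied by an |frequency| weighting in Fourier space before being transformed back.

```python
import cmath

# Hedged classical stand-in for the QFT-based "filter" step: a toy projection
# is ramp-filtered (Ram-Lak style |frequency| weighting) via a naive discrete
# Fourier transform. In the hybrid scheme the QFT would take over exactly
# these transforms, while the back-projection stays classical.

def dft(x, sign=-1):
    """Naive DFT (sign=-1) / un-normalized inverse DFT (sign=+1)."""
    n = len(x)
    return [sum(xk * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k, xk in enumerate(x)) for j in range(n)]

proj = [0.0, 1.0, 2.0, 4.0, 2.0, 1.0, 0.0, 0.0]     # hypothetical projection
n = len(proj)
ramp = [min(j, n - j) / (n / 2) for j in range(n)]   # |f| ramp, DC -> 0
spec = [s * r for s, r in zip(dft(proj), ramp)]      # filter in Fourier space
filtered = [v.real / n for v in dft(spec, sign=+1)]  # back to projection space
print([round(v, 3) for v in filtered])
```

The ramp suppresses the DC component, so the filtered projection sums to zero; this is the characteristic sharpening that precedes back-projection.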

Summary and Upcoming work
Quantum computing represents a promising future technology that could change the way complex computational tasks are addressed. QC can serve as an enabler for new, innovative developments and advancements in computer science as well as its applications in many industrial sectors. This might be especially true for the field of industrial computed tomography, where Quantum CT could enable more efficient CT measurement planning, faster image reconstruction algorithms, or even more complex evaluation tasks on the large datasets that emerge from CT scans. With increasing spatial resolution of CT measurements, the data volume grows, which in turn leads to longer computation times. With quantum computing, new algorithms for image processing could be developed to reduce evaluation times and make high-resolution scans of bigger objects feasible. Additionally, complex measurement trajectories could be optimized with the help of quantum algorithms, yielding better image quality in less acquisition time.
Although the technology is just finding its way out of the laboratory into the real world, it is rapidly growing and evolving; e.g., IBM plans to increase the number of qubits of its quantum computers more than 15-fold in the next two to three years [27]. Therefore, it is vital to engage with this new technology as early as possible to bring its advantages to real-world applications, as we intend to do with QuantumCT.
Within the very first year of this project, which was also our entry point into this novel technology, we mainly conducted pioneering work: acquiring a solid understanding of the underlying quantum physics, gathering experience in quantum algorithm design and simulation, and developing first concepts to transfer existing QC algorithms to CT-related software tasks.
From an abstract point of view, tomographic reconstruction and related processing problems can be reduced to optimization tasks and Fourier processing, both of which are central and actively researched elements of QC.
One major next step in our project is to address the problem of efficiently storing CT data (i.e. voxel data); in this context, efficient storage means representing voxel data with the least possible expenditure of qubits, while still matching the needs of QC data-processing algorithms. Concerning the CT measurement planning task, our next step will be the reformulation of well-chosen planning tasks as travelling-salesman-like problems, with the goal of addressing them with solutions known from the literature.
Other community-related questions, which we would like to answer in the course of the project, are: Is QCT feasible at all, and what prerequisites exist? Will QCT provide superiority over classical computing approaches, and under which conditions? When will QCT become operational?