# Is photonic quantum computing the answer to commercial quantum use? Maybe

Summary: Is photonic quantum computing a breakthrough, or a high-tech diversion? We're going to find out soon - here's why the momentum behind photonic approaches to quantum is increasing.

Photonic quantum computing is an approach that relies on the use of photons as quantum bits, or qubits. Photons are particles of light and can be manipulated and controlled to carry and process quantum information.

In photonic quantum computing, qubits are encoded in the quantum states of photons. The two most common approaches for encoding qubits in photons are polarization and path encoding. In polarization encoding, the quantum states of photons are defined by the orientation of their electric field oscillations, such as horizontal (|0⟩) and vertical (|1⟩) polarizations. In path encoding, qubits are encoded based on the spatial paths that photons take, such as taking one path (|0⟩) or another path (|1⟩).
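The two encodings above can be written down directly as state vectors. Here is a minimal sketch using NumPy (the variable names are my own, for illustration): the basis states |0⟩ and |1⟩ become two-element vectors, and a superposition - such as diagonal polarization - is just their normalized sum.

```python
import numpy as np

# Polarization-encoded qubit: |0> = horizontal, |1> = vertical.
H = np.array([1, 0], dtype=complex)  # |0>
V = np.array([0, 1], dtype=complex)  # |1>

# An equal superposition (diagonal polarization).
diag = (H + V) / np.sqrt(2)

# Measurement probabilities follow the Born rule: |amplitude|^2.
probs = np.abs(diag) ** 2
print(probs)  # → [0.5 0.5]
```

The same two-element vectors describe path encoding; only the physical interpretation of the basis states changes.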

Photons have several advantages for quantum computing. First, they can travel long distances without significant loss of information, which is crucial for building large-scale quantum networks. Second, photons are relatively immune to environmental noise and can maintain their quantum states longer than many other physical systems. Third, they can be manipulated using simple linear optical components, such as beam splitters and phase shifters.

One of the main differences between photonic qubits and matter-based qubits, such as those built on superconducting circuits or trapped ions, is the way they are manipulated and measured. In photonic quantum computing, operations on qubits are typically implemented with linear optical components, and measurements are performed with photodetectors. This contrasts with matter-based qubits, where operations are achieved by manipulating the physical properties of the system, such as the energy levels of superconducting circuits or the internal states of ions.
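Those linear optical components are just small unitary matrices acting on the qubit's state vector. As a sketch (assuming a path-encoded qubit and the Hadamard-like convention for a 50/50 beam splitter), here is a Mach-Zehnder interferometer - beam splitter, phase shifter, beam splitter - which is the workhorse circuit of photonic processors:

```python
import numpy as np

# Path-encoded qubit: |0> = upper arm, |1> = lower arm.
ket0 = np.array([1, 0], dtype=complex)

# A 50/50 beam splitter splits a photon equally between the
# two output paths (written here in its Hadamard-like form).
beam_splitter = np.array([[1, 1],
                          [1, -1]], dtype=complex) / np.sqrt(2)

def phase_shifter(phi):
    # Delays one path relative to the other by phase phi.
    return np.array([[1, 0],
                     [0, np.exp(1j * phi)]], dtype=complex)

# Mach-Zehnder interferometer: BS, phase shift, BS.
state = beam_splitter @ phase_shifter(np.pi) @ beam_splitter @ ket0
probs = np.abs(state) ** 2
print(probs.round(3))  # → [0. 1.]
```

With a phase of π the photon exits the lower path with certainty; with a phase of 0 it would exit the upper path - tuning the phase shifter steers the photon, which is exactly how single-qubit gates are realized optically.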

Another difference is the way qubits are interconnected. In matter-based systems, qubits can interact directly with each other through their physical properties, enabling the implementation of two-qubit gates. In photonic systems, qubits do not naturally interact with each other. Instead, quantum gates are implemented by manipulating the states of individual photons and using additional resources, such as entangled photon pairs or quantum memories, to enable interactions between distant qubits.
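The entangled-pair resource mentioned above can be written down concretely. A minimal sketch: a Bell pair is a two-photon state built with the tensor product, in which only the "both 0" and "both 1" outcomes are possible - measuring one photon instantly determines the other, which is what lets linear-optical schemes mediate interactions between qubits that never meet.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A Bell pair: the entangled two-photon resource used to link
# photonic qubits that never interact directly.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Only the |00> and |11> outcomes have nonzero probability.
print(np.abs(bell) ** 2)  # → [0.5 0.  0.  0.5]
```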

I realize this sounds a little like science fiction, but the details are far more spooky.

## Facts and fantasy about cooling

The current hype around photonic quantum computing is that photonic qubits do not need to be cooled - that they can operate at room temperature. It is true that in most practical implementations, the photons themselves are not cooled to very low temperatures. However, the components used to manipulate and detect photons, such as waveguides, beam splitters, and detectors, often rely on materials that require cryogenic temperatures.

Superconducting materials, which are commonly used for photodetectors and other components in photonic quantum computing, typically operate at temperatures of a few kelvin or below - some detector types closer to 0.1 kelvin. Cooling the components to these extremely low temperatures reduces thermal noise and enhances the coherence and stability of the quantum states being manipulated.

So, while photons themselves may not require extreme cooling, the supporting infrastructure and components in photonic quantum computers still need to be cooled to cryogenic temperatures to achieve the necessary quantum coherence and maintain the fragile quantum states required for quantum computation.

In a previous article, I addressed the research on diamond-based qubits, specifically nitrogen-vacancy (NV) centers. NV centers are atomic-scale defects that occur naturally in diamonds and can serve as qubits due to their unique properties.

Here's how diamonds, particularly NV centers, are used in quantum computers:

**1. Qubit Initialization:** NV centers in diamonds can be initialized to a specific quantum state, typically the ground state, which represents the "0" state of the qubit. This initialization is crucial for performing quantum computations.

**2. Coherent Control:** NV centers in diamonds can be manipulated using microwave and laser pulses. By applying carefully calibrated pulses of electromagnetic radiation, the quantum state of the NV center can be modified, enabling operations such as single-qubit gates and entanglement generation.

**3. Sensing:** One remarkable feature of NV centers is their exceptional sensitivity to external magnetic fields. This property allows diamonds to be used as highly sensitive sensors for various applications, including magnetic field imaging, biomolecular sensing, and even the detection of weak signals from individual neurons in biological systems.

**4. Long Coherence Times:** NV centers in diamonds exhibit long coherence times, meaning they can maintain their quantum states for extended durations. This property is advantageous for performing complex quantum computations as it reduces the effects of noise and increases the stability of the qubits.

**5. Scalability:** Diamonds offer the potential for scalability in quantum computing. The crystal structure of diamonds allows for the precise positioning of individual NV centers, which can be used as interconnected qubits. This controlled arrangement opens the possibility of building larger-scale quantum systems and implementing multi-qubit operations.
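The coherent-control step in point 2 can be sketched as a single-qubit rotation. Under the usual simplification that a resonant microwave pulse rotates the spin about the x-axis by an angle proportional to the pulse duration, a "π pulse" flips the NV qubit from |0⟩ to |1⟩:

```python
import numpy as np

def rx(theta):
    """Rotation about the x-axis by angle theta - the net effect of
    a resonant microwave pulse of proportional duration."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

ket0 = np.array([1, 0], dtype=complex)

# A pi pulse flips |0> to |1> (up to a global phase).
flipped = rx(np.pi) @ ket0
print(np.abs(flipped) ** 2)  # → [0. 1.]
```

A half-duration pulse (θ = π/2) would instead leave the qubit in an equal superposition, which is how entanglement-generating sequences begin.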

It's important to note that the integration of diamond-based qubits into practical quantum computing systems is an active area of research. While diamond-based qubits have several desirable properties, there are still challenges to overcome, including improving coherence times, enhancing the fidelity of operations, and addressing scalability issues. And just as with photonic quantum computers, no one has eliminated the cryogenic requirements for the supporting components.

## My take

A few years ago, Google engineers proposed a roadmap for quantum computing that I describe in another article:

Google plans to search for commercially viable applications in the short term, but it doesn't think there will be many for another ten years - a time frame I've heard referred to as "bound but loose," meaning no more than ten years, maybe sooner. In the industry, the term for the current state of the art is NISQ - Noisy Intermediate-Scale Quantum.

At the time, the largest quantum computers were in the 50-70 qubit range, and Google felt NISQ had a ceiling of maybe two hundred qubits. The "noisy" part of NISQ comes from the fact that qubits need to interact and sit close together, which generates noise. The more qubits, the more noise, and the harder that noise is to control.

But Google suggests the real unsolved problems in fields like optimization, materials science, chemistry, drug discovery, finance, and electronics will take machines with thousands of qubits - Google even envisions one million on a planar array etched in aluminum. Major problems still need solving, including noise elimination, coherence, and qubit lifetime (a qubit holds its state for only a tiny slice of time).

Billions are being spent on quantum by private companies, research universities, and governments, and Google was either mistaken or just being coy: "classic" quantum computers have already exceeded one thousand qubits. Whether photonic is the answer or not, we'll have to see, but if the pace of development continues, it will not take seven more years (Google made the projection in 2020).