The quantum computing firm PsiQuantum is partnering with universities and a national lab to build the largest US-based quantum computing facility, the company announced today.
The firm, which is headquartered in California, says the facility will house a quantum computer containing up to 1 million quantum bits, or qubits, within the next 10 years. At the moment, the largest quantum computers have around 1,000 qubits.
Quantum computers promise to do a wide range of tasks, from drug discovery to cryptography, at record-breaking speeds. Companies are using different approaches to build the systems and working hard to scale them up. Both Google and IBM, for example, make the qubits out of superconducting material. IonQ makes qubits by trapping ions using electromagnetic fields. PsiQuantum is building qubits from photons.
A major benefit of photonic quantum computing is the ability to operate at higher temperatures than superconducting systems. “Photons don’t feel heat and they don’t feel electromagnetic interference,” says Pete Shadbolt, PsiQuantum’s cofounder and chief scientific officer. This imperturbability makes the technology easier and cheaper to test in the lab, Shadbolt says.
It also reduces the cooling requirements, which should make the technology more energy efficient and easier to scale up. PsiQuantum’s computer can’t be operated at room temperature, because it needs superconducting detectors to locate photons and perform error correction. But those sensors only need to be cooled to a few kelvin, or about -450 °F. While that’s an icy temperature, it is far easier to reach than what superconducting qubit systems require: those machines must be chilled to within a few hundredths of a degree of absolute zero.
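For readers who want the arithmetic behind those figures, here is a minimal sketch of the kelvin-to-Fahrenheit conversion. The example temperatures (4 K for the photon detectors, roughly 0.015 K for superconducting qubit systems) are illustrative assumptions, not figures provided by PsiQuantum.

```python
# Convert absolute temperatures in kelvin to degrees Fahrenheit.
# Example values (4 K, 0.015 K) are assumptions for illustration only.

def kelvin_to_fahrenheit(kelvin: float) -> float:
    """Convert a temperature in kelvin to degrees Fahrenheit."""
    return (kelvin - 273.15) * 9 / 5 + 32

examples = [
    ("photon detectors (~4 K)", 4.0),
    ("superconducting qubits (~0.015 K)", 0.015),
]
for label, k in examples:
    print(f"{label}: {kelvin_to_fahrenheit(k):.1f} °F")
# photon detectors (~4 K): -452.5 °F
# superconducting qubits (~0.015 K): -459.6 °F
```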
The company has opted not to build small-scale quantum computers (such as IBM’s Condor, which uses a little over 1,100 qubits). Instead it is aiming to manufacture and test what it calls “intermediate systems.” These include chips, cabinets, and superconducting photon detectors. PsiQuantum says it is going straight for full-scale machines in part because smaller devices cannot adequately correct errors or operate at a realistic price point.
Getting smaller-scale systems to do useful work has been an area of active research. But “just in the last few years, we’ve seen people waking up to the fact that small systems are not going to be useful,” says Shadbolt. In order to adequately correct the inevitable errors, he says, “you have to build a big system with about a million qubits.” The approach conserves resources, he says, because the company doesn’t spend time piecing together smaller systems. But skipping over them makes PsiQuantum’s technology difficult to compare to what’s already on the market.
The company won’t share details about the exact timeline of the Illinois project, which will be a collaboration with Argonne National Laboratory, the University of Chicago, and several other Illinois universities. It does say it hopes to break ground on a similar facility in Brisbane, Australia, next year, and that the Brisbane site, which will house its own large-scale quantum computer, will be fully operational by 2027. “We expect Chicago to follow thereafter in terms of the site being operational,” the company said in a statement.
Significant hurdles lie ahead. Building the infrastructure for the facility, particularly its cooling system, will be the slowest and most expensive part of the project. And once the facility is built, the quantum algorithms that run on its computer will need to improve: Shadbolt says the current ones are far too expensive and resource intensive.
The sheer complexity of the construction project might seem daunting. “This could be the most complex quantum optical electronic system humans have ever built, and that’s hard,” says Shadbolt. “We take comfort in the fact that it resembles a supercomputer or a data center, and we’re building it using the same fabs, the same contract manufacturers, and the same engineers.”