Why it matters: Considered by many to be the future of processor technology, quantum chips have so far proven difficult to test, validate, and manufacture. Intel, in partnership with Bluefors and Afore, has taken a big step forward by unveiling the Cryogenic Wafer Prober, a tool that enables the company to gather data from qubits at temperatures approaching absolute zero. The cryoprober greatly reduces data collection times and aims to significantly accelerate development across the quantum computing industry.

Intel, along with partners Bluefors and Afore, has created the first Cryogenic Wafer Prober, a tool that will test and validate the qubits needed for quantum computing.

Last year, Intel developed its first quantum wafers, considered a major step toward the mass production of quantum chips. The cryoprober will enable engineers to test these 300mm wafers at temperatures approaching absolute zero, or 0 kelvin. To date, collecting data from quantum chips has been very difficult.

Because the characteristics of qubits must be measured at extremely low temperatures, the equipment and technological limitations of existing test environments meant that even small subsets of data often took days to collect in conventional dilution refrigerators. The cryoprober will allow Intel to automate that process and gather information on qubits in a matter of minutes.

The hope is that a tool such as this will help accelerate development in quantum computing, which has slowed in recent years. Jim Clarke, the director of quantum hardware at Intel, hopes the cryoprober will help the entire quantum computing industry.

“Over the past year, Intel has worked with Bluefors and Afore to combine our expertise and build a fast, electrical characterization tool that can operate in the quantum regime. We hope that by designing this tool, the industry can use it to accelerate the progress of quantum computing,” Clarke said of the project.

Quantum computing could allow computers to solve problems that were previously unsolvable. While conventional computers store and process information as bits that are strictly 0 or 1, a quantum computer uses "qubits," bits that can exist as a 0 and a 1 at the same time. This superposition allows quantum computers to perform many complex operations concurrently, instead of the sequential operation found in conventional computers.
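The idea of a qubit being "0 and 1 at the same time" can be made concrete with a tiny numerical sketch. The snippet below (an illustration, not anything from Intel's tooling) represents a qubit as two complex amplitudes and applies a Hadamard gate, the standard operation for putting a definite 0 into an equal superposition; the function names are invented for this example.

```python
import math

# A qubit is a pair of complex amplitudes (alpha, beta) over the basis
# states |0> and |1>. Measuring it yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
def probabilities(alpha, beta):
    return abs(alpha) ** 2, abs(beta) ** 2

# The Hadamard gate maps a definite |0> into an equal superposition
# of |0> and |1>.
def hadamard(alpha, beta):
    s = 1 / math.sqrt(2)
    return s * (alpha + beta), s * (alpha - beta)

# Start in |0> (a classical 0), then apply the Hadamard gate.
alpha, beta = hadamard(1, 0)
p0, p1 = probabilities(alpha, beta)
print(p0, p1)  # both probabilities are 0.5: the qubit is "0 and 1 at once"
```

Until the qubit is measured, both outcomes remain possible, which is what lets a quantum computer explore many computational paths in parallel.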

Development of quantum computers could have wide-ranging effects in fields from AI to medical technology and mathematics.