Quantum computing has advanced beyond being purely theoretical or the purview of science fiction. Several companies have built specialized quantum computers as research projects or proofs of concept. IBM put up a publicly available quantum computer for testing with their IBM Q initiative, and has now expanded from a 5-qubit processor to a 16-qubit one. But it’s still the Wild West for the field.
For example, simply measuring performance gets surprisingly difficult. With the bevy of benchmarks available in classical computing, it’s easy to forget that on the quantum side even the language for describing performance isn’t agreed upon. Chris Lee at Ars Technica gives an in-depth look at what IBM is introducing as a measure of quantum computing performance: quantum volume. Previous measures have highlighted single aspects of quantum computing, either gate speed or gate fidelity (reliability).
Quantum volume as a performance metric is based on the idea of circuit depth: the number of operations that can be performed before it is unreasonable to expect a given qubit state to be correct. This depth is then multiplied by the total number of qubits to give a single figure of merit for the machine as a whole.
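The depth-times-qubits idea can be sketched in a few lines of code. This is an illustrative toy model, not IBM’s exact definition: the function names, the independent-error assumption, and the 50% correctness threshold are all assumptions made here for the example.

```python
def achievable_depth(error_per_gate: float) -> int:
    """Roughly, how many gate layers fit before an error is more likely than not.

    Assumes errors accumulate independently: after d layers, the chance a
    state is still correct is (1 - error_per_gate) ** d. We stop at the
    depth where that probability would drop below 0.5 (an illustrative
    threshold, not part of IBM's definition).
    """
    depth = 0
    p_correct = 1.0
    while p_correct * (1 - error_per_gate) >= 0.5:
        p_correct *= 1 - error_per_gate
        depth += 1
    return depth


def quantum_volume(num_qubits: int, error_per_gate: float) -> int:
    """Circuit depth multiplied by qubit count, per the description above."""
    return num_qubits * achievable_depth(error_per_gate)


# A lower error rate buys a deeper usable circuit, hence more volume;
# so does adding qubits -- the metric rewards both at once.
print(quantum_volume(5, 0.05))   # 5-qubit device, 5% error per layer
print(quantum_volume(16, 0.05))  # 16-qubit device, same error rate
```

The point of a combined metric like this is visible even in the toy model: a machine with many noisy qubits and a machine with few clean qubits can land on the same number, which is exactly the kind of trade-off a single benchmark has to capture.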
Chris thinks it’s an interesting approach, but one with potential pitfalls and issues. He goes into the gory quantum details in the piece. It shows you how primordial quantum computing still is when even the idea of how to benchmark is still up for grabs.
Chris Lee Comments:
The race to build the first useful quantum computer continues apace. And, like all races, there are decisions to be made, including the technology each competitor must choose. But, in science, no one knows the race course, where the finish line is, or even if the race has any sort of prize (financial or intellectual) along the way.
On the other hand, the competitors can take a hand in the outcome by choosing the criteria by which success is judged. And, in this rather cynical spirit, we come to IBM’s introduction (PDF) of “quantum volume” as a single numerical benchmark for quantum computers. In the world of quantum computing, it seems that everyone is choosing their own benchmark. But, on closer inspection, the idea of quantum volume has merit.
Read more at Ars Technica