IBM Achieves Highest Quantum Volume to Date
Andrew Wheeler posted on March 04, 2019 |

In 2017, IBM introduced "Quantum Volume" as a simple benchmark for quantum computers. Some skeptics argued that those engaged in the quest to build the world's first quantum computer shouldn't set the benchmarks. But IBM's Quantum Volume benchmark seems to capture the multi-dimensional performance characteristics of a quantum processor in a way that is valid and universal. Researchers have typically relied on narrower metrics instead, such as gate speed (how quickly a quantum gate performs an operation) or gate fidelity (how reliably it does so).

When we benchmark a CPU or GPU for reviews at engineering.com, we don't measure just one characteristic of the hardware, like cache size or clock speed. Many different individual features have to be taken into consideration and weighed against each other in a comprehensive comparison. Quantum Volume is a fundamental performance metric designed by IBM to gauge the overall power of a quantum computer. It accounts for circuit compiler efficiency, device connectivity, measurement and gate errors, and device crosstalk.
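As a rough illustration (a simplified sketch, not IBM's full benchmarking protocol), Quantum Volume can be thought of as 2 raised to the size of the largest "square" random circuit, equal in width and depth, that the machine can run reliably:

```python
# Simplified sketch of the Quantum Volume idea (not IBM's full protocol):
# QV = 2**n, where n is the largest number of qubits for which the device
# reliably runs random circuits that are n qubits wide and n layers deep.

def quantum_volume(max_reliable_square_size: int) -> int:
    """max_reliable_square_size: largest n such that width-n, depth-n
    random circuits still pass the benchmark's success test."""
    return 2 ** max_reliable_square_size

# A device that handles square circuits up to size 4 scores QV 16:
print(quantum_volume(4))  # -> 16
```

This is why a 20-qubit machine can have a Quantum Volume of only 16: errors, connectivity and crosstalk limit how large a square circuit it can execute faithfully, regardless of raw qubit count.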

Why does Quantum Volume count?

In theory, the larger a quantum computer's Quantum Volume, the closer it is to something called Quantum Advantage: the point at which the machine can run quantum applications that outperform anything classical computers can do.

At the American Physical Society March Meeting today, IBM announced that it had achieved a record Quantum Volume with its Q System One quantum computer, which was unveiled at CES 2019 in January.

These results were achieved on IBM's recently unveiled Q System One quantum computer. Q System One features a fourth-generation 20-qubit processor, which has produced a Quantum Volume of 16, roughly double that of the current IBM Q 20-qubit premium devices (which have a Quantum Volume of 8). IBM Q System One's performance reflects some of the lowest error rates IBM has ever measured, with an average 2-qubit gate error of less than 2 percent, and its best gate achieving an error rate below 1 percent. (Image courtesy of IBM.)

Quantum computing represents a frontier where extremely complicated, real-world problems could be dissected and solved. Quantum applications could theoretically include anything from simulating chemistry and space to optimizing massive supply chains and modeling financial risk.

For those who have grown impatient or skeptical of quantum computing, IBM now believes Quantum Advantage is achievable within the next decade, provided Quantum Volume can predictably double every year in a manner similar to Moore's Law. Is this possible? Well, IBM has doubled its Quantum Volume every year since 2017. That's not many data points yet, but we'll see whether the trend holds.
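A back-of-the-envelope projection of that doubling schedule (an idealization, assuming QV 16 in 2019 and a clean doubling every year) shows how quickly the numbers compound:

```python
# Hypothetical projection: Quantum Volume doubling yearly from QV 16 in 2019.
def projected_qv(year: int, base_year: int = 2019, base_qv: int = 16) -> int:
    """Idealized exponential growth, in the spirit of Moore's Law."""
    return base_qv * 2 ** (year - base_year)

for year in (2019, 2024, 2029):
    print(year, projected_qv(year))
# 2019 -> 16, 2024 -> 512, 2029 -> 16384
```

Whether real hardware can sustain that curve is exactly what the skeptics question.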

Classical computing meets quantum computing at the technological limits of Moore's Law

Moore's Law holds that the number of transistors on a computer chip roughly doubles every 18 months. The entire personal computing industry was built on this notion. But as transistors shrink toward the atomic scale, manufacturing methods grounded in classical physics cease to be effective. Extrapolated forward, Moore's Law suggests transistors will soon need to approach the size of an atom.

A classical computer's transistors represent data as bits; each bit holds one of two values, a zero or a one. A quantum computer's qubits (quantum bits) can represent a zero, a one, or both zero and one at once. When a qubit is simultaneously a zero and a one, it is in a state of superposition. Superposition is what could allow quantum computers to achieve speeds millions of times faster than classical computers on certain problems. (Image courtesy of Autodesk.)
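Mathematically, a qubit's state is a pair of amplitudes (alpha, beta) with |alpha|² + |beta|² = 1; measuring it yields 0 with probability |alpha|² and 1 with probability |beta|². A minimal sketch in plain Python (not a real quantum SDK) makes the contrast with a classical bit concrete:

```python
import math

# A qubit as two amplitudes (alpha, beta); |alpha|^2 + |beta|^2 must equal 1.
ZERO = (1.0, 0.0)                            # definitely |0>, like a classical bit
PLUS = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition of |0> and |1>

def measure_probabilities(state):
    """Probabilities of reading 0 or 1 when the qubit is measured."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

print(measure_probabilities(ZERO))  # (1.0, 0.0)
print(measure_probabilities(PLUS))  # roughly (0.5, 0.5)
```

A classical bit can only ever be one of the two basis states; the superposed qubit carries both amplitudes at once until it is measured.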

Since Moore's Law implies that transistors must shrink to the atomic scale to continue the growth the computing industry has grown accustomed to, the laws that govern quantum mechanics have to take the reins from those that govern classical physics.

When the first computers, like the Harwell computer (pictured above), were being built, electrical engineers designed and built circuits to transmit flowing electrons (electrical current), and then developed transistors that adjusted resistance (the level of non-conductivity) on command. These transistors allowed engineers to create a distinct state for any specific voltage level, but only two states were needed for the electro-mechanical computers of the time. Large electro-mechanical computers eventually gave way to electronic computers, which could do everything their mechanical predecessors did, but much faster. Electronic classical computers ignored the huge number of states electrons can occupy; they were designed to boil everything down to just two states, so that true or false values (via Boolean logic) could be assigned for sequential classical computation. (Image courtesy of public domain.)

A quantum computation uses quantum-mechanical properties like superposition, entanglement and interference to crunch data, storing and altering values in the subatomic properties of atoms. As Moore's Law shrinks the transistor toward the atomic level, quantum mechanics as applied to computing is effectively the only way forward, and IBM is looking to make that first quantum leap over classical computers.
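Superposition and entanglement can be shown together with a few lines of hand-rolled linear algebra (a teaching sketch, not a quantum SDK): applying a Hadamard gate to one qubit and then a CNOT yields a Bell state, in which measuring one qubit instantly fixes the value of the other.

```python
import math

def mat_vec(m, v):
    """Multiply matrix m by column vector v."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

h = 1 / math.sqrt(2)
# Hadamard on qubit 0, identity on qubit 1 (basis order |00>, |01>, |10>, |11>).
H_I = [[h, 0,  h,  0],
       [0, h,  0,  h],
       [h, 0, -h,  0],
       [0, h,  0, -h]]
# CNOT: qubit 0 controls qubit 1 (swaps |10> and |11>).
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]          # start in |00>
state = mat_vec(H_I, state)   # (|00> + |10>)/sqrt(2): superposition
state = mat_vec(CNOT, state)  # (|00> + |11>)/sqrt(2): entangled Bell state
print(state)  # ~[0.707, 0, 0, 0.707]
```

The final amplitudes sit only on |00> and |11>: the two qubits are perfectly correlated, which no pair of independent classical bits can reproduce.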

How would a quantum computer leave a classical computer in the dust?

Say you have a database filled with supply chain data about thousands of different products, each with a different SKU number, unit cost, price and so on. If you wanted to search for specific information, say the name of a product or its SKU number, a classical computer would work sequentially through each row of the database until it found the information you requested.

A quantum computer, by contrast, could attack the whole database at once: qubits let it examine many entries in superposition rather than one at a time. Any system that involves optimizing logistically complex tasks, such as those found in supply chain optimization, could in principle run dramatically faster than on a classical computer.

Bottom Line

It's important to understand how nascent quantum computing is compared to classical computing. IBM first rolled out public access to quantum computers in May 2016 through its IBM Q Experience quantum cloud service. IBM developed Quantum Volume as a metric to better understand when Quantum Advantage will be achieved. The IBM Q System One quantum computer, with its fourth-generation 20-qubit processor, has a Quantum Volume of 16, doubled from last year, and the year before that. Building a large-scale, universal and fault-tolerant quantum computer requires doubling Quantum Volume every year into the next decade while keeping coherence times long and error rates low.

The potential use cases hint at technological breakthroughs in altruistic endeavors, like simulating battery-cell chemistry for electric vehicles to make them an order of magnitude more efficient. Of course, the technology will also be used for weapons engineering.

Right now, quantum computing has had essentially zero practical impact on global industry, science or engineering, and it certainly has its detractors. Quantum computing must advance dramatically to compete with classical computers, which are still raising the bar higher and higher.

IBM continues to make progress, and the notion that Quantum Volume has to double every year for at least ten years in order to achieve Quantum Advantage over classical computers is compelling food for thought.



