Speaking at the IEEE Industry Summit on the Future of Computing in Washington D.C. on 10th November, Dario Gil, IBM Research VP of AI and IBM Q, announced the development of a quantum computer capable of handling 50 qubits (quantum bits), the largest and most powerful quantum computer ever built. IBM also has a 20-qubit quantum computing system that’s accessible to third-party users through a cloud computing platform.

Seen by experts as the future of advanced computing, quantum computing is a difficult area of technology to understand. Put simply, instead of processing information using binary bits that are always in a definite on or off state, a quantum computer uses "qubits", the basic units of quantum information, which can represent zero and one simultaneously. This is made possible by quantum effects known as "superposition" and "entanglement".
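The idea of a qubit being zero and one at once can be made concrete with a toy calculation. The sketch below (plain NumPy, not IBM's actual software stack) represents a qubit as a pair of complex amplitudes and uses the standard Hadamard gate to put it into an equal superposition; the squared amplitudes give the odds of reading 0 or 1:

```python
import numpy as np

# A qubit's state is a normalized vector of two complex amplitudes,
# one for the |0> outcome and one for the |1> outcome.
ket0 = np.array([1, 0], dtype=complex)  # definitely "0"

# The Hadamard gate rotates a basis state into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0  # amplitudes on both 0 and 1 at once

# Measurement collapses the state; outcome probabilities are |amplitude|^2.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- equal chance of reading 0 or 1
```

This is classical simulation of a single qubit, which is easy; the point of real quantum hardware is that simulating 50 interacting qubits this way requires tracking 2^50 amplitudes.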

One of the main challenges when dealing with quantum states is sustaining the life of qubits. They hold their quantum state for only a short period, known as the "coherence time", before decoherence sets in and the qubits revert to behaving like classical bits of zeroes and ones. When researchers started looking at this in the late 90s, coherence times were just a few nanoseconds. Last year, they were able to achieve coherence times of 47-50 microseconds for 5-qubit machines. IBM has managed to maintain the quantum state for both the 50-qubit and 20-qubit systems for a total of 90 microseconds. That may seem short, but it's an important step forward.
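Why the coherence window matters: every gate operation takes time, so the window caps how many sequential operations a program can run before the quantum state degrades. A rough back-of-the-envelope sketch (the per-gate duration here is an assumption for illustration, not an IBM figure):

```python
# The coherence window bounds how many sequential gates fit in one run.
coherence_time_us = 90   # coherence window reported by IBM, in microseconds
gate_time_ns = 100       # ASSUMED per-gate duration, for illustration only

ops_in_window = (coherence_time_us * 1000) // gate_time_ns
print(ops_in_window)  # 900 sequential operations, under these assumptions
```

Lengthening coherence or shortening gates both buy room for deeper, more useful quantum programs.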

A 50-qubit machine can perform extremely difficult computational tasks, and Google has suggested that this many qubits could outclass the most powerful supercomputers. That said, IBM's machine isn't yet ready for widespread commercial or personal use. Like all of today's quantum computers, IBM's 50- and 20-qubit systems still require highly specialized conditions to operate. Speaking to MIT Tech Review, Andrew Childs, a professor at the University of Maryland, pointed out that IBM hasn't yet published the details of its new machine in a peer-reviewed journal. "IBM's team is fantastic and it's clear they're serious about this, but without looking at the details it's hard to comment," he said. Additional qubits do not necessarily translate to a leap in computational ability. "Those qubits might be noisy, and there could be issues with how well connected they are," he said.

IBM's Dario Gil says the increased number of qubits is only part of the story. The more qubits you deal with, the more complex the qubit interactions become. If a machine has more qubits but a high error rate as they interact, it might not be any more powerful than a machine with fewer qubits but a lower error rate. Gil says IBM researchers have managed to achieve the higher qubit count with low error rates, making the systems highly useful to researchers. "We have more qubits and less errors, which is combined to solve more problems," Gil said. The ultimate goal of quantum computing is a fault-tolerant universal system that automatically fixes errors and has unlimited coherence. "The holy grail is fault-tolerant universal quantum computing," Gil explained. "Today, we are creating approximate universal, meaning it can perform arbitrary operations and programs, but it's approximating so that I have to live with errors and a window of time to perform the operations."
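Gil's point about error rates can be seen with a simple toy model: if each gate independently succeeds with probability (1 - p), a circuit of n gates succeeds with probability roughly (1 - p)^n. The error rates below are illustrative numbers, not measurements from any real machine:

```python
# Toy model: circuit success probability decays as (1 - p) ** n_gates,
# so the per-gate error rate matters as much as raw qubit count.
def circuit_fidelity(error_rate: float, n_gates: int) -> float:
    """Probability that an n-gate circuit runs with no gate errors."""
    return (1 - error_rate) ** n_gates

# A noisier "bigger" machine can lose to a cleaner "smaller" one
# on the same 100-gate program (illustrative error rates):
big_noisy = circuit_fidelity(0.05, 100)     # ~0.6% of runs error-free
small_clean = circuit_fidelity(0.005, 100)  # ~61% of runs error-free
print(big_noisy < small_clean)  # True
```

Exponential decay is why error rates dominate: halving the per-gate error rate roughly doubles the circuit depth you can run at the same overall fidelity.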

IBM isn't the only company in the race to build working quantum computers. Google and Intel are also developing their own quantum computing systems, and San Francisco-based startup Rigetti wants to revolutionize the field. The Canadian quantum computing company D-Wave has already developed quantum computers that have been used by NASA and Google.
