Episode 2115: Quantum Computing

Podcast Transcript
One of the most exciting areas of computing research right now is quantum computing.
A quantum computer is totally unlike the traditional computer you are familiar with. It solves problems in a completely different way that has the potential to revolutionize certain fields.
However, the promise of quantum computing has led people to make outrageous claims and assumptions that sometimes border on the magical.
Learn more about quantum computing, how it works, and what it can and cannot do on this episode of Everything Everywhere Daily.
The subject of quantum computers is one that I’ve been working on for quite a while, and I felt it was finally time to pull the trigger and do the episode. I’m also aware that, given the current state of research, I’ll probably have to do another episode in a year or so to cover the latest updates in the technology.
There has been a lot of talk about quantum computing, and many people discussing it really don’t understand much about it. They often just use quantum computing as a synonym for magic and assume it can do anything.
To be sure, quantum computers have great potential, but there are limits to what they can do and how they can be used.
Before I get into how quantum computers work, let’s briefly review how digital computers work.
All digital computers, and every one you have ever used, manipulate and store data in binary form. Everything is represented as a series of 1s and 0s, known as bits. All of the improvements in computing over the years have been about processing these bits faster and storing more of them.
Quantum computers do not work this way. Instead of bits, they take advantage of properties that manifest at the quantum level and manipulate what are called qubits, or quantum bits.
A qubit can be a one, a zero, or can be both at the same time. This property is known as quantum superposition. It can only exist at the quantum level and is one of the foundations of quantum computing.
The idea of something being two things at the same time can be difficult to get your head around, but the best analogy I’ve heard is that it is like a spinning coin. A coin can be heads or tails, but while it is spinning, before it stops, it has properties of both heads and tails.
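For anyone who wants to see the spinning coin made concrete, a qubit's state can be written as a two-element vector of amplitudes, and the squared magnitudes of those amplitudes give the measurement probabilities. This is an illustrative sketch in plain NumPy, not how a real quantum device is programmed:

```python
import numpy as np

# A qubit state is a 2-vector of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measuring yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
zero = np.array([1, 0], dtype=complex)  # the definite state "0"
one = np.array([0, 1], dtype=complex)   # the definite state "1"

# An equal superposition: the "spinning coin"
plus = (zero + one) / np.sqrt(2)

probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5] -- a fifty-fifty chance of landing heads or tails
```

Before measurement, the qubit genuinely carries both amplitudes at once; measurement is what forces the coin to land.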
Another quantum phenomenon, core to the workings of a quantum computer, is quantum entanglement. Quantum entanglement is a phenomenon where two or more particles become linked so that measuring one instantly determines the correlated state of the other, even when separated by great distances.
Using our coin analogy, imagine two coins spun together so that when one lands heads, the other instantly lands tails, no matter where it is.
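The two-coin picture also has a simple mathematical form. A pair of qubits is described by four amplitudes, one for each combined outcome, and an entangled "Bell" state puts all the probability on the anti-correlated outcomes. Again, this is just a small NumPy illustration:

```python
import numpy as np

# Two qubits are described by a 4-vector of amplitudes indexed by the
# combined outcomes: 00, 01, 10, 11.
# The entangled Bell state (01 + 10)/sqrt(2) models the two spinning
# coins: if one lands heads, the other must land tails.
bell = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2
print(probs)  # [0. 0.5 0.5 0.] -- only the opposite-facing outcomes occur
```

Notice that neither coin has a definite face on its own; only the correlation between them is fixed, which is exactly what makes entanglement so strange.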
Quantum superposition and quantum entanglement are very difficult concepts to grasp because we don’t experience these effects in our day-to-day lives. The physicist Richard Feynman once said, “I think I can safely say that nobody understands quantum mechanics”… and he won a Nobel Prize for his work in quantum mechanics.
If you remember back to my episode on the ultraviolet catastrophe, our understanding of quantum effects doesn’t come from intuition and direct observation as in Newtonian physics, but rather from the confirmation of mathematical models.
The idea of using quantum effects for computing came from Richard Feynman himself. In 1981, Feynman argued that a machine built according to quantum rules could model nature far better than ordinary computers, and in the 1980s, the British physicist David Deutsch expanded on this, developing the concept of a general-purpose quantum computer.
The reason why a quantum computer can, in theory, do things that a normal digital computer can’t is that a quantum computer doesn’t just try one solution at a time. It can explore many possibilities simultaneously.
A digital computer can only process things sequentially. To use another analogy, let’s assume you have hundreds of closed boxes, and one of them has a gold bar. A digital computer would solve the problem of finding the bar by opening one box after another until it found it. Even if a digital computer uses parallel processing, it is simply several processors each working sequentially.
A quantum computer would be able to check all of the boxes at the same time. So, using our analogy, instead of opening every box, imagine shaking the tables that the boxes are sitting on. The box with the gold brick, being heavier, would behave differently from the other boxes, allowing you to identify it.
Again, using the analogy, shaking the table can help find the gold brick, but it can’t tell you what is inside every box. It can just help us solve this one problem quickly.
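The table-shaking trick can actually be simulated classically for small numbers of boxes. The sketch below, using plain NumPy amplitudes, follows the standard quantum search recipe: each "shake" flips the sign of the marked box's amplitude and then reflects every amplitude about the average, which gradually concentrates probability on the box with the gold bar. The box count and the gold bar's location here are arbitrary choices for illustration:

```python
import numpy as np

n_boxes = 128  # number of boxes (a power of two, so it maps to 7 qubits)
gold = 42      # arbitrary index of the box hiding the gold bar

# Start in an equal superposition: every box equally likely.
state = np.full(n_boxes, 1 / np.sqrt(n_boxes))

# Quantum search needs only about (pi/4) * sqrt(N) "shakes",
# versus up to N box-openings for a sequential search.
iterations = round(np.pi / 4 * np.sqrt(n_boxes))
for _ in range(iterations):
    state[gold] *= -1                 # oracle: mark the heavy box
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

probs = np.abs(state) ** 2
print(int(np.argmax(probs)))  # 42 -- the gold-bar box, with high probability
```

For 128 boxes, that is 9 shakes instead of up to 128 openings, and the advantage grows as the number of boxes does.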
Theoretical quantum computing gained momentum in the 1990s when the theoretical computer scientist Peter Shor, then at Bell Labs, developed Shor’s algorithm, showing that a quantum computer could factor large numbers far faster than known classical methods.
Soon after, his Bell Labs colleague Lov Grover introduced Grover’s algorithm, demonstrating that a quantum computer could search unsorted data faster than any classical method. These breakthroughs proved that quantum computers could offer genuine computational advantages.
Theorizing a quantum computer was much easier than actually building one. Digital computers went from vacuum tubes to transistors to integrated circuits. Because quantum effects only really manifest at the atomic level or smaller, the engineering challenges for building a quantum computer were, and are, enormous.
Building a quantum computing device is one of the hardest engineering projects ever attempted because it requires controlling matter at the smallest scales while preventing outside disturbances.
In effect, engineers must build a machine that is both extremely powerful and extraordinarily fragile.
The first challenge is creating reliable qubit hardware. Unlike an ordinary transistor, a qubit must preserve delicate quantum states such as superposition and entanglement.
The next problem is environmental noise. Heat, stray electromagnetic fields, vibration, radiation, and tiny material defects can disrupt a qubit’s state through quantum decoherence.
That means engineers must isolate the processor from nearly everything around it. Many machines operate inside dilution refrigerators colder than outer space, with extensive shielding and vibration control.
Precision control is another major hurdle. Quantum operations require exquisitely accurate pulses of microwaves, lasers, or voltages delivered at exactly the right time and with the correct strength. Small errors accumulate quickly. A useful machine may need millions or billions of perfectly timed control operations, all coordinated with almost no drift.
Scaling is perhaps the hardest challenge. Demonstrating a few qubits in a lab is very different from building thousands or millions that can work together. As qubit counts grow, wiring, cooling, signal routing, calibration, and crosstalk between neighboring qubits become serious obstacles. In some systems, the control hardware can become larger and more complex than the quantum chip itself.
Error correction adds another layer of difficulty. Individual qubits are too error-prone for large computations. This means that for each logical qubit that is used for computation, it might require dozens to hundreds of physical qubits simply to provide error correction.
However, as quantum computers develop and require longer processing time, they might require as many as thousands of physical qubits per logical qubit.
Qubits can be made from superconducting circuits, trapped ions, neutral atoms, photons, or other systems, but each approach has tradeoffs in speed and stability. Simply producing qubits that behave consistently across devices is difficult.
From the 2000s onward, laboratories and companies began building real hardware using these qubit technologies. In 2019, Google announced it had achieved “quantum supremacy.”
Quantum supremacy sounds really impressive, but it has a very specific meaning. It means a quantum computer has completed at least one specific task that would be impractical for a classical supercomputer to reproduce in a reasonable time.
Soon after the Google announcement, IBM argued that improved classical computing methods could perform the same benchmark in days rather than the thousands of years Google had claimed, which called the entire quantum supremacy claim into question.
The state of quantum computing as of the recording of this episode is still very, very early despite press releases to the contrary. If you wanted to compare this to the evolution of digital computing, we would probably be in the 1940s when the very first computers were created.
Several different types of quantum computing systems are being developed today because no one yet knows which technology will scale best. The most mature approach is the previously mentioned superconducting quantum computer, used by companies such as IBM and Google, which uses ultra-cold electrical circuits as qubits and benefits from existing chip-manufacturing techniques.
Another leading design is trapped-ion quantum computers, pursued by IonQ and Quantinuum, where charged atoms are suspended and controlled with lasers, offering excellent accuracy but slower speeds. Neutral-atom systems, developed by firms such as QuEra, use arrays of laser-trapped atoms and are considered promising for scaling to large numbers of qubits.
Other approaches focus on different strengths. Photonic quantum computers, pursued by Xanadu and others, use photons as qubits and may be useful for networking and for room-temperature operation. Silicon spin qubits use the quantum spin of electrons inside semiconductor chips and could potentially leverage existing chip factories.
So, how much better would quantum computers have to get to do something practical, like break contemporary cryptography? To break most modern public-key cryptography, you would need a large-scale, fault-tolerant quantum computer, not a small, noisy prototype, nor today’s machines that make headlines with their raw qubit counts. It would need stable, logical qubits protected by quantum error correction, capable of running long, highly accurate computations.
It would require several orders of magnitude more physical qubits than any current machine has. A cryptography-breaking system would probably need thousands of logical qubits, potentially millions of physical qubits.
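To give a sense of the gap, here is a back-of-envelope calculation. The specific figures are illustrative round numbers consistent with the estimates above, not measured values:

```python
# Illustrative round numbers, not precise engineering estimates.
logical_qubits_needed = 4_000    # "thousands" of logical qubits
physical_per_logical = 1_000     # error-correction overhead per logical qubit
todays_largest_machine = 1_000   # roughly the largest announced qubit counts

physical_qubits_needed = logical_qubits_needed * physical_per_logical
print(f"{physical_qubits_needed:,} physical qubits")  # 4,000,000

gap = physical_qubits_needed / todays_largest_machine
print(f"about {gap:,.0f}x larger than today's biggest machines")
```

Under these assumptions, machines would need to grow by more than three orders of magnitude, which matches the "several orders of magnitude" framing above.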
There is also the issue of quantum coherence. Quantum coherence is the ability of a quantum system to maintain its delicate superposition and phase relationships long enough to perform meaningful quantum operations before environmental noise destroys them.
Superconducting qubits used by IBM and Google typically maintain coherence for tens to hundreds of microseconds, with some devices reaching into the millisecond range.
To break modern cryptography, the relevant question is not just how long a single qubit remains coherent, but whether the entire quantum computation can preserve usable quantum information long enough to complete billions or trillions of operations.
In a fully fault-tolerant machine, individual qubits would still decohere quickly, but continuous quantum error correction would refresh and protect the logical state, allowing the computation to run for hours, days, or potentially longer.
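A rough calculation shows why error correction is unavoidable. Using typical published figures for superconducting hardware (the exact numbers vary by device and are assumptions here):

```python
# Typical ballpark figures for superconducting qubits; actual
# devices vary, so treat these as illustrative assumptions.
gate_time_ns = 25       # duration of one quantum logic operation
coherence_us = 100      # how long a qubit stays coherent

ops_per_lifetime = (coherence_us * 1_000) / gate_time_ns
print(f"about {ops_per_lifetime:,.0f} operations before decoherence")
```

A few thousand operations per qubit lifetime falls absurdly short of the billions or trillions a cryptography-breaking run would need, which is exactly why the logical state must be continuously refreshed by error correction.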
One assumption many people have made about the future of quantum computers is that they will advance in a way similar to how digital computers have advanced. However, this isn’t guaranteed. Quantum computers are not digital computers. They are an entirely different technology that just happens to use the word “computer.”
Even if the technology continues to advance rapidly, the first quantum computer capable of breaking modern cryptography would probably cost billions of dollars.
However, that future isn’t guaranteed. There are many serious physicists who think that the problems with scaling can’t be overcome, or at least can’t be overcome affordably.
Each logical qubit can experience errors when enough underlying physical qubits suffer noise, decoherence, or faulty operations.
The more physical qubits in a system, the larger these problems and engineering challenges become.
To keep heat and noise down, quantum computers require extreme cooling to operate at temperatures several thousandths of a degree above absolute zero.
The other big problem is that there aren’t many use cases for a quantum computer. Outside of certain areas, such as factoring large numbers, chemical simulation, and some optimization problems, skeptics question whether there are enough commercially valuable workloads.
Assuming that quantum computers can, in fact, scale and tackle large problems, they will impact the entire economy. Many people worry about the impact on Bitcoin, which will be covered in a future episode, but if a quantum computer can break cryptography, every bank, financial institution, online retailer, major technology company, and government will be at risk.
The takeaway from this episode is that quantum computers are real and can, in theory, solve problems that would be extremely difficult or impossible for normal digital computers.
However, the current state of quantum computing is still very primitive, despite press releases to the contrary. It will be necessary to increase the number of physical qubits by two to three orders of magnitude to get to a point where real problems can be solved.
To get to that point, many important engineering obstacles will have to be overcome, which might involve the invention of entirely new technologies or techniques.
Quantum computing isn’t the next phase of general computing. You will not have a quantum computer on your desk, nor will Microsoft make Windows Quantum.
However, if the engineering obstacles can be overcome, years from now, quantum computing might play an important role in the world.

This episode can be found at: https://everything-everywhere.com/quantum-computing/