Noise: It Turns Out You Need It

We don’t know whether quantum physics proves that the universe is truly a strange place or that we are living in a virtual reality simulation, but we do know it turns a lot of common sense into garbage. Take noise, for example. Noise — as in random electrical noise — is bad, right? We spend a lot of time designing to minimize noise. Researchers in Austria, Germany, and Australia recently published a paper showing that noise can actually improve the flow of energy. While the paper is behind a paywall, the Focus article is available and, of course, you can probably find a copy of the paper if you want to read the entire thing.

The paper, titled “Environment-Assisted Quantum Transport in a 10-qubit Network,” uses trapped calcium atoms to study an effect suspected of being a key factor in high-efficiency energy transfer, such as that observed in optical fibers and photosynthesis.

Continue reading “Noise: It Turns Out You Need It”

Quantum Computing For Computer Scientists

Quantum computing is coming, so a lot of people are trying to articulate why we want it and how it works. Most of the explanations are either hardcore physics talking about spin and entanglement, or breezy hand-waving that can give you a little understanding but isn’t much use for actually applying the technology. Microsoft Research has a video that attempts to hit that spot in the middle — practical information for people who currently work with traditional computers. You can see the video below.

The video starts with the basics you’d get from most videos, talking about vector representations and operations. You have to get through about 17 minutes of that sort of thing before it gets into qubits. If you glaze over on the math, listen to the “index array” explanations [Andrew] gives after the math and you’ll be happier.
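If you would rather poke at the idea in code than watch the math fly by, here is a quick sketch of that index-array view (my own numpy example, not code from the video): an n-qubit state is just an array of 2^n complex amplitudes, and a gate is a matrix that acts on that array.

```python
# A quick numpy sketch (my own example, not code from the video) of the
# "index array" view: an n-qubit state is an array of 2**n complex
# amplitudes, indexed by the classical bit patterns, and a gate is just a
# matrix applied to that array.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>: all amplitude on index 0

# The Hadamard gate puts a single qubit into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
for index, amplitude in enumerate(state):
    print(f"|{index:01b}>: amplitude {amplitude.real:+.3f}, "
          f"probability {abs(amplitude) ** 2:.3f}")
# |0>: amplitude +0.707, probability 0.500
# |1>: amplitude +0.707, probability 0.500
```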

Continue reading “Quantum Computing For Computer Scientists”

Flawed Synthetic Diamonds May Be Key For Quantum Computing

If you’ve followed any of our coverage of quantum computing, you probably know that the biggest challenge is getting quantum states to last very long, especially when moving them around. Researchers at Princeton may have solved this problem by demonstrating the storage of qubits in a lab-created diamond. The actual publication is behind a paywall, if you want to learn even more.

Generally, qubits are handled as photons and moved over optical fiber. However, they don’t last long in that state, and it is difficult to store photons without losing their quantum information. Impurities in diamond, though, may make it possible to transfer a qubit from a photon to an electron and back.

Continue reading “Flawed Synthetic Diamonds May Be Key For Quantum Computing”

Google Ups The Ante In Quantum Computing

At the American Physical Society conference in early March, Google announced that their Bristlecone chip was in testing. This is their latest quantum computing chip, which ups the game from 9 qubits in their previous test chip to 72 — quite the leap. This also trounces IBM and Intel, who have 50- and 49-qubit devices, respectively. You can read more technical details on the Google Research Blog.

It turns out that the number of qubits isn’t the entire story, though. Qubits that last longer are important, and low-noise qubits help, because the higher the noise, the more redundant qubits you need to get a reliable answer. That’s fine, but it leaves fewer qubits for working on your actual problem.
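To get a feel for why noise eats into your qubit budget, here is a toy simulation (purely my own classical analogy, nothing like Google's actual error-correction scheme): a bit read out through a noisy channel, protected by spreading it over redundant copies and taking a majority vote. The noisier the readout, the more copies you need for the same confidence, and those copies come out of your total count.

```python
# Toy illustration (my own classical analogy, not Google's scheme): a bit is
# read out with error probability p; redundancy plus a majority vote recovers
# it, but noisier readout needs more copies for the same reliability.
import random

def majority_readout(true_bit, n_copies, error_rate, trials=100_000):
    """Fraction of trials where a majority vote over noisy copies is correct."""
    correct = 0
    for _ in range(trials):
        votes = sum(
            true_bit if random.random() > error_rate else 1 - true_bit
            for _ in range(n_copies)
        )
        majority = 1 if 2 * votes > n_copies else 0
        if majority == true_bit:
            correct += 1
    return correct / trials

for error_rate in (0.05, 0.20):
    for n_copies in (1, 3, 9):
        success = majority_readout(1, n_copies, error_rate)
        print(f"error rate {error_rate:.2f}, {n_copies} copies: "
              f"{success:.4f} read correctly")
```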

Continue reading “Google Ups The Ante In Quantum Computing”

Quantum Weirdness In Your Browser

I’ll be brutally honest. When I set out to write this post, I was going to talk about IBM’s Q Experience — the website where you can run real code on some older IBM quantum computing hardware. I am going to get to that — I promise — but that’s going to have to wait for another time. It turns out that quantum computing is mind-bending and — to make matters worse — there are a lot of oversimplifications floating around that make it even harder to understand than it ought to be. Because the IBM system matches up with real hardware, it has a lot more limitations than a simulator — think of programming a microcontroller with no debugging versus using a software emulator. You can zoom into any level of detail with the emulator, but with the bare micro you can toggle a line, use a scope, and hope things don’t go too far wrong.

So before we get to the real quantum hardware, I am going to show you a simulator written by [Craig Gidney]. He wrote it and promptly got a job with Google, who took over the project. Sort of. Even if you don’t like working in a browser, [Craig’s] simulator is easy enough to use: you don’t need an account, and a bookmark will save your work.

It isn’t the only available simulator, but as [Craig] immodestly (but correctly) points out, his simulator is much better than IBM’s. Starting with the simulator avoids tripping over the hardware limitations. For example, IBM’s devices are not fully connected, like a CPU where only some registers can get to other registers. In addition, real devices have to deal with noise, and the quantum states don’t last very long. If your algorithm takes too long, the state will collapse and invalidate your results. These aren’t issues on a simulator. You can find a list of other simulators, but I’m focusing on Quirk.
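To make that connectivity limitation concrete, here is a small numpy sketch (my own illustration, not the IBM or Quirk API): when two qubits can't interact directly, the usual workaround is to shuttle states around with SWAP gates, and each SWAP costs three CNOTs. That overhead is exactly the kind of thing that eats into the time before the state decays.

```python
# Sketch in plain numpy (not the IBM or Quirk API): a SWAP between two qubits
# equals three CNOTs, so routing around missing connections costs extra gates.
import numpy as np

# Basis ordering |q0 q1> with q0 as the most significant bit.
CNOT_01 = np.array([[1, 0, 0, 0],    # control q0, target q1
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
CNOT_10 = np.array([[1, 0, 0, 0],    # control q1, target q0
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

print(np.array_equal(CNOT_01 @ CNOT_10 @ CNOT_01, SWAP))   # prints True
```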

What Quantum Computing Is

As I mentioned, there is a lot of misinformation about quantum computing (QC) floating around. I think part of it revolves around the word computing. If you are old enough to remember analog computers, QC is much more like that: you build “circuits” to create results. There’s also a lot of difficult math — mostly linear algebra — that I’m going to try to avoid as much as possible. If you can dig into the math, though, it is worth your time to do so. Still, just like you can design a resonant circuit without solving differential equations about inductors, I think you can do QC without some of the bigger math by just using the results. We’ll see how well that holds up in practice.
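As a taste of the "build a circuit, read out the results" style, here is a minimal sketch in plain numpy (my own example, not anything from Quirk's code): start two qubits in |00>, apply a Hadamard and then a CNOT, and look at the measurement probabilities of the resulting Bell state.

```python
# A tiny "circuit" in plain numpy (my own sketch, not Quirk's internals):
# Hadamard on qubit 0, then CNOT, turns |00> into a Bell state.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],       # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                       # start in |00>
state = np.kron(H, I) @ state        # Hadamard on qubit 0
state = CNOT @ state                 # entangle the pair

for index, amplitude in enumerate(state):
    print(f"|{index:02b}>: probability {abs(amplitude) ** 2:.2f}")
# prints 0.50 for |00> and |11>, 0.00 for |01> and |10>
```

Quirk lets you drag the same two gates onto its grid and watch the probabilities update as you go, which is the whole point of starting with the simulator.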

Continue reading “Quantum Weirdness In Your Browser”

Intel Rolls Out 49 Qubits

With a backdrop of security and stock-trading news swirling, Intel’s [Brian Krzanich] opened the 2018 Consumer Electronics Show with a keynote looking to future innovations. One of the bombshells: Tangle Lake, Intel’s 49-qubit superconducting quantum test chip. You can catch all of [Krzanich’s] keynote in replay, and there is a detailed press release covering the announcement.

This puts Intel on the playing field with IBM, which claims a 50-qubit device, and Google, which planned to complete a 49-qubit device. Intel’s previous test chip handled only 17 qubits. The term qubit refers to a “quantum bit,” and the number of qubits is significant because experts think that at around 49 or 50 qubits, quantum computers won’t be practical to simulate with conventional computers. At least until someone comes up with better algorithms. Keep in mind that — in theory — a quantum computer with 49 qubits can process about 500 trillion states at one time. To put that in some apples-and-oranges perspective, your brain has fewer than 100 billion neurons.
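If you want to see where that number comes from, the state of an n-qubit register is described by 2^n complex amplitudes, so simulating one classically means tracking that many numbers at once:

```python
# Quick arithmetic: an n-qubit register has 2**n amplitudes to keep track of.
for n_qubits in (17, 49, 50, 72):
    print(f"{n_qubits} qubits -> {2 ** n_qubits:,} amplitudes")
# 49 qubits -> 562,949,953,421,312 amplitudes, roughly the 500 trillion above
```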

Of course, the number of qubits isn’t the entire story. Error rates can make a larger number of qubits perform like fewer. Quantum computing is more statistical than conventional programming, so it is hard to draw parallels.

We’ve covered what quantum computing might mean for the future. If you want to experiment on a quantum computer yourself, IBM will let you play on a simulator and on real hardware. If nothing else, you might find the beginner’s guide informative.

Image credit: [Walden Kirsch]/Intel Corporation

Shor’s Algorithm In Five Atoms

If you want to factor a number, one way to do it is Shor’s algorithm, a quantum algorithm that finds the prime factors of integers. That’s interesting because prime factorization is a big deal in creating and breaking most modern encryption techniques.

Back in 2001, a group at IBM factored 15 (the smallest number the algorithm can factor) using a 7-qubit system based on nuclear magnetic resonance. Later, other groups duplicated the feat using photonic qubits. Typical implementations take 12 qubits. However, recent work at MIT and the University of Innsbruck does the same trick with just 5 atoms caught in an ion trap. The researchers believe their implementation will easily scale to larger numbers.
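If you are curious how the pieces fit together, here is a classical sketch of the algorithm's skeleton for 15 (my own illustration; the period finding is brute-forced here as a stand-in for the step the quantum hardware actually accelerates):

```python
# Structure of Shor's algorithm for N = 15 (my sketch; the period finding is
# brute-forced classically here, standing in for the quantum subroutine).
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r = 1 (mod N): the quantum computer's job."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_sketch(N, a):
    assert gcd(a, N) == 1              # otherwise gcd(a, N) already factors N
    r = find_period(a, N)              # a = 7, N = 15 gives r = 4
    if r % 2 or pow(a, r // 2, N) == N - 1:
        return None                    # unlucky choice of a; try another
    x = pow(a, r // 2, N)
    return gcd(x - 1, N), gcd(x + 1, N)

print(shor_sketch(15, 7))              # prints (3, 5)
```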

Each qubit is an atom, and laser pulses perform the logic operations. Removing an electron leaves each atom positively charged, so an electric field can hold the ions precisely in position, only microns apart.

We’ve covered quantum computing before. We’ve even talked about the effect of practical quantum computing on encryption. You might also want to read more about the algorithm involved.

Photo credit: Jose-Luis Olivares/MIT