Ion Trap Makes Programmable Quantum Computer

The Joint Quantum Institute recently published a paper detailing a quantum computer constructed from five qubits formed from trapped ions. The novel architecture allows the computer to accept programs for multiple algorithms.

Quantum computers make use of qubits, and trapped ions (ions confined with an electromagnetic field) are one way to create them. In this case, a linear radio-frequency trap and laser cooling hold five ytterbium ions with a separation of about 5 microns. To entangle the qubits, the device applies 50 to 100 laser pulses to individual ions or pairs of ions. The pulse shape determines the actual function performed, which is what makes the device programmable: the operation carried out depends on the sequence of laser pulses that drives it. Continue reading “Ion Trap Makes Programmable Quantum Computer”

Single Photon Source for Quantum Computing and Experimentation

One challenge to building optical computing devices and some quantum computers is finding a source of single photons. There are a lot of different techniques, but many of them aren’t very practical, requiring lots of space and cryogenic cooling. Recently, researchers at the Hebrew University of Jerusalem developed a scalable photon source on a semiconductor die.

Using nanocrystals of semiconductor material, the new technique emits single photons in a predictable direction. The nanocrystals are combined with circular nanoantennas made of metal and dielectric, produced with conventional fabrication technology. The nanoantennas are concentric circles resembling a bullseye and ensure that the photons travel in the correct direction with little or no angular deviation.

A single IC could contain many photon sources, and they operate at room temperature. We’ve talked about quantum tricks with photons before. Quantum mechanics is another popular topic.

Shor’s Algorithm In Five Atoms

If you want to factor a number, one way to do it is Shor’s algorithm, a quantum algorithm that finds the prime factors of integers. That’s interesting because prime factorization is central to creating, and breaking, most modern encryption techniques.

Back in 2001, a group at IBM factored 15 (the smallest number the algorithm can meaningfully factor) using a 7-qubit system based on nuclear magnetic resonance. Later, other groups duplicated the feat using photonic qubits. Typical implementations take 12 qubits. However, recent work at MIT and the University of Innsbruck does the same trick with just 5 atoms caught in an ion trap. The researchers believe their implementation will scale readily to larger numbers.
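The quantum speed-up in Shor’s algorithm lies entirely in finding the period of a^x mod N; everything before and after that step is classical number theory. Here’s a minimal sketch (our illustration, not the researchers’ code) that brute-forces the period classically, which is exactly the step the trapped-ion qubits accelerate:

```python
from math import gcd

def find_period(a, N):
    """Brute-force the period r of a^x mod N -- the step a quantum
    computer speeds up using the quantum Fourier transform."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Classical walk-through of Shor's post-processing for base a."""
    assert gcd(a, N) == 1, "a must be coprime to N"
    r = find_period(a, N)
    if r % 2 != 0 or pow(a, r // 2, N) == N - 1:
        return None  # unlucky base; pick another a and retry
    # The factors fall out of gcd computations on a^(r/2) +/- 1
    return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

print(shor_classical(15, 7))  # → (3, 5), via period r = 4
```

For N = 15 and base a = 7 the period is 4 (7, 4, 13, 1, repeating), and the gcd step recovers the factors 3 and 5. The brute-force loop is exponential in the size of N, which is precisely why a quantum period-finder matters.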

Each qubit is an atom, and laser pulses perform the logic operations. Removing an electron leaves each atom positively charged, so an electric field can hold the resulting ions precisely in position, only microns apart.

We’ve covered quantum computing before. We’ve even talked about the effect of practical quantum computing on encryption. You might also want to read more about the algorithm involved.

Photo credit: Jose-Luis Olivares/MIT

Shmoocon 2016: Computing In A Post Quantum World

There’s nothing more dangerous, so the cryptoheads say, than quantum computing. Instead of using the state of a transistor to hold the value of a bit as in traditional computers, quantum computers use qubits: quantum information such as the polarization of a photon. According to people who know nothing about quantum computers, they are the beginning of the end, the breaking of all cryptography, and the Rise of the Machines. Lucky for us, [Jean-Philippe Aumasson] actually knows a thing or two about quantum computers and was able to teach us a few things at his Shmoocon talk this weekend, “Crypto and Quantum and Post Quantum”.

This talk is the continuation of [Jean-Philippe]’s DEF CON 23 talk that covered the basics of quantum computing (PDF). In short, quantum computers are not fast; they’re just coprocessors for very, very specialized algorithms. Quantum computers do not prove P=NP, and cannot be used on NP-hard problems anyway. The only thing quantum computers have going for them is the ability to completely destroy public-key cryptography. Any form of cryptography that uses RSA, Diffie-Hellman, or elliptic curves is completely and totally broken. With quantum computers, we’re doomed. That’s okay, according to the DEF CON talk – true quantum computers may never be built.

The astute reader will question the claim that quantum computers may never be built. After all, D-Wave is selling quantum computers to Google, Lockheed, and NASA. But these are not true quantum computers. Even if they’re 100 million times faster than a PC, they’re only faster for one very specific algorithm. These machines cannot simulate a universal quantum computer. They cannot execute Shor’s algorithm, the algorithm that finds the prime factors of an integer. They are not scalable, they are not fault-tolerant, and they are not universal quantum computers.

As far as true quantum computers go, the largest ever manufactured contains only a handful of qubits. To crack RSA and the rest of public-key cryptography, millions of qubits are needed. Some algorithms require quantum RAM, which nobody knows how to build. Why, then, is quantum computing so scary? RSA, ECC, Diffie-Hellman, PGP, SSH, and Bitcoin would die overnight if large quantum computers existed. That’s a far scarier proposition than someone hijacking your self-driving car or changing the display on a smart, Internet-connected thermostat from Fahrenheit to Celsius.

What is the verdict on quantum computers? Not too great, if you ask [Jean-Philippe]. In his opinion, it will be 100 years until we have a quantum computer. Until then, crypto is safe, and the NSA isn’t going to break your codez if you use a long-enough key.

Quantum Computing Kills Encryption

Imagine a world where the most widely-used cryptographic methods turn out to be broken: quantum computers allow encrypted Internet data transactions to become readable by anyone who happened to be listening. No more HTTPS, no more PGP. It sounds a little bit sci-fi, but that’s exactly the scenario that cryptographers interested in post-quantum crypto are working to save us from. And although the (potential) threat of quantum computing to cryptography is already well-known, this summer has seen a flurry of activity in the field, so we felt it was time for a recap.

How Bad Is It?

If you take the development of serious quantum computing power as a given, all of the encryption methods based on the hardness of factoring large integers or of the discrete logarithm problem — most notably RSA, elliptic-curve cryptography, and Diffie-Hellman — are in trouble. Specifically, Shor’s algorithm, when run on a quantum computer, renders the previously difficult math problems that underlie these methods trivially easy, almost irrespective of chosen key length. That covers most currently used public-key crypto and the key exchange that’s used in negotiating an SSL connection. That is (or will be) bad news, as those are what’s used for nearly every important encrypted transaction that touches your daily life.
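To make the threat concrete, here is a toy sketch (hypothetical, artificially tiny primes, purely for illustration) of why factoring the modulus is game over for RSA: the private key is an immediate consequence of the factors, no matter how the key was generated.

```python
# Toy RSA keypair; real keys use ~2048-bit moduli that classical
# factoring cannot touch, but that Shor's algorithm on a large
# quantum computer could.
p, q = 61, 53            # the secret primes
n, e = p * q, 17         # the public key: n = 3233
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # the private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)  # anyone can encrypt with the public (n, e)

# An attacker who factors n rebuilds the private key outright:
p2, q2 = 61, 53          # the output of the (quantum) factoring step
d_attacker = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(cipher, d_attacker, n))  # → 42: plaintext recovered
```

The same collapse applies to Diffie-Hellman and elliptic curves via the discrete-logarithm variant of Shor’s algorithm; only the hard problem being bypassed differs.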

Continue reading “Quantum Computing Kills Encryption”

Quantum Mechanics in your Processor: Quantum Computing

Not long after [Hitler] took control of Germany, his party passed laws forbidding any persons of Jewish descent from holding academic positions in German Universities. This had the effect of running many of the world’s smartest people out of the country, including [Albert Einstein]. Einstein settled into his new home in Princeton, and began to seek out bright young mathematicians to work with, for he still had a bone to pick with [Niels Bohr] and his quantum theory. It wasn’t long until he ran into an American, [Nathan Rosen] and a Russian, [Boris Podolsky]. The trio would soon lay before the world a direct challenge that would strike at the very core of quantum theory’s definition of reality. And unlike the previous challenges, this one would not be so easily dismissed by [Bohr].

Need a bit of catching up? You can check out Complementarity as well as Tunneling and Transistors, but that is just some optional background for wrapping your head around Quantum Computing.

The EPR Argument

On May 4th, 1935, the New York Times published an article entitled “Einstein Attacks Quantum Theory”, which gave a non-technical summary of the [Einstein-Podolsky-Rosen] paper. We shall do something similar.

Continue reading “Quantum Mechanics in your Processor: Quantum Computing”

Quantum Mechanics in your Processor: Complementarity

Monday | 24 October 1927 | Brussels

While the official title of the 5th Solvay conference was “on Electrons and Photons”, it was abundantly clear amongst the guests that the presentations would center on the new theory of quantum mechanics. [Planck], [Einstein], [Bohr], [de Broglie], [Schrodinger], [Heisenberg] and many other giants of the time would be in attendance. Just a month earlier, [Niels Bohr] had revealed his idea of complementarity to fellow physicists at the Instituto Carducci, which lay just off the shores of Lake Como in Italy.

The theory suggested that subatomic particles and waves are actually two sides of a single ‘quantum’ coin. Whichever properties it would take on, be it wave or particle, would be dependent upon what the curious scientist was looking for. And asking what that “wave/particle” object is while not looking for it is meaningless. Not surprisingly, the theory was greeted with mixed reception by those who were there, but most were distracted by the bigwig who was not there – [Albert Einstein]. He couldn’t make it due to illness, but all were eager to hear his thoughts on [Bohr’s] somewhat radical theory. After all, it was he who introduced the particle nature of light in his 1905 paper on the photoelectric effect, revealing light could be thought of as particles called photons. [Bohr’s] theory reconciled [Einstein’s] photoelectric effect theory with the classical understanding of the wave nature of light. One would think he would be thrilled with it. [Einstein], however, would have no part of [Bohr’s] theory, and would spend the rest of his life trying to disprove it.

Complementarity – Wave, Particle, or Both?

[Niels Bohr] contemplates one of [Einstein’s] many challenges to quantum theory.
For more than a century it was thought that light was a wave. In 1801, [Thomas Young] had discovered interference patterns when shining a light through two very close slits, and interference is a well-known property of waves. Combined with [Maxwell’s] equations, which predicted the existence of electromagnetic radiation, this left little doubt in anyone’s mind that light was nothing more, or less, than a wave.

There was a very odd issue, however, that puzzled physicists during the late 19th century. When light shines upon a metallic surface, electrons are ejected from that surface. Increasing the intensity of the light did not translate to an increase in speed of the expelled electrons, as classical mechanics says it should. Increasing the frequency of the light did increase the speed.

The explanation of this phenomenon had to wait until 1900, when [Max Planck] realized that physical action could not be continuous, but must be a multiple of some small quantity. This quantity would lead to the “quantum of action”, now called [Planck’s] constant, and birthed quantum physics. He could not have known that this simple idea, in less than two decades, would lead to a change in our understanding of the nature of reality. It took [Einstein], however, only a few years to use [Planck’s] quantum of action to explain that mind-boggling behavior of electrons released from metal by light, defying classical law, with the incredibly complex equation:

E = hν

Where E is the energy of the light quantum, h is Planck’s constant, and ν is the frequency of the light. The most important item to consider here is the light quantum, later to be called a photon. It is treated as a particle. Now, if you’re not scratching your head in confusion right about now, you haven’t been paying attention. How can light be both a wave and a particle? Join me after the jump and we’ll travel further down this physics rabbit hole.
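The equation may look humble, but it’s easy to put numbers to it. A quick sketch (using green light at a hypothetical 560 THz as the example) shows just how tiny the quantum of action makes a single photon’s energy:

```python
# Photon energy E = h * nu for green light at roughly 560 THz.
h = 6.62607015e-34    # Planck's constant in J*s (exact since the 2019 SI redefinition)
nu = 5.6e14           # frequency in Hz
E = h * nu            # about 3.7e-19 J, or roughly 2.3 eV
print(f"E = {E:.3e} J ({E / 1.602176634e-19:.2f} eV)")
```

A few electron-volts per photon is exactly the energy scale of electrons bound in metals, which is why frequency, not intensity, decides whether electrons are ejected.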

Continue reading “Quantum Mechanics in your Processor: Complementarity”