Quantum computing isn’t limited to binary encoding of information; it can use the principles of quantum mechanics to store information in superposed states that, in effect, consider many possibilities at once. Quantum computers exist today, but the hardware and software are still in their infancy.
Quantum Computers Aren’t Limited to Binary
Quantum Computing Basics – What are Qubits?
Quantum computers, unlike classical computers, aren’t limited to binary bits (which are always 0 or 1); they use qubits (which can be 0 and 1 at the same time).
Qubits are translated into binary bits when “measured,” which removes them from their state of “quantum superposition” (existing as more than one probable outcome at once; being both 0 and 1).
Qubits can exist in a state of “quantum superposition” because quantum computing literally uses quantum particles, like the photon, to store information (information is typically stored in a property of the particle, such as its charge or its spin, i.e. its intrinsic angular momentum). [1][2][3]
In layman’s terms, very small particles exist in a state of uncertainty (1 and 0) until they are made to exist in a state of certainty (1 or 0). This may seem strange, but you’ll have to blame Heisenberg or Mother Nature for that one. This is simply the way quantum particles work (sort of like how Pi never ends, but is still very useful).
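To make that concrete, here is a minimal Python sketch (just classical math on a laptop, not real quantum hardware) of a single qubit as a pair of amplitudes, and of how “measuring” turns the superposition into a definite 0 or 1 according to the Born rule. The amplitude values are arbitrary numbers chosen for illustration.

```python
import random

# A qubit's state is described by two "amplitudes", one for 0 and one for 1.
# These example amplitudes are arbitrary; the squares of their magnitudes must sum to 1.
amp_0 = complex(3 / 5, 0)   # amplitude for the |0> outcome
amp_1 = complex(4 / 5, 0)   # amplitude for the |1> outcome

# Born rule: the probability of each measurement outcome is |amplitude|^2.
p0 = abs(amp_0) ** 2        # 0.36
p1 = abs(amp_1) ** 2        # 0.64
assert abs(p0 + p1 - 1.0) < 1e-9  # a valid state is normalized

# "Measuring" collapses the superposition into a single classical bit.
def measure():
    return 0 if random.random() < p0 else 1

print([measure() for _ in range(10)])  # e.g. [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
```

The important part is the last line: before measurement the qubit is described by both numbers at once; after measurement you only ever see one definite bit.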
Anyway, we cover this in detail below, but you should also watch the following video for a visual, as that makes understanding quantum computing much easier.
TIP: Check out IBM’s new online quantum cloud computer. After signing up you’ll be able to program actual qubits!
How Does a Quantum Computer Work? Veritasium explains quantum computing.
TIP: Quantum computing and quantum physics are both based on the same principle: things exist in a state of probability until they “quantize” to a discrete state. This is best understood by looking at Quantum Field Theory (although that gets a bit heady). Learning more about the different types of infinity would be helpful too. Get a better understanding, or see the standard model for how quantum particles work. Or try: Microsoft explains quantum computing so even you can understand.
TIP: Quantum computers are working in labs (for instance, NASA and Google’s D-Wave), but the technology is still in development. There is no guarantee we will have practical quantum computing any time soon. Furthermore, there is little agreement as to which methods are working or what progress has been made. Make sure to check the news and see the videos to know where we are at this moment in quantum computing.
Classic Computing Versus Quantum Computing
- In classical computing, we are always limited to things being “on” or “off” (0 or 1).
- In quantum computing things can be “on” and “off” at the same time (0 and 1).
In more technical terms, quantum computers use the principles of quantum superposition and quantum entanglement to perform computations by considering all variables at once (or being undecided between states).
A quick explainer of qubits and quantum computing. Keep in mind that new quantum computing news comes out every day; since this video was created, new breakthroughs have been made. NASA and Google even have a functioning quantum computer.
Quantum Processing Versus Binary Processing
In theory, quantum computers aren’t limited to sequential binary processing (considering one thing at a time in an ordered chain of events).
Rather, quantum computers can theoretically work on multiple processes at once (something that can only be approximated on binary computers with multi-core processors, each core handling one chain of processes at a time).
A discussion of quantum computing breakthroughs by SciShow.
What are Superposition and Entanglement?
In its simplest form, superposition for the purposes of quantum computing means that a piece of information remains undecided until it’s requested. Let’s say we want to program a light switch to be either on or off.
Qubit Versus Bit Light Switch Example
Think of “ON” as “1” and “OFF” as “0”.
- In classical binary computing, a single light switch is handled by one bit that must be set to one of two values at any given time: [ON] or [OFF] – (Expressed: 1 or 0)
- In quantum computing, a single switch is handled by one qubit that can hold both values at once: [ON and OFF] – (Expressed: a superposition of |0⟩ and |1⟩)
In other words, unlike a bit, a qubit can be in a state of quantum superposition, remaining undecided between “0” and “1” until a state is requested. When the quantum computer is asked to actually turn the light on or off, it takes the one quantum variable and translates the superposed state into a classical binary [ON] or [OFF] state for the switch.
Exponential Growth: Bits and Qubits
When we start increasing the number of light switches, the number of possible on/off configurations grows exponentially, and classical computing must track and check them one at a time. Quantum computing turns this around: every extra qubit doubles the number of states a single register can hold in superposition.
- For every extra bit you add, a classical register still holds only one configuration at a time, while the number of possible configurations doubles. Two switches have 4 possible states: [ON ON], [ON OFF], [OFF ON], or [OFF OFF]. Ten switches have 1,024 possible states: [0000000000], [0000000001], etc. The number of states a classical computer must step through grows exponentially.[2]
- For every extra qubit you add, the register can hold twice as many states in superposition. For example, with 3 qubits you get coefficients (amplitudes) for |000⟩, |001⟩, |010⟩, |011⟩, |100⟩, |101⟩, |110⟩, and |111⟩.[1] The number of states held at once grows exponentially with qubits, essentially the opposite of the limitation of bits (see the sketch below).
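Here is a minimal sketch of the counting argument above in plain Python (no quantum libraries, and the loop values are just for illustration): a classical register of n bits sits in exactly one of its 2^n configurations at a time, while describing n qubits takes 2^n amplitudes, one per configuration, all held at once.

```python
from itertools import product

def classical_register(n_bits):
    # A classical register holds exactly one configuration at a time...
    current = "0" * n_bits
    # ...even though 2**n_bits configurations are possible.
    possible = 2 ** n_bits
    return current, possible

def qubit_register_size(n_qubits):
    # Describing n qubits takes one amplitude (coefficient) per basis state:
    # |000...0> through |111...1>, i.e. 2**n_qubits amplitudes held at once.
    return 2 ** n_qubits

for n in (1, 2, 3, 10):
    _, states = classical_register(n)
    print(f"{n} bits/qubits -> {states} possible states; "
          f"a qubit register holds amplitudes for all {qubit_register_size(n)} at once")

# The 3-qubit basis states from the example above:
print(["|" + "".join(bits) + ">" for bits in product("01", repeat=3)])
# ['|000>', '|001>', '|010>', '|011>', '|100>', '|101>', '|110>', '|111>']
```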
What is Quantum Entanglement?
Entanglement describes a natural phenomenon in which two or more quantum particles (like electrons) become linked so that their states can no longer be described independently; measuring one instantly tells you something about the other. Entangled quantum particles can remain in a shared state of superposition until “observed” (measured), at which point they behave as if they were in definite, correlated states.
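As a rough numerical picture of that (a classical simulation with the textbook “Bell state” amplitudes, not real hardware), the sketch below shows two entangled qubits sharing one joint state; every time the pair is measured, the two results agree.

```python
import random
from math import sqrt

# Joint state of two entangled qubits (a Bell state): equal amplitudes for
# |00> and |11>, and zero amplitude for |01> and |10>.
amplitudes = {
    "00": 1 / sqrt(2),
    "01": 0.0,
    "10": 0.0,
    "11": 1 / sqrt(2),
}

def measure_pair():
    # Sample a joint outcome with probability |amplitude|^2 (Born rule).
    outcomes = list(amplitudes)
    weights = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=weights)[0]

samples = [measure_pair() for _ in range(1000)]
# Both qubits always agree: you only ever see "00" or "11", never "01" or "10".
print(set(samples))  # {'00', '11'}
```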
What is the Actual Technology Behind a Qubit?
In simple terms, one common type of qubit works by trapping a single electron, manipulating its spin, and measuring its behavior. An electron is a quantum particle that naturally lives in a state of superposition (and can be entangled with other particles) until measured. The natural behavior of a quantum particle like an electron is both what powers quantum computing and where the inspiration for creating quantum computing comes from.
NOTE: Photons and other quantum particles can be, and are, used for quantum computing. In fact, new research suggests the photon may be an even more useful quantum particle for quantum computing.
Why is Quantum Computing Important?
Theoretically, quantum computing can be used to solve complex calculations that are impractical with classical computing because of the time they would take.
Quantum computing could also revolutionize computer security. Right now, cryptographic schemes that rely on large numbers and complex algorithms are difficult to “crack” because of the time it would take. Quantum computing could theoretically crack some of them in a practical amount of time. Quantum computing also creates new opportunities for inventing new security technologies (even down to encoding a message on a single photon).[7]
Quantum computing can also be used for slightly less nerdy, but no less important, purposes like testing drugs and medications, modeling climate change scenarios, or generally simulating and predicting things with lots of variables that we can’t tackle today because of time constraints.
Perhaps most importantly, quantum computing matters because we are running out of ways to make transistors smaller. Our ability to build faster and faster computers will reach an end without new technological advances that address size, heat, and the amount of room it takes to store and process information.
Problems With Building Computers (Moore’s Law’s Problem)
The problem with building powerful quantum computers is not so much our understanding of the technology as the logistics of fabricating and cooling (or preventing the heating of) very small things. This can be seen through Moore’s law and the limitations we’ve run into with transistors.
In a 1965 paper, Gordon E. Moore, the co-founder of Intel and Fairchild Semiconductor, observed that the number of transistors in a dense integrated circuit had been doubling approximately every two years and theorized that it would continue to do so.[4] This turned out to be true (hence it being called a “law”) until we started “running out of room.”
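As a back-of-the-envelope illustration of that doubling rule, here is a tiny Python sketch; the starting year and transistor count are roughly those of Intel’s first microprocessor (the 4004), used here only as an example, and the model is deliberately simplistic.

```python
# Transistor count doubling roughly every two years (Moore's observation).
# Starting figures are roughly the Intel 4004 (1971), used only as an example.
start_year = 1971
start_transistors = 2_300

def projected_transistors(year, doubling_period_years=2):
    doublings = (year - start_year) / doubling_period_years
    return start_transistors * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,.0f}")
# Each decade multiplies the count by 2**5 = 32 under this simple model.
```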
A discussion of Moore’s Law by SciShow.
As of 2015, the smallest transistor demonstrated is 5 nm (five nanometers). Transistors smaller than 7 nm are predicted to experience quantum tunneling through their logic gates (i.e., electrons, being very small quantum particles, can slip through the barriers that define the “off” and “on” states when those barriers become too “thin”).
NOTE: Science is already presenting potential solutions to the above problems. It’s unlikely we have truly hit a sticking point. See the next section.
Size of transistors over time:[5]
10 µm – 1971
6 µm – 1974
3 µm – 1977
1.5 µm – 1982
1 µm – 1985
800 nm – 1989
600 nm – 1994
350 nm – 1995
250 nm – 1997
180 nm – 1999
130 nm – 2001
90 nm – 2004
65 nm – 2006
45 nm – 2008
32 nm – 2010
22 nm – 2012
14 nm – 2014
10 nm – 2016–2017
7 nm – 2017–2018
5 nm – 2020–2021
Can We Create Quantum Computers?
Versions of quantum computers started being produced in labs in the 2000s.[6] The trick with quantum computing is that it relies on sub-atomic phenomena, and that has made constructing a useful, working quantum processor difficult.
Our progress is still restricted by technology, but so far a successful two-qubit silicon logic gate has been created (a simple sketch of what such a gate does appears below), the 1,000-qubit barrier has been broken[6] (although many contest this), and work on building quantum computers that don’t overheat is in full swing. Scientists are moving beyond electrons and starting to look at how photons (light) can be used as a basis for the qubit, since light helps sidestep heating and cooling issues.
So, “we can technically create quantum computers,” but they aren’t useful for anything of note at the moment. This is a work in progress, and some doubt that quantum computers will ever replace binary ones (due to the way processing on large scales works).
In short, be excited, but not overexcited.
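For a sense of what a “two-qubit logic gate” like the silicon one mentioned above actually does mathematically, here is a minimal state-vector sketch using NumPy (a classical simulation, not the device itself). It applies a controlled-NOT (CNOT), a standard two-qubit gate, to a superposed input and produces an entangled output.

```python
import numpy as np

# Two-qubit state vectors are 4 amplitudes, ordered |00>, |01>, |10>, |11>.
# Start with the first qubit in an equal superposition and the second qubit at 0:
# (|00> + |10>) / sqrt(2)
state = np.array([1, 0, 1, 0], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second (target) qubit when the first (control) qubit is 1.
cnot = np.array([
    [1, 0, 0, 0],   # |00> -> |00>
    [0, 1, 0, 0],   # |01> -> |01>
    [0, 0, 0, 1],   # |10> -> |11>
    [0, 0, 1, 0],   # |11> -> |10>
], dtype=complex)

entangled = cnot @ state
print(entangled.round(3))
# [0.707+0.j 0.   +0.j 0.   +0.j 0.707+0.j]  -> (|00> + |11>) / sqrt(2), a Bell state
```

The output is the entangled Bell state described earlier, which is one reason a reliable two-qubit gate is treated as an essential building block.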
Michio Kaku: How to Program a Quantum Computer. More processing power, Moore quantum problems.
New Discoveries in Quantum Light Sources
As of November 2015, a research team from UTS Science has found a material that emits a single pulse of quantum light on demand at room temperature (unlike electron-based systems, which pose heating/cooling issues). The material, layered hexagonal boron nitride (boron and nitrogen atoms arranged in a honeycomb structure), is atomically thin and can emit quantized pulses of light. Using this material, it may be possible to use a single photon as a qubit to store information, theoretically storing a piece of information in a completely secure way (since an eavesdropper can’t observe the information without changing it).[7]
Google and NASA’s D-Wave X2 Computing System
In 2013, Google and NASA began work on their D-Wave quantum computing system, a version of the D-Wave “quantum annealing” processor first created in 2007. In early December 2015, the team announced that they had successfully tested a quantum annealing algorithm (an optimization technique that uses quantum effects to search among many possible outcomes) on the quantum computer. Tests showed that the quantum computer ran the algorithm faster than simulated programs on classical computers could (simulated annealing and a quantum Monte Carlo algorithm, both run on a classical computer).[8][9][10]
This suggests that the potential of quantum computing is what we thought: for certain problems with huge numbers of possible outcomes, it can search for answers faster than classical computing by exploiting the nature of quantum mechanics.
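For context, the classical baseline mentioned above, simulated annealing, is itself a fairly simple randomized search. The sketch below is a generic, stripped-down version in Python (the toy cost function and parameters are made up for illustration, and this is not the code Google actually benchmarked):

```python
import math
import random

def cost(bits):
    # Toy objective: count mismatches between neighboring bits (we want them all equal).
    return sum(b1 != b2 for b1, b2 in zip(bits, bits[1:]))

def simulated_annealing(n_bits=20, steps=5000, t_start=2.0, t_end=0.01):
    bits = [random.randint(0, 1) for _ in range(n_bits)]
    for step in range(steps):
        # Cool the "temperature" gradually from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        # Propose flipping one random bit.
        i = random.randrange(n_bits)
        candidate = bits.copy()
        candidate[i] ^= 1
        delta = cost(candidate) - cost(bits)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature drops (this is the "annealing").
        if delta <= 0 or random.random() < math.exp(-delta / t):
            bits = candidate
    return bits, cost(bits)

print(simulated_annealing())  # e.g. ([0, 0, ..., 0], 0) once it settles into a low-cost state
```

Quantum annealing attacks the same kind of cost landscape, but relies on quantum effects (like tunneling between configurations) rather than purely thermal, random jumps, which is exactly what the D-Wave tests were probing.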
A look at the D-Wave Lab.
Quantum Computing in 2016 and Beyond
Quantum computers are coming to life in the lab, but the technology isn’t there yet. We could find that quantum computers are out of our reach, or we could find them in our living room within 5 years along with other cutting-edge technology like cognitive computers (also a Google / Alphabet side-project).
Quantum computing is one of the most important technologies for mankind, so whatever you knew yesterday may be old news tomorrow. The concepts stay the same, but the limitations of technology change every day.
A documentary on quantum computing.
- “Quantum vs Classical Computation“. Thephys.may.ie. Retrieved Jan 14, 2016.
- “Quantum Computing Primer“. Dwavesys.com. Retrieved Jan 14, 2016.
- “Quantum Computing“. Wikipedia.org. Retrieved Jan 14, 2016.
- “Moore’s law“. Wikipedia.org. Retrieved Jan 14, 2016.
- “5 nanometer“. Wikipedia.org. Retrieved Jan 14, 2016.
- “Timeline of quantum computing“. Wikipedia.org. Retrieved Jan 14, 2016.
- “A step closer to quantum communications“. Uts.edu.au. Retrieved Jan 14, 2016.
- “Quantum Computing How D-Wave Systems Work“. Dwavesys.com. Retrieved Jan 14, 2016.
- “When can Quantum Annealing win?“. Googleresearch.blogspot.com. Retrieved Jan 14, 2016.
- “Google: We have proof that our quantum computer really works“. wap.engadget.com. Retrieved Jan 14, 2016.