For many of you this will probably bring back memories of a certain early 90s TV show (well, quantum leaping at least), but quantum physics as a theory has been around for a very long time (since the turn of the 20th century), and the theory of parallel universes may not be as fictional as you might think… I'm sure I'll be corrected on this, but here goes with a gross oversimplification of quantum theory as I understand it:
Particles obey certain physical laws until you get down to the atomic and subatomic levels; then they behave in a way that cannot be explained or predicted by classical physics.
Granted, this doesn't sound especially enlightening, but it is very interesting if you look into it, as there are many theories as to why this is (one of them being the theory of parallel universes). It's pretty fascinating (in my opinion anyway!) and there are a few things that are quite startling, especially the fact that these particles behave differently under direct observation than they do when only the results are observed… (check out the double slit experiment here, and for the mentally challenged like me, here's something worth seeing about it on youtube here – shock horror! yes, something of value on youtube!)
Is this how Ziggy works?
So what exactly does this have to do with computing, and how is it different to what is used now? Well, I found an interesting, fairly simplistic answer (which was what I needed) whilst looking around on the net, after reading that quantum computers had actually now been built.
“Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, operating with a (usually) 7-mode logic gate principle, though it is possible to exist with only three modes (which are AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time – that is, either 0 (off / false) or 1 (on / true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to be in either the 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit as to how quickly these devices can be made to switch state. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges that are presented.
The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we’ll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing.”
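If a picture helps, here's a toy classical simulation of the idea in that quote – just a sketch, not real quantum hardware. A qubit's state is a pair of complex amplitudes for 0 and 1, and the Hadamard gate (one standard way of doing the "turn 0 into a superposition of 0 and 1" trick the quote calls QO1) produces an equal superposition, so a measurement is a 50/50 coin flip until you look:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate: turns |0> into an equal
    superposition of |0> and |1> (the 'QO1'-style operation)."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Chance of measuring 0 or 1: the squared magnitude
    of each amplitude."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1 + 0j, 0 + 0j)           # qubit starts definitely in |0>
superposed = hadamard(zero)       # now an equal superposition of 0 and 1
print(probabilities(superposed))  # roughly (0.5, 0.5) - a fair coin until measured
```

Note the simulation only ever gives you probabilities; on a real device each measurement collapses the superposition to a definite 0 or 1, which is exactly the observation-changes-the-outcome weirdness mentioned earlier.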
If you are thinking “what on earth?!” after that, I don’t blame you 😉
What does this mean?
Qubits represent atoms, ions, photons or electrons, together with their respective control devices, working together to act as computer memory and a processor. Because a quantum computer can hold these multiple states simultaneously, it has the potential to be millions of times more powerful than today's most powerful supercomputers, which can only work in sequence. This increased processing power has important implications for cryptography: even the most secure of today's encryption would be child's play for a quantum computer to break, meaning we would need to replace it all with computations that are much tougher to crack.
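To get a feel for why holding all those states at once is such a big deal: a classical machine simulating qubits has to track one complex amplitude per possible state, and that number doubles with every qubit you add. A quick back-of-the-envelope sketch:

```python
def amplitudes_needed(n_qubits):
    """Number of complex amplitudes a classical simulator must
    track to represent n qubits: one per possible bit string."""
    return 2 ** n_qubits

for n in (1, 10, 50, 300):
    print(n, "qubits ->", amplitudes_needed(n), "amplitudes")
# By 300 qubits the count (about 2 x 10**90) already exceeds the
# estimated number of atoms in the observable universe (~10**80).
```

That exponential blow-up is the (very hand-wavy) intuition behind both the supercomputer comparison and the threat to encryption above.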
The Reg had an article about potentially the world's first commercial working quantum computer (including pictures, woo!) here