Quantum Computing's potential: impossible to ignore
What's the average time you need to find a certain item in an unsorted list of N = 1,000,000 items? About N/2 = 500,000 steps. That means on average you have to look at half of the items before finding the one you want.
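You can check this average with a few lines of Python (a toy sketch of my own; the trial count is an arbitrary choice):

```python
# Sanity check of the N/2 average: count the comparisons a plain linear
# search makes over many random targets.
import random

N = 1_000_000
items = list(range(N))

trials = 50
total = 0
for _ in range(trials):
    target = random.randrange(N)
    for steps, item in enumerate(items, start=1):  # linear scan
        if item == target:
            total += steps
            break

print(total / trials)  # hovers around N/2 = 500,000
```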
What's the minimum number of steps you need to solve an N-by-N system of linear equations? At least N steps, since there are N unknowns to write down (and standard methods like Gaussian elimination take about N^3 steps).
What's the minimum time you need to factor an integer of N bits? Roughly 2^(N/2) steps with naive trial division, and even the best known classical algorithms still take time that grows faster than any polynomial in N. Cryptographic security relies on this fact.
However, in quantum computing all these answers are wrong!
In quantum computing, to find an item in an unsorted list of 1,000,000 items you need only about Sqrt(1,000,000) = 1,000 steps, using Grover's algorithm! [1,4]
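Grover's algorithm is easy to simulate classically on a small list. Here is a minimal NumPy statevector sketch; the qubit count and marked index are my own illustrative choices:

```python
# Grover search [4] on a list of N = 2**10 = 1024 items, simulated
# classically by tracking all N amplitudes in a NumPy vector.
import numpy as np

n = 10                       # qubits
N = 2 ** n                   # list size
marked = 333                 # the item we are searching for

state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all indices

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # about (pi/4)*sqrt(N)
for _ in range(iterations):
    state[marked] *= -1                # oracle: flip the marked amplitude
    state = 2 * state.mean() - state   # diffusion: inversion about the mean

print(iterations)          # 25 steps, versus N/2 = 512 classically
print(state[marked] ** 2)  # probability of measuring 'marked' is close to 1
```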
In quantum computing, to solve a system of N linear equations you need only about Log(N) steps (given some conditions on the matrix)! [2]
In quantum computing, to factor an N-bit integer you need only a number of steps polynomial in N (Shor's algorithm, whose running time is polylogarithmic in the integer itself)! So if a quantum computer with enough bits (called qubits in quantum computers) is ever built, all of today's public-key encryption becomes totally useless!! Your bank account could be stolen in one day. [3,5,6]
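The number theory behind Shor's algorithm [5] fits in a few lines. In the sketch below (my own toy illustration), the order-finding step is done by brute force; replacing exactly that one step with quantum period finding is what makes the whole procedure polynomial:

```python
# Factoring M by order finding, the skeleton of Shor's algorithm [5].
import math
import random

def find_order(a, M):
    # order r of a mod M: the smallest r with a**r = 1 (mod M).
    # Brute force here (exponential); the quantum computer does this part fast.
    r, x = 1, a % M
    while x != 1:
        x = (x * a) % M
        r += 1
    return r

def shor_factor(M):
    while True:
        a = random.randrange(2, M)
        g = math.gcd(a, M)
        if g > 1:
            return g                 # lucky guess: a shares a factor with M
        r = find_order(a, M)
        if r % 2 == 0:
            g = math.gcd(pow(a, r // 2, M) - 1, M)
            if 1 < g < M:
                return g             # nontrivial factor found

print(shor_factor(15))   # prints 3 or 5
```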
Several quantum computers of 5 and 7 qubits have been built to prove it is possible [7]. However, they are not yet practical because they don't have enough qubits.
Some quantum cryptography networks have already been deployed [8]. Ordinary cryptographic systems promise, for example, that your encrypted text will not be decrypted in less than a million years. These quantum cryptography methods, however, are unbreakable no matter how much time is given (personal opinion of the author, which may be wrong).
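The networks in [8] are based on quantum key distribution. The classic protocol is BB84; here is a minimal classical simulation of its basis-matching idea, with no eavesdropper (my own toy sketch, not the deployed systems' actual code):

```python
# BB84 in miniature: Alice sends random bits in random bases, Bob measures
# in random bases, and they keep only the positions where the bases match.
import random

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Same basis -> Bob reads Alice's bit; wrong basis -> a random bit.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Bases are compared publicly; the matching positions form the shared key.
key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print(key)   # about n/2 secret bits
```

The security comes from physics: measuring a qubit in the wrong basis disturbs it, so any eavesdropper shows up as errors when Alice and Bob compare a sample of their key.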
Even Google is considering using quantum search algorithms and has already bought a quantum computer [1].
Quantum computers are becoming harder and harder to ignore. One day we will wake up to find a complete revolution in computing. For certain problems they are exponentially faster than ordinary computers.
For a good, easy (and more complete) starting point, refer to [9]. However, I am going to write a small introduction here.
A quantum bit (qubit) is the building block of quantum computers, just as the ordinary bit is the basic block of ordinary computers. Although a qubit cannot deliver or carry more than 1 bit of data in normal cases[11] (1), in the intermediate calculation steps it can carry zero and one at the same time.
Basically, this means that using an array of n qubits you can deliver or carry any number between 0 and 2^n - 1, just like ordinary computers, but it also means that in the intermediate steps your quantum register can carry ALL these numbers at the same time, and you can do calculations on ALL of them simultaneously. That's the concept used in all quantum algorithms.
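A small NumPy model makes this concrete: an n-qubit register is a vector of 2^n amplitudes, and one Hadamard gate per qubit turns |00...0> into an equal superposition of all 2^n numbers (an illustrative sketch of my own):

```python
# An n-qubit register as a vector of 2**n amplitudes. Hadamards on every
# qubit put the register into a superposition of all numbers 0 .. 2**n - 1.
import numpy as np

n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # the Hadamard gate

state = np.zeros(2 ** n)
state[0] = 1.0                                 # register starts as |000>

op = H
for _ in range(n - 1):
    op = np.kron(op, H)                        # H on each of the n qubits
state = op @ state

print(state)   # 8 equal amplitudes: all numbers 0..7 present at once
```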
For more details I refer the interested reader to the references mentioned below.
--------FOOTNOTES------
(1) Other cases, like superdense coding as in [10], use more advanced techniques and tricks.
-------REFERENCES------
[1] http://www.popsci.com/technology/article/2009-12/google-algorithm-uses-quantum-computing-sort-images-faster-ever
[2] Davide Castelvecchi, _Warp-Speed Algebra_, Scientific American January 2010 Issue, Pages 22-23.
[3] http://spectrum.ieee.org/computing/networks/mother-of-all-quantum-networks-unveiled-in-vienna
[4] http://en.wikipedia.org/wiki/Grover's_algorithm
[5] http://en.wikipedia.org/wiki/Shor's_algorithm
[6] http://alumni.imsa.edu/~matth/quant/299/paper/node19.html
[7] http://en.wikipedia.org/wiki/Timeline_of_quantum_computing
[8] http://spectrum.ieee.org/computing/networks/mother-of-all-quantum-networks-unveiled-in-vienna
[9] http://alumni.imsa.edu/~matth/quant/299/paper/paper.html
[10] http://en.wikipedia.org/wiki/Superdense_coding
[11] http://en.wikipedia.org/wiki/Quantum_information_theory; last modified on 13 November 2009 at 06:53; Quote: "However, despite this, the amount of information that can be retrieved in a single qubit is equal to one bit. It is in the processing of information (quantum computation) that a difference occurs"