Classical computers, which we use on a daily basis, are built upon the concept of digital logic and bits. A bit is simply an idea, or an object, which can take on one of two distinct values, typically labeled 0 or 1. In computers this concept is usually embodied by transistors, which can be charged (1) or uncharged (0). For argument's sake, classical coins are also perfectly good bits. A computer encodes information in a series of bits, and performs operations on them using circuits called logic gates. A logic gate simply applies a given rule to its input bits. For example, an OR gate takes two bits as input and outputs a single value: if either of the inputs is 1, the gate returns 1; otherwise, it returns 0. Once the operations are finished, the output information can be decoded from the bits.
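The OR gate described above can be sketched in a few lines of plain Python. This is just a toy illustration of the idea of a logic gate; the function names are my own, not any standard library.

```python
# Toy classical logic gates: each applies a fixed rule to input bits.

def OR(a, b):
    """Return 1 if either input bit is 1, else 0."""
    return 1 if (a == 1 or b == 1) else 0

def AND(a, b):
    """Return 1 only if both input bits are 1."""
    return 1 if (a == 1 and b == 1) else 0

# Truth table of the OR gate from the text:
for a in (0, 1):
    for b in (0, 1):
        print(f"OR({a}, {b}) = {OR(a, b)}")
```

Chaining many such gates together is, conceptually, all a classical circuit does.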
Engineers can design circuits which perform addition, subtraction, multiplication… almost any operation that comes to mind, as long as the input and output information can be encoded in bits. From these ideas modern computers were developed, leading to revolutions in both the consumer market and the scientific world: in the latter, because computers can perform calculations so fast that rooms full of people would fall centuries behind in a race. For some applications, however, even computers are not fast enough. Some tasks are so laborious that computers could not complete them even given the entire age of the universe.
Paths set the speed
The most important characteristic of a computer is how fast it can perform calculations (the clock speed of our domestic PCs is a measure of this). This is determined by two main factors. One is the processor, which determines how many operations on bits can be carried out in a given time interval. The other is the nature of the calculation: how many operations on bits it takes to complete. This is why optimized algorithms are key: you want to complete a given task in as few steps as possible. The problem is that even the most sophisticated algorithms for some tasks, like integer factoring, require an enormous number of operations. These are the kinds of tasks that could take billions of years to complete, even on the best computers.
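The cost of a task in "number of operations" can be made concrete with a toy factoring sketch. The function below is my own illustration using naive trial division, not an optimized algorithm; the point is only that the step count grows with the size of the input, and for the hundreds-of-digits numbers used in cryptography even the best classical algorithms remain infeasible.

```python
# Toy count of how many division attempts naive integer factoring takes.

def trial_division_steps(n):
    """Factor n by trial division; return (prime factors, divisors tried)."""
    factors, steps, d = [], 0, 2
    while d * d <= n:
        steps += 1
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors, steps

print(trial_division_steps(15))       # small number, very few steps
print(trial_division_steps(65537))    # a prime: every candidate is tried
```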
The freedom of qubits
This inconvenience largely stems from the limitations of bits, which are forced to be in either a 0 or a 1 state. This restricts what you can do with these units of information. Enter the quantum computer. Quantum systems can exist in a superposition of states. The quantum counterpart of the bit, the qubit, can take two values as well, |0> and |1> (this is bra-ket notation, introduced by the great Paul Dirac). But it can also be in any state in between, like the quantum coin: for example, 50% of |0> and 50% of |1>. If we have two qubits, again as in the situation with two quantum coins, we have four special states, |00>, |01>, |10> and |11>, and combinations of all of the above. As in the case of classical bits, scientists and engineers have designed quantum gates, which perform operations on qubits, and circuits can be built which carry out operations on collections of qubits. More interestingly, qubits can also be entangled, leading to mysterious connections between these objects.
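These states can be simulated on a classical machine (for a handful of qubits) by storing a vector of complex amplitudes; the probability of each measurement outcome is the squared magnitude of its amplitude, so the "50% of |0>, 50% of |1>" state corresponds to amplitudes of 1/√2. A minimal sketch, assuming numpy is available:

```python
import numpy as np

# Qubit states as vectors of complex amplitudes.
ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Equal superposition: 50% chance of measuring 0, 50% of measuring 1.
plus = (ket0 + ket1) / np.sqrt(2)
print(np.abs(plus) ** 2)                 # measurement probabilities

# Two qubits live in a 4-dimensional space spanned by
# |00>, |01>, |10>, |11>, built with the tensor (Kronecker) product:
ket01 = np.kron(ket0, ket1)              # the state |01>
print(ket01)
```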
It isn’t at all obvious, but quantum computers, with their qubits and quantum gates, are ideally suited to some of the laborious problems mentioned above. Qubits have more freedom, thanks to superposition, and can be linked through entanglement. As a consequence, quantum logic gates can perform a wider variety of operations on qubits and, more importantly, produce a wider range of outputs. This provides algorithm designers with alternative paths to the desired results, paths which can turn out to be vastly more efficient. The difference is so large that, with enough qubits, operations that would take billions of years on classical computers can take days or hours on quantum devices.
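Superposition and entanglement can both be seen in the textbook two-qubit circuit: a Hadamard gate puts the first qubit into superposition, and a CNOT gate then entangles it with the second, producing a Bell state in which the two qubits are always measured with the same value. The gate matrices below are standard; the simulation style is the same amplitude-vector sketch as above.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2, dtype=complex)                          # identity (do nothing)
CNOT = np.array([[1, 0, 0, 0],                         # flips qubit 2
                 [0, 1, 0, 0],                         # when qubit 1 is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.array([1, 0, 0, 0], dtype=complex)          # start in |00>
bell = CNOT @ np.kron(H, I2) @ ket00                   # (|00> + |11>)/sqrt(2)
print(np.round(bell.real, 3))
```

Measuring this state gives 00 half of the time and 11 the other half, but never 01 or 10: the outcomes of the two qubits are perfectly correlated, however far apart they are.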
The future and quantum computing
Quantum computing is still in its infancy, but some devices already exist which can support tens of qubits. A substantial research effort is currently dedicated to improving these capacities, and sooner rather than later quantum computers will take a leap forward. Recently IBM announced its 50-qubit quantum computer, an amazing feat of engineering and science. When quantum computers reach the order of a hundred qubits, humanity will be equipped with unprecedented computing power. This sounds amazing, and it is. However, it is a double-edged sword.
Many cybersecurity techniques rely on the inability of classical computers to perform certain tasks in a feasible time, which makes trying to break them impractical. These classically insurmountable tasks include some that can be made vastly more efficient on quantum computers, Shor's factoring algorithm being the best-known example. Cryptography is truly under threat from quantum computers, and with it modern-day tools like Internet banking, credit card transactions, e-mail, chat services and more. New encryption techniques and solutions are needed to ensure the continued survival of the Internet infrastructure that makes our lives easier. Just as an antidote to snake venom is often created from the venom itself, the solution to quantum hacking lies in the origin of the problem: quantum mechanics.
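The core trick of Shor's algorithm can be sketched classically. Factoring N reduces to finding the period r of f(x) = aˣ mod N; the quantum computer's role is to find r efficiently, after which the factors follow from ordinary greatest-common-divisor arithmetic. In the sketch below a brute-force classical search stands in for the quantum period-finding step, so it only works for tiny numbers; the function names are my own.

```python
from math import gcd

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force.
    (This is the step a quantum computer performs efficiently.)"""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Recover a nontrivial factor of n from the order of a mod n,
    or return None if this choice of a happens not to work."""
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky guess: a already shares a factor
    r = find_order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, n)

print(shor_classical(15, 7))      # prints a nontrivial factor of 15
```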
This science article is based mainly on two important references, which deserve special mention:
Claude Cohen-Tannoudji, Bernard Diu, and Franck Laloë, Quantum Mechanics, Vol. I & II, Wiley, New York, 1991.
Michael A. Nielsen and Isaac L. Chuang, Quantum Computation and Quantum Information: 10th Anniversary Edition, Cambridge University Press, 2010.