US ORNL laser test
Readers who follow the tech press may be familiar with the concept of quantum computing. Conventional computers use binary bits: on/off, yes/no, represented by 0 or 1. A quantum bit, or qubit, can be 1, or 0… or both at once. A 3-bit binary number such as 111 (decimal 7) represents exactly one value out of 8 possibilities, but 3 qubits can hold all 8 possibilities (0-7) simultaneously, which means you can, in principle, do calculations on all of them at once. The more qubits used, the more computation: 32 qubits theoretically gets you 2 to the 32nd power states (about 4.3 billion) at once – much more power than conventional computing, and it keeps rising exponentially with each qubit added.
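The arithmetic above is easy to verify. Here is a quick sketch in Python – ordinary classical arithmetic, not a quantum simulation – checking the binary value of 111 and the exponential growth of the state space:

```python
# Classical arithmetic only; no quantum simulation involved.

# A 3-bit binary number holds exactly one value: 111 in binary is 7.
assert int("111", 2) == 7

# An n-qubit register spans 2**n basis states simultaneously.
for n in (3, 32):
    print(f"{n} qubits -> {2 ** n:,} simultaneous states")
# 32 qubits: 2**32 = 4,294,967,296, the "about 4.3 billion" figure above.
```

Each additional qubit doubles the number of simultaneous states, which is where the exponential advantage claimed for quantum hardware comes from.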
It’s worth noting that quantum computing has limits, and there are classes of problems for which it offers no advantage. These limits are not fully mapped out yet, but they have been shown to exist at the theoretical level. So far, all we can say is that certain kinds of problems will be solved much, much more quickly. The uses of such a system for searching large bodies of information, cracking codes, creating codes, or running simulations that reach down to the quantum level (as a number of modern physical and medical science applications do) are clear. Quantum cryptography draws an additional benefit from quantum principles: eavesdropping is not only incredibly difficult, it creates noticeable interference.
Various American agencies continue to be interested in the field, which has also begun finding commercial applications.