1999: Quantum Computing

Applying the principles of quantum physics to computing could do for computation what the assembly line did for manufacturing. Traditional computing is based on bits, each representing just one of two possibilities, such as 0 or 1, or yes or no. Quantum physics has demonstrated that a subatomic particle can exist in multiple states simultaneously. The goal of quantum computing is to incorporate this array of possibilities into computing.

Quantum computing's most basic unit is the qubit. Unlike a bit, which represents only a single value, a qubit can represent a superposition of values that can be processed simultaneously, with each possible outcome weighted by a probability. A working quantum computer would dramatically increase the speed of information processing and could be a catalyst for a wave of breakthroughs in a wide range of fields.
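The idea of a qubit carrying weighted possibilities can be sketched in a few lines of code. This is a minimal illustration, not real quantum hardware: it assumes the standard textbook model in which a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math

def probabilities(alpha, beta):
    """Given a qubit's two amplitudes, return the chance of
    measuring 0 and the chance of measuring 1."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    # A valid qubit state is normalized: the probabilities sum to 1.
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalized"
    return p0, p1

# An equal superposition: unlike a classical bit, which is
# definitely 0 or definitely 1, this state yields either
# outcome with probability 1/2.
alpha = beta = 1 / math.sqrt(2)
p0, p1 = probabilities(alpha, beta)
print(p0, p1)  # each is approximately 0.5
```

A classical bit corresponds to the special cases (alpha, beta) = (1, 0) or (0, 1); everything in between is what the article's "array of possibilities" refers to.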