Inevitable ascension or a distant dream?

The smaller computer parts get, the more efficient they become – this has been the rule of thumb when it comes to computers and electronics in general.

But these parts have now shrunk to a point where they cannot get any smaller without losing the very properties that machines like modern computers are built on, and that limit has become a barrier to technological progress.

Computing in general is approaching that physical barrier, as processors, transistors, and other chip components shrink toward the size of an atom.

Modern-day electronics have silicon-based transistors as small as 10 nm, which is hundreds of times smaller than a red blood cell in the human body (roughly 7 micrometers across).

Any smaller than that, and transistors stop obeying the laws of classical physics: electrons begin to tunnel straight through barriers that are supposed to block them, and the reliable on/off behavior that modern computers are built upon breaks down. This is where quantum mechanics steps in.

For the uninitiated, quantum mechanics is the study of matter and energy at the smallest scales, involving subatomic particles like electrons, neutrons, and protons. Unlike the physical objects around us, particles at this scale behave very differently.

While bits, or binary digits, are the building blocks of classical computing, quantum computing uses quantum bits, or qubits, for calculation. A bit in classical computing can be either 0 or 1, basically an ‘on’ or ‘off’ switch that tells a transistor to either let electrons pass or block them. A qubit, on the other hand, can be in a combination of 0 and 1 at once.

Imagine a glass of lemonade where the lemon juice is 1 and the water is 0. The glass of lemonade is a mixture of both lemon juice and water, and until the mixture is analyzed in a lab, there is no way to say what proportions they are in.

Qubits are like that. Both 1 and 0 exist in a qubit in some proportion, and as with the lab test, only when the qubit is observed or measured does it collapse into a definite state of either 1 or 0, giving us a concrete result. This ability to hold both values at once is called Quantum Superposition.
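The proportion idea above can be made concrete with a toy sketch (ordinary Python, not a real quantum simulator): a qubit's state is a pair of amplitudes for 0 and 1, and measurement picks an outcome with probability equal to the squared amplitude.

```python
import random

# A qubit's state can be written as two amplitudes (a, b) for the
# outcomes 0 and 1, with |a|^2 + |b|^2 = 1. Measuring the qubit
# yields 0 with probability |a|^2 and 1 with probability |b|^2.
def measure(a, b, trials=100_000):
    """Return the observed frequency of outcome 0 over many trials."""
    p0 = abs(a) ** 2
    return sum(1 for _ in range(trials) if random.random() < p0) / trials

# An equal superposition: a = b = 1/sqrt(2), so 0 and 1 each come up ~50%.
amp = 2 ** -0.5
print(round(measure(amp, amp), 2))  # roughly 0.5
```

Each individual measurement still returns a plain 0 or 1, just as the article describes: the "lemonade" only reveals its proportions statistically, over many measurements.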

Apart from being in this uncertain state, qubits can also be entangled with one another. When two qubits are entangled, their states become mathematically linked: upon measurement, when one qubit collapses into a state of 1 or 0, the outcome for its entangled partner is determined as well. This property is known as Quantum Entanglement. Due to this entanglement, measuring one qubit can tell us what state its partner qubit is in.
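The correlation described above can be sketched in plain Python. This is only an analogy for the simplest entangled pair (one where both qubits always agree), not a simulation of quantum mechanics:

```python
import random

# Toy model of a maximally entangled pair where the two qubits
# always agree: each measurement yields 0 or 1 at random, but
# both qubits report the same value.
def measure_entangled_pair():
    outcome = random.choice([0, 1])  # 50/50 chance, shared by both
    return outcome, outcome          # partner's result is determined

results = [measure_entangled_pair() for _ in range(10_000)]

# Every pair is (0, 0) or (1, 1) -- knowing one qubit's result
# immediately tells you the other's.
assert all(q1 == q2 for q1, q2 in results)
```

The individual outcomes are still random; what entanglement fixes is the relationship between them.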

Quantum computers are built based on these two fundamental principles of quantum mechanics: superposition and entanglement.

In the early 1980s, Nobel Prize-winning American physicist Richard Feynman realized that classical computers cannot efficiently handle intricate simulations, especially simulations of quantum systems. He proposed that the two principles of quantum mechanics could be harnessed to build a much better and more efficient computing system.

In 1986, Feynman introduced an early version of quantum circuit notation, and building on such foundations, Peter Shor developed a landmark quantum algorithm in 1994. Later, in 1998, Isaac Chuang, Neil Gershenfeld, and Mark Kubinec demonstrated the world’s first known working quantum computer, with only two qubits. Even though it was a very early rendition of a primitive computing device, it was quite a leap in the advancement of this nascent technology.

Quantum computers are computing devices that operate by controlling the behavior of particles on the subatomic scale. Their advantage does not come from having smaller parts, but from superposition and entanglement, which allow them to explore many possibilities at once; for certain problems, this makes them exponentially faster than classical computers.

However, unlike their portrayal in the sci-fi genre, quantum computers are not an upgrade to the classical computers we have in our homes. That’s because they work very differently from the computers we have now. For certain complex computations, they can even vastly outperform the supercomputers that tech companies like Google, IBM, and Microsoft use for their R&D.

Comparing classical computers and supercomputers with quantum computers is like comparing bicycles to motorcycles. Classical computer upgrades usually mean multiplying capacity or efficiency: 1 GB of RAM used to be enough for a PC a decade ago, while today 2 GB is the bare minimum, essentially twice the same kind of memory bundled together.

But no matter how many bicycles are bundled together, they cannot become a motorcycle, because a motorcycle works on a fundamentally different principle. The same goes for quantum computers: they are fundamentally different from traditional computers, not just more of them.

That is why physicists and researchers behind this technology insist that quantum computers are not an upgrade from supercomputers but rather a completely different class of machines that will change the course of computation in the future.

These computing devices are so advanced that they can solve, in a fraction of the time and energy, problems that would take even modern supercomputers hours. A simple example is how efficient they are at database search.

For example, if there is a database with 1 trillion names and a search is performed, classical computers and supercomputers must check each and every name in the database against the query, which is one trillion operations for just a single search.

On the other hand, using the properties of qubits, a quantum computer running Grover’s search algorithm can perform the same operation in roughly the square root of that many steps. For that same search over 1 trillion names, a quantum computer would only need about 1 million operations, which is a million times fewer than what classical computers or supercomputers would take for the same result.
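The arithmetic behind that comparison is easy to check: a classical linear search over N unsorted records takes on the order of N checks, while Grover-style search takes on the order of the square root of N.

```python
import math

# Classical search over N unsorted records needs ~N checks;
# Grover-style quantum search needs on the order of sqrt(N).
N = 10 ** 12                   # one trillion names
classical_steps = N
quantum_steps = math.isqrt(N)  # integer square root: one million

print(quantum_steps)                     # prints 1000000
print(classical_steps // quantum_steps)  # prints 1000000
```

The square root of a trillion (10^12) is a million (10^6), which is where the article's "million times fewer operations" figure comes from.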

For certain classes of problems, quantum computers would be able to do what supercomputers do with a fraction of the resources. However, progress in this technology is plodding. Even though in recent years companies like IBM, Google, and Microsoft have invested heavily in developing quantum computing tools, we are not even close to a full-fledged machine for commercial or household use.

News of prototypes from several Chinese and American researchers breaks every few years. Still, the closest we have come was in October 2019, when Google AI, in partnership with NASA, claimed to have performed a computation on a quantum processor that would be practically impossible on any classical computer or supercomputer. But even that claim has been disputed by many.

Needless to say, commercial and household use of quantum computers is a dream for the distant future, especially since harnessing the quantum properties of subatomic particles is only possible in a tightly controlled environment, unlike the classical computer components we use. However, in a decade or two, primitive quantum computing tools might contribute to various research and simulation efforts, giving us more accurate insight into atoms and molecular structure.

This level of intricate insight and powerful computation will help the medicine and nutrition industries understand molecules and materials better. Any industry or sector that relies on research and simulation would benefit greatly from this hyper-efficient computing technology, including space exploration, manufacturing, engineering, molecular analysis, cryptography, and chemical engineering.

Cybersecurity and encryption form another sector that quantum computing will revolutionise. A large enough quantum computer running Shor’s algorithm could break much of today’s public-key encryption, while quantum techniques that exploit the uncertainty of qubits, such as quantum key distribution, promise communication channels that would be close to impossible to eavesdrop on undetected.
