Beyond the limits of Classical Computing – Photonick

By CoperNick

Valerio Pagliarino

 

March 13, 2019


 

Looking at the development of computers over the past 30 years, it is impressive how fast their performance has grown, allowing us to run operations on our smartphones that would previously have required huge computer clusters: the three-dimensional rendering engines employed in gaming apps, for example. However, if we look more closely at the devices engineered in the last few years, we notice smaller and smaller differences from previous models, both in new functions and in hardware computing power.

What is happening? Is development in computing slowing down?

Moore’s law, formulated in 1965, predicted that the number of transistors in new microchips would double approximately every two years. This observation held for many years, but we are now approaching a fundamental physical limit to the miniaturization of transistors.

 

Transistor count growth according to Moore’s law.

License: Creative Commons Attribution-Share Alike 4.0 International

 

The first aspect to consider in order to understand this physical limit is the energy density reached in modern Central Processing Units (CPUs). Even the most complex computation in a digital integrated circuit is run by decomposing it into a set of basic logic operations such as “AND” and “OR”. To execute these simple logic operations, CPUs contain enormous numbers of transistors: electronic components that act as “light switches”, allowing or blocking the flow of electric current through a specific path. Each transistor therefore adopts one of two possible states, 1 or 0. With a sufficient number of transistors it is possible to compute anything, just as it is possible to transform every mathematical operation into a composition of Boolean sums and multiplications. The problem is that transistors are not ideal switches: they have leakage currents as well as an internal resistance that dissipates energy through the Joule effect. Every CPU is equipped with a heat dissipation system (the computer fan), but if energy density were still increasing at the rate it did from 1990 to 2000, the CPUs in our laptops would be emitting as much heat as a nuclear reactor!
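To make the idea of “computing anything from basic logic operations” concrete, here is a minimal Python sketch (ours, not from the article): a one-bit “half adder” that performs a tiny piece of arithmetic using only AND, OR and NOT, the same primitives that transistors implement in hardware.

```python
# A half adder built purely from AND/OR/NOT: arithmetic reduced to logic.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # XOR expressed as a composition of the three primitives above
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining enough of these blocks together yields full adders, multipliers and, ultimately, every operation a CPU performs.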

While energy density constitutes a challenge to transistor miniaturization, the main issue is the miniaturization itself. If we consider transistor lithography nodes (down to 10 nm), it is clear that we are approaching scales where quantum mechanical phenomena begin to have a significant effect: under certain conditions, electrons can flow through a “closed” transistor because of quantum tunnelling.
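To get a feeling for why this matters at the 10 nm scale, here is a rough numerical sketch (our own, with illustrative numbers: a 1 eV effective barrier and the standard rectangular-barrier approximation T ≈ e^(−2κd), not data from the article) of how quickly the tunnelling probability grows as a barrier gets thinner:

```python
# Rough, illustrative estimate of quantum tunnelling through a thin barrier.
import math

hbar = 1.0545718e-34    # reduced Planck constant, J*s
m_e  = 9.1093837e-31    # electron mass, kg
eV   = 1.602176634e-19  # 1 electronvolt in joules

barrier = 1.0 * eV                           # assumed barrier height above the electron energy
kappa = math.sqrt(2 * m_e * barrier) / hbar  # decay constant inside the barrier, 1/m

for d_nm in (5.0, 2.0, 1.0, 0.5):
    T = math.exp(-2 * kappa * d_nm * 1e-9)   # T ~ exp(-2*kappa*d)
    print(f"barrier width {d_nm:>4} nm -> tunnelling probability ~ {T:.1e}")
```

The probability is utterly negligible at 5 nm but rises by many orders of magnitude as the barrier shrinks toward atomic scales, which is exactly where transistor features are heading.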

 

If, on the one hand, quantum effects constitute a limit for classical computing, their study in recent years has led to the birth of a revolutionary computing technology: quantum computing. Quantum computing should not be considered a future replacement for classical computing, but rather a complementary architecture that could allow us to efficiently run particular algorithms that are currently intractable, opening new scenarios in various research fields.

Quantum mechanics is an extremely complex field of study whose understanding requires specific mathematical tools, so we will only touch upon some interesting concepts here, simplifying and approximating many aspects.

 

Let’s start with the principle of “superposition”. In a classical two-state system (like the digital transistors mentioned above, which work on bits) we can always definitely distinguish between “0” and “1”. This is not true in a quantum system (which works on qubits): until a measurement is performed on it, it can be in state “0”, state “1”, or a superposition of the two. But if we try to read its state, the wave function that describes the system collapses and we read either “0” or “1”. To picture this, we can think of a common nuclear phenomenon: a gamma ray that interacts and produces a pair of particles, an electron and a positron. Until we measure which one is which, the pair is in a “superposition” of the two assignments; when we check the nature of one particle, it turns out to be either an electron or a positron, and we thereby immediately know the identity of the other. This configuration of “paired” systems is called entanglement.
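A short NumPy sketch (our illustration, not code from the article) can make this concrete: the states below are complex vectors, and “measuring” means sampling an outcome with probability equal to the squared amplitude.

```python
# Superposition and entanglement simulated with complex vectors.
import numpy as np

rng = np.random.default_rng(0)

# Single qubit in equal superposition: (|0> + |1>) / sqrt(2)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(plus) ** 2
print("superposed qubit, 10 measurements:",
      rng.choice([0, 1], size=10, p=probs))   # random mix of 0s and 1s

# Entangled pair (Bell state): (|00> + |11>) / sqrt(2)
# Basis order: |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
joint_probs = np.abs(bell) ** 2
print("entangled pair, 10 measurements: ",
      rng.choice(["00", "01", "10", "11"], size=10, p=joint_probs))
# Only "00" and "11" ever appear: reading one qubit fixes the other,
# like identifying one particle of the electron-positron pair.
```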

 

Given superposition, a qubit cannot be represented by a scalar value; instead, it is represented by a vector of complex coefficients that express the probability amplitudes of the qubit collapsing into each state. The Bloch sphere (see below) is what we obtain if we represent this vector in a spherical coordinate system. Quantum interactions between qubits appear on the Bloch sphere as geometric transformations that change the qubit’s probability of collapsing into different states. It is therefore possible to run logic operations using qubits: from a mathematical point of view, qubit operations consist of applying complex matrices to the qubits’ complex vectors.
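In code, this is just linear algebra. The following sketch (our notation, not from the article) applies the Hadamard gate as a 2×2 matrix to the state |0> and recovers the Bloch-sphere angles from the resulting amplitudes:

```python
# A qubit as a 2-component complex vector; a gate as a 2x2 unitary matrix.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                # Pauli-X ("quantum NOT")

state = H @ ket0          # applying a gate = multiplying by its matrix
alpha, beta = state

# Bloch angles, from |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>
theta = 2 * np.arccos(np.abs(alpha))
phi = np.angle(beta) - np.angle(alpha)
print(f"after H: theta = {np.degrees(theta):.1f} deg, phi = {np.degrees(phi):.1f} deg")
# theta = 90 deg: the state sits on the sphere's equator,
# halfway between |0> and |1>, i.e. an equal superposition.
```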

 

Keep in mind that a quantum computer is different from a classical probabilistic computer! The superposed state is a physical state of the system, not a reflection of the observer’s lack of knowledge about it, as in, say, a coin that has already landed but has not yet been looked at. It is thanks to superposition that a quantum computer does not have to execute a list of operations in sequence, allowing it, for example, to find an item in a list without systematically checking each entry against the search term.
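The list-search example alludes to Grover’s algorithm. Below is a hedged classical simulation of its core idea for a 4-entry list (NumPy vectors standing in for real quantum hardware; the marked index is an arbitrary illustrative choice): one “oracle + diffusion” iteration concentrates all the probability on the marked item.

```python
# Grover's search idea, simulated with state vectors.
import numpy as np

n = 4         # list size (2 qubits)
marked = 2    # the index we are searching for (illustrative)

state = np.full(n, 1 / np.sqrt(n), dtype=complex)  # uniform superposition

# Oracle: flip the sign of the marked item's amplitude
oracle = np.eye(n)
oracle[marked, marked] = -1

# Diffusion operator: reflect amplitudes about their mean (2|s><s| - I)
s = np.full((n, 1), 1 / np.sqrt(n))
diffusion = 2 * (s @ s.T) - np.eye(n)

state = diffusion @ (oracle @ state)
print("probabilities:", np.round(np.abs(state) ** 2, 3))
# -> index 2 has probability 1.0 after a single iteration,
#    without ever comparing entries one by one.
```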

 

The Bloch sphere: the vector represents the qubit state; |0> and |1> are the possible states into which it can collapse.

 

 


Schematic used to program quantum operations on qubits:

|0> is the initial state.

[H] (Hadamard), [X], [Y], [Z] (Pauli-X, Pauli-Y, Pauli-Z) are examples of “quantum gates”: operations on qubits, each represented by a complex matrix whose application corresponds to a geometric transformation on the Bloch sphere. Gates on one qubit can be controlled by other qubits, as in the sketch below.

The measurement block makes the qubit collapse and extracts the “result”.
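As a concrete instance of such a schematic, the sketch below (ours, not from the article) simulates a two-qubit circuit: a Hadamard on the first qubit followed by a controlled-X (CNOT), which together produce the entangled Bell state discussed earlier.

```python
# A two-qubit circuit as matrix algebra: H on qubit 1, then CNOT.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# CNOT in the basis |00>, |01>, |10>, |11>: flips the target when control = 1
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1

state = CNOT @ (np.kron(H, I) @ ket00)  # run the circuit left to right
print(np.round(state.real, 3))          # [0.707 0. 0. 0.707] = (|00> + |11>)/sqrt(2)
```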

 

Although building a quantum computer is a very ambitious challenge, the field looks very promising, and both universities and private companies are developing their first prototypes. IBM, for example, has created a cloud interface that allows everyone to understand and run basic algorithms in a quantum computing environment. Quantum computing is a very young technology that still requires a great deal of work before it can produce practical results, and some researchers have raised doubts about the possibility of implementing it at scale. Nevertheless, the horizons are extremely broad and potentially game-changing for many fields, such as mathematics, medicine, cybersecurity and complex-system simulation, taking us beyond our limits and transforming them into opportunities.

 

 

https://commons.wikimedia.org/wiki/File:D-Wave_Two_512_qubit_Vesuvius_chip.jpg

License: Creative Commons Attribution 2.0 Generic

A D-Wave 512-qubit chip: quantum computers usually operate inside cryostats at temperatures of a few kelvin.
Physical systems that can work as qubits include single atoms, molecules, crystals and nonlinear optical systems.

 

To learn more…

 

In this article we presented the basic features of quantum computers. To learn more, have a look at the pages listed in the references. IBM has also created an open web platform that allows everyone to execute simple algorithms on simulated or real quantum computers.

(https://quantumexperience.ng.bluemix.net/qx/experience)

 

References

 

  1. Rigetti Computing, http://docs.rigetti.com/en/stable/intro.html [Online].
  2. R. W. Keyes, “Physical limits of silicon transistors and circuits”, Rep. Prog. Phys. 68 (2005) 2701–2746, doi:10.1088/0034-4885/68/12/R01.
  3. E. Rieffel and W. Polak, “Quantum Computing: A Gentle Introduction”, MIT Press, 2011.
  4. Intel, https://www.intel.it/content/www/it/it/history/museum-gordon-moore-law.html [Online].
  5. IBM Research, https://www.research.ibm.com/ibm-q/learn/what-is-quantum-computing/ [Online].
  6. D-Wave Systems, https://www.dwavesys.com/tutorials/background-reading-series/introduction-d-wave-quantum-hardware [Online].

 

 
