Scientist standing at a scientific instrument

We have all seen the sci-fi movies and perhaps heard real-world discussion of quantum computing. It sounds very futuristic and is certainly a groundbreaking, perhaps world-changing, technology. So what exactly is it?

Before moving on to quantum computing, there is some important background and history to cover about traditional computing devices, particularly the way each architecture processes information.

The very first general-purpose electronic computer was ENIAC, invented by J. Presper Eckert and John Mauchly at the University of Pennsylvania. It took three years to build (from 1943 to 1946), occupied some 1,800 square feet of floor space and weighed around 30 tons. The machine was huge and very complex for the time: it used around 18,000 vacuum tubes, each acting as a switch that was either off or on, the two states we now write as 0 and 1. That binary language is still in use today in our computers, home and smart automation devices, industrial control systems and other electronic and computing hardware around us. Computing has since been whittled down into everyday devices such as remote controls and microwave ovens. The world wide web and the wider internet run on powerful computers, mostly dedicated servers, and they all use binary logic and code; without the simple zeroes and ones, none of this would be possible.
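To make those "zeroes and ones" a little more concrete, here is a minimal Python sketch (the number and text below are arbitrary examples, not anything from a real machine) showing how ordinary values reduce to binary patterns:

```python
# Every value a computer handles is ultimately stored as a pattern of bits.
number = 42
print(bin(number))          # '0b101010' - the integer 42 as on/off states

# Text works the same way: each character maps to a numeric code,
# and that code is stored as a run of bits.
for char in "Hi":
    print(char, format(ord(char), '08b'))  # 'H' -> 01001000, 'i' -> 01101001
```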

The electronic world sounds as if it is moving along just fine, so why add a different computing platform at all?

Let's go back to the early computers. They were conceived, built and operated solely as calculating devices, easing the load on humans doing monotonous, and sometimes complex but tedious, tasks. Humans easily make mistakes; computers, if programmed correctly, in theory cannot. The first computer used the aforementioned vacuum tubes, with one logic operation possible per tube, so the original machine needed many of them wired into logic gates to process data.

In the late 1940s the first semiconductor transistors were invented, followed by the silicon-based MOSFET, then monolithic integrated circuits (ICs) in the late 1950s, and then the 1970s brought the microprocessor and microcomputer revolution. From the huge and very weighty first computer down to our handheld devices, here we are, still using those simple but clever zeroes and ones.

The speed, power and versatility of computers have increased dramatically while the actual workings have become very small, hidden inside the devices that assist us in everyday life, often without us realising it. A lot of computing power has been squeezed into a very small space. "Moore's law" is an observation by Gordon Moore, co-founder of Fairchild Semiconductor and of Intel, where he later served as CEO. In his 1965 paper he predicted that the number of components on an integrated circuit would double every year, and that this trend would last at least a decade. The prediction has largely held (later revised to a doubling roughly every two years) and helped drive the digital revolution of the late 20th and early 21st centuries.
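As a rough feel for what "doubling every year" implies, here is a tiny sketch; the starting count of 64 components is arbitrary, chosen only to show the exponential shape of Moore's original decade-long prediction:

```python
# Moore's 1965 observation: component counts roughly double each year.
# Project an illustrative starting count of 64 components a decade ahead.
components = 64
for year in range(1965, 1976):
    print(year, components)
    components *= 2   # double every year, as the original prediction stated
```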

We all know what a computer is: conventionally it consists of at least one processing element, typically a central processing unit (CPU) built as a metal-oxide-semiconductor (MOS) microprocessor, some computer memory for both short-term working data and longer-term storage, plus display outputs and peripheral input devices such as a keyboard and mouse.

The "brain" of the system is the CPU, the electronic circuitry that does most of the work: it carries out the arithmetic and logic operations specified by instruction sets and controls the input/output (I/O) operations those instructions request. It is packed full of integrated circuits and usually contains processor registers and a cache for frequently used instructions and data. This is the part that has shrunk massively from the first computer full of vacuum tubes and valves; the processing power no longer takes up 1,800 square feet, and a modern processor chip fits in the palm of your hand with a monumental gain in power. Many chips are "multi-core", meaning they contain more than one processing core.

The form, design and implementation of CPUs have changed over the course of their history, but their fundamental operation remains almost unchanged. Moore's law has remained broadly consistent, but the ongoing miniaturisation of circuits is now running into physical barriers that make it hard to keep increasing component packing density.

Just to give you an idea of scale: with current CPU transistors built at a 10-nanometre feature size, about 8,000 of them laid side by side would span the cross-section of a single human hair. That is pretty amazing, but as mentioned earlier, a miniaturisation barrier is nearly upon us.
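The arithmetic behind that comparison is simple enough to check directly; the 80-micrometre hair width used below is a typical ballpark figure rather than a measured value:

```python
# Back-of-the-envelope check of the "8,000 transistors across a hair" claim.
feature_size_nm = 10          # 10-nanometre transistor feature size
hair_width_nm = 80_000        # a human hair is roughly 80 micrometres across
print(hair_width_nm / feature_size_nm)   # -> 8000.0 features side by side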

So, what is the answer, then?

Quantum computing, of course.

These machines are still at the design and early prototype stage, and they take an incredible amount of money and resources to design, build and operate. Rather than a central processing unit, they use a "quantum data plane". To function, they need to be super-cooled to around -460 degrees Fahrenheit, close to absolute zero, using liquid helium and other cryogenic gases. The biggest challenges lie in isolating the chip from unwanted noise and interference, whether electrical, mechanical, magnetic or thermal.

And the point of all this extra workload? In classical computing, a bit is a single piece of information that can exist in one of two states, the binary 0s and 1s. Each time an instruction runs, the answer in binary data is yes or no, off or on, 0 or 1. A quantum computer uses quantum bits, or "qubits", instead, and they are very different: a qubit is not limited to 0 or 1 but can exist in any "superposition" of those values.

A classical computer and its architecture process logic with binary values as either:

0 or 1

A quantum computer with superposition as its key feature can process logic as:

0 and 1
1 and 0
1 and 1
0 and 0

Or all at the same time.
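One rough way to picture a qubit is as a pair of amplitudes rather than a single 0 or 1. The sketch below is plain Python with no quantum hardware or quantum library involved; it simply shows an equal superposition and the probabilities of reading back 0 or 1:

```python
import math

# A classical bit is simply 0 or 1.
classical_bit = 1

# A qubit is described by two amplitudes, one for |0> and one for |1>.
# Their squared magnitudes must sum to 1.
alpha = 1 / math.sqrt(2)      # amplitude for |0>
beta = 1 / math.sqrt(2)       # amplitude for |1>

prob_zero = abs(alpha) ** 2   # chance of measuring 0
prob_one = abs(beta) ** 2     # chance of measuring 1

print(prob_zero, prob_one)                      # 0.5 and 0.5 - "0 and 1 at once"
print(math.isclose(prob_zero + prob_one, 1.0))  # amplitudes stay normalised
```

Until the qubit is measured it genuinely carries both amplitudes; only at measurement does it collapse to a definite 0 or 1 with those probabilities.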

Quantum computers do not use the "old tech" microelectronic logic gates we know. Instead, each qubit is represented by a single atom, ion, photon or electron, which acts as both processor and memory. Different physical systems react uniquely to different environments and to further manipulation, and because quantum computing devices are still at the foundation stage, the best-performing implementations have not yet been settled.

There are different approaches to building these magnificent machines, such as ion traps, polarisation of light, Bose-Einstein condensates, quantum dots and fullerenes.

Building a quantum computer around a single element is tricky, an engineering challenge at both the large and the miniature scale. First, physicists and engineers have to capture a single, tiny particle and encapsulate it in a resonator; they then have to manipulate it without destroying its initial properties and individual characteristics.

Within a very clean environment, this is done by building up, layer by layer, the different containers for the elements and particles using silicon and other advanced materials. There are then various ways to control, manipulate and measure the quantum states, for example using fine metal tips and electrical signals, or, in nuclear magnetic resonance approaches, magnetic fields and radio-frequency pulses that address the nuclear spins, all while protecting the system from interference by outside molecules.

When it all works as it should, the result is a bit like the old logic-gate method for basic binary code, but carried out in a very different way using something called "entanglement".

Entanglement is a phenomenon Einstein could never reconcile himself with, famously dismissing it as "spooky action at a distance". It links two or more qubits so that their states can no longer be described independently of one another. Combined with superposition, where a particle can effectively occupy several states at once, this is where the pace really steps up: a traditional computer answers one yes-or-no logic request per cycle on a processing thread, whereas four entangled qubits can hold all 16 possible combinations of 0s and 1s at once, the equivalent of 16 concurrent threads in a single cycle. Controlling these quantum effects is very hard, but it is theoretically possible. Another feature of entangled qubits is that measuring one instantly determines the state of its partner, however far apart they are, and together these properties could make supercomputers as we know them a thing of the past.
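Where that figure of 16 comes from: four bits have 2^4 = 16 possible combinations, and a register of four qubits can carry an amplitude for every one of them at the same time, whereas a classical register holds exactly one. A small, purely illustrative enumeration in Python makes the counting explicit:

```python
from itertools import product

n_qubits = 4

# A classical 4-bit register is in exactly one of these states at a time;
# a 4-qubit register carries an amplitude for all of them simultaneously.
states = [''.join(bits) for bits in product('01', repeat=n_qubits)]
print(len(states))   # 16
print(states[:4])    # ['0000', '0001', '0010', '0011'] ...
```

Each extra qubit doubles that count, which is why the numbers below grow so quickly.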

IBM has unveiled a 50-qubit computer. For a sense of scale, a 500-qubit machine could represent more states than there are atoms in the observable universe. Couple this with entanglement and we start to approach "quantum supremacy", the point at which a quantum computer can perform a task that no classical computer could complete in any practical amount of time.
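The comparison with atoms in the universe is really a statement about how fast 2^n grows. Taking the commonly quoted order-of-magnitude estimate of roughly 10^80 atoms in the observable universe (an estimate, not a precise figure), a quick check:

```python
# Basis states a 500-qubit register can span versus a common
# order-of-magnitude estimate for atoms in the observable universe.
basis_states = 2 ** 500
atoms_estimate = 10 ** 80

print(len(str(basis_states)))          # about 151 digits, i.e. roughly 3 x 10^150
print(basis_states > atoms_estimate)   # True - vastly more states than atoms
```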

Once functioning correctly, quantum data planes could be chained together to scale up power and functionality, creating super (super) computers. That is indeed the plan, since only handfuls of qubits are available right now and the next stage is to grow to an industrial scale. Remember that "normal" computers run their code one instruction at a time; a quantum computer can explore many possibilities at once, so multiple number-crunching tasks can be tackled at a far faster rate.

The benefits of all this simultaneous number crunching would be many: applications in A.I. that learn faster, single-virus analysis in the medical industry (with the help of MEMS, more on those in another article), optimised designs in heavy industry such as cars and transport, near-instant chemical analysis, codebreaking and cryptography for government agencies, and better weather prediction. Running such machines in space on satellites is not practical right now, because space radiation and U.V. interfere with the delicate quantum states, but these barriers will likely be overcome as the quantum mechanics race and the wider computing world move on, perhaps even with the help of simulations run on qubit computers.

Even though the basic building blocks of quantum computers are alien concepts for our brains to comprehend, they are very real.

