The History And Future Of Quantum Computing

To a person new to the field, the phrase "quantum computing" can sound like a buzzword, but studying it opens the mind to a whole new dimension. One can hardly talk about quantum computing without delving into quantum mechanics and computer science. Quantum computing is the methodology of using quantum mechanical systems to solve information processing tasks. Several fields have contributed the ideas and fundamentals that make up quantum computing, including information theory and cryptography. The study of quantum mechanics began with the challenge of explaining absurdities such as the ultraviolet catastrophe; it brought a better understanding of atoms and radiation, and over time further study produced the modern theory of quantum mechanics. But what, then, is quantum mechanics?

Quantum mechanics is a set of rules, a mathematical framework for constructing physical theories. Despite the interest in the field, which has produced modest success to date, building a large-scale quantum information processing system remains a challenge. Small quantum computers capable of running dozens of operations on a few quantum bits represent the current state of the art. There have also been experimental demonstrations in quantum cryptography, focused on sending encrypted messages over long distances. Quantum cryptography is said to be close to real-world deployment, but it too faces the challenge of developing large-scale quantum information processing hardware. The computer science perspective on quantum computing starts with Alan Turing. In a remarkable paper written in 1936, he developed an abstract notion of what we now call a programmable computer, which later aided the construction of the first computers with electronic components. The model arising from this notion is now known as the Turing machine. Turing also showed that there is a universal Turing machine that can simulate any other Turing machine.

John von Neumann then developed theoretical models of how to implement, in a practical fashion, all the components needed for a computer to be as powerful as the universal Turing machine. Next came the development of the transistor by John Bardeen, Walter Brattain, and William Shockley in 1947, which marked a major improvement in the hardware development journey. With this improvement came Gordon Moore and Moore's law, which states that computing power will double for constant cost roughly once every two years. This law has held over time but also poses a problem, as electronic devices are made ever smaller. A potential response to the eventual failure of Moore's law is a switch to a new computing paradigm, one of which is provided by quantum computing, whose ideas use quantum mechanics rather than classical physics to perform computation. A regular computer can be used to simulate a quantum computer, but not efficiently, because a quantum computer offers a speed advantage over classical computers. This advantage matters: researchers believe there is a major gap between the power of classical computers and the power of quantum computers that no amount of progress in classical computation can overcome. Looking to the future, the coming generation of quantum computers will make the current generation of computers feel really slow; IBM's quantum computing research lab has successfully stored data on a single atom.
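The doubling described by Moore's law can be sketched in a few lines of Python. The starting figure and projection window below are illustrative assumptions, not numbers from this essay:

```python
def moores_law(initial, years, doubling_period=2):
    """Project growth assuming a doubling every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# Illustrative: 2,300 transistors (Intel 4004, 1971) projected 40 years
# forward at one doubling per two years gives 20 doublings.
projected = moores_law(2_300, 40)
print(projected)  # ~2.4 billion
```

Twenty doublings multiply the starting count by 2^20, which is roughly a million-fold increase, broadly in line with how transistor counts actually grew over that period.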

Announced by Darío Gil

Today's technology uses about 100,000 atoms to represent a 1 or a 0, but remember that an atomic quantum bit can be both 0 and 1 at once, based on the particle's magnetic state. These atomic quantum bits, called qubits, are often compared to the direction of a compass needle; on top of that orientation sits another layer of probability, which is itself another form of information. When you gather an array of such particles, their storage density and computing power scale exponentially. This opens the door to algorithms, calculations, and AI learning that are impossible today, and could make weather forecasting or cancer detection far more certain. Quantum computing technology is not expected to come to your devices (phones, laptops) just yet; it is envisioned as a cloud-based machine that can store large amounts of data and perform incredible calculations. But this begs the question of how to deliver that power to end users (individuals, companies, or organizations): could this be where 6G comes into the picture?
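The exponential scaling mentioned above can be made concrete with a minimal sketch (this is a toy classical simulation, not IBM's implementation): n classical bits hold one n-bit value at a time, whereas describing the state of n qubits takes 2^n complex amplitudes, whose squared magnitudes are probabilities summing to 1.

```python
import math

def uniform_superposition(n_qubits):
    """State vector of n qubits in an equal superposition of all basis states."""
    dim = 2 ** n_qubits          # number of amplitudes grows exponentially
    amp = 1 / math.sqrt(dim)     # equal amplitude for every basis state
    return [amp] * dim

state = uniform_superposition(3)        # 3 qubits -> 8 amplitudes
print(len(state))                       # 8
print(sum(a * a for a in state))        # probabilities sum to ~1.0
```

Simulating just 50 qubits this way would need 2^50 amplitudes, which is why classical simulation of quantum computers quickly becomes infeasible.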

14 May 2021