Quantum Computing Basics (Part I)

Updated: Sep 24, 2020

With big-name tech companies like Google and IBM pushing hard on quantum computing, the advent of this technology seems just around the corner.

I don't know about you, but whenever something has the word "quantum" in it, the term suddenly sounds futuristic or too complicated to grasp (I'm looking at you, quantum physics). This is probably why not many people know what quantum computers are or how they work in the first place. Despite this, tech companies have made a staggering amount of progress over the past decade, and we now have working prototypes of these elusive devices. Whether scientists and engineers can scale them for commercial use anytime soon, though, only time will tell.

In this article and the ones to follow, I'll be sharing the basics of quantum computing in parts to keep your reading experience short and enjoyable. Do take note, however, that this is an academic article, so there may be some technical jargon and a whole bunch of references. Nevertheless, I hope you'll gain a deeper insight into the quantum computing industry and learn about the potential impact quantum computers could have on our future.


For several decades, the rapid development of integrated circuits has largely been predicted by the observation that Gordon Moore made and later revised in 1975 [1]: that the number of transistors on an integrated circuit would double roughly every two years [2]. “Moore’s Law” has therefore been the leading benchmark for semiconductor manufacturers to race against as they compete with one another to release chips that grow denser year after year.
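To make the doubling rule concrete, here's a minimal sketch of how transistor counts grow under a strict two-year doubling (the function name and the strict-doubling assumption are mine for illustration, not Moore's exact formulation):

```python
def moores_law_projection(initial_count, start_year, end_year, doubling_period=2):
    """Project a transistor count forward assuming one doubling per period."""
    elapsed = end_year - start_year
    return initial_count * 2 ** (elapsed / doubling_period)

# Starting from the Intel 4004's roughly 2,300 transistors in 1971,
# a strict doubling every two years predicts by 2001:
projected = moores_law_projection(2_300, 1971, 2001)
print(f"{projected:,.0f}")  # roughly 75 million
```

Thirty years means fifteen doublings, i.e. a factor of 2^15 = 32,768 — a vivid reminder of why exponential growth set such a punishing pace for manufacturers.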

Even though there have been impressive breakthroughs, such as the invention of MOSFETs (metal–oxide–semiconductor field-effect transistors), that have preserved the validity of Moore’s Law [3], the progress of semiconductor manufacturers has been slowly declining, with only two of them, Samsung Electronics and TSMC, able to keep pace by having both 5 nm and 7 nm process nodes in production.

As transistors get smaller in order to accommodate more of them in the same amount of space on integrated circuits, several challenges that hinder our advancement start to appear. As this series will mainly focus on the quantum mechanical aspects of computing, I shall make only a short digression here to introduce the other issues we face when shrinking transistors.

Firstly, according to a 2006 study [4], smaller transistors are more susceptible to electrical leakage, which leads to heat generation that wears the transistor down over time. Reducing the supply voltage or throttling the transistors to prevent overheating would in turn limit the chip's processing capabilities, which is counterproductive to its development. Furthermore, the doubling of transistors and the extra heat they produce also mean that more energy and newer cooling systems are needed to keep the chip cool. This may lead to an economic issue where a lack of demand and research funds ultimately halts the development of smaller transistors [5].

One can see that the rise of these problems is starting to undermine the efforts of manufacturers in producing ever more compact integrated circuits. However, these challenges aren't the main reason for the apparent end of Moore's Law in the coming decade [6]. The biggest inhibitor of producing smaller transistors stems from the physical limitations of the device itself.

As the gate of a transistor shrinks below roughly 1–3 nm, quantum effects such as quantum tunneling become more pronounced and ultimately undermine the functionality of the transistor [7]. The field of quantum computing seeks to address this issue by developing a new computational model that makes use of the strange and often counter-intuitive mechanics of the quantum realm.
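For readers who want a feel for why thinner barriers leak more, the standard textbook approximation for an electron tunneling through a rectangular potential barrier (a simplification of my own choosing, not drawn from the cited article) shows the transmission probability rising steeply as the barrier thins:

```latex
% Transmission probability T for an electron of energy E tunneling
% through a rectangular barrier of height V_0 > E and width L:
T \approx e^{-2\kappa L},
\qquad
\kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}
```

Because T depends exponentially on the barrier width L, halving a gate's insulating thickness does far more than double the leakage — which is why tunneling becomes the dominant failure mode at the 1–3 nm scale.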

End of Part I

Like what you're reading? Subscribe to The Weekly Learner below to stay up to date with our posts and get notified whenever Part II is published.


[1] G. E. Moore, "Progress in Digital Integrated Electronics", 1975.

[2] D. C. Brock, "Chapter 7: Moore's law at 40", in Understanding Moore's Law—Four Decades of Innovation. Philadelphia: Chemical Heritage Foundation, 2006, ch. 7, pp. 67-84.

[3] K. Siozios, D. Anagnostos, D. Soudris and E. Kosmatopoulos, “On Accelerating Data Analytics: An Introduction …”, in IoT for Smart Grids: Design Challenges and Paradigms. Berlin: Springer, 2018, ch. 9, p. 167.

[4] E. Pop, S. Sinha and K. E. Goodson, “Heat Generation and Transport in Nanometer-Scale Transistors”, Proceedings of the IEEE, vol. 94, no. 8, pp. 1588-1594, Aug. 2006.

[5] J. Loeffler, "No More Transistors: The End of Moore’s Law", Interesting Engineering, Nov. 29, 2018.

[6] M. M. Waldrop, “The chips are down for Moore’s law”, Nature, vol. 530, no. 7589, pp. 144-147, Feb. 11, 2016.

[7] E. Sperling, "Quantum Effects At 7/5nm And Beyond", Semiconductor Engineering, May 23, 2018.
