# Quantum Computing

**Topics:** Quantum information science, Qubit, Quantum mechanics

**Pages:** 11 (3768 words)

**Published:** May 6, 2013

With scientific breakthroughs on both the algorithmic and the physical side, quantum computing, an interdisciplinary field spanning quantum mechanics and computer science, has become a highly sought-after and actively debated topic in recent years. This essay aims to give a brief overall introduction to quantum computing that is simple but still conveys the essential ideas and principles to general engineers with some basic knowledge of linear algebra. Five primary parts of quantum computing are covered in order. First, three reasons for developing quantum computing will be illustrated in detail: its remarkable performance on certain computational tasks, its efficiency in simulating other quantum systems, and the physical limits facing classical computing. After that, three basic concepts, namely the quantum bit, the quantum gate, and the quantum computing process, will be explained by comparison with their classical counterparts. The differences between them, such as superposition and probabilistic measurement, will be the focus, because they are the source of quantum computing's unique properties. In the third part, Shor's factoring algorithm will serve as an example demonstrating this prominent computational power. In the final part, rapid development and tremendous potential, both in physical research and in the market, promise a visible future.

Table of Contents

- Introduction
- Why Quantum Computing
- Basic Concepts
- Computational Power
- Quantum Computer
- Future Perspectives and Potentials
- Conclusion
- Reference List


Introduction

Combining quantum mechanics and computer science, quantum computing is an interdisciplinary, cutting-edge science that has forged ahead over the past three decades and is forecast to usher in the next scientific and technological revolution in the computer industry [1]. The concept of quantum computing originated in the work of Richard Feynman [2] in 1982, when he attempted to simulate a physical quantum system on a classical computer and ran into the physical limits of classical computing. However, at that point the lack of ideas about how to use quantum effects to speed up computation brought the field to a standstill until 1994, when Peter Shor developed a quantum algorithm showing that, for factoring, a quantum computer would be exponentially faster than any known classical method [3]. It is these facts that facilitated the development of quantum computing. In this essay, the reasons for pursuing quantum computing will be discussed first. In the second section, three basics of quantum computing will be clarified. Building on them, the third section will demonstrate the power of quantum computing with an example algorithm. Following that, how to put the theory into practice, namely the quantum computer, will be considered. Finally, some fascinating future perspectives and potentials of quantum computing will be depicted.

1. Why quantum computing

1.1 Moore’s Law Has Physical Limits

The physical limit to classical computing is a critical negative incentive for quantum computing. In the 1960s, an astounding prediction was made by Intel co-founder Gordon Moore [4]: based on his observations, he anticipated that the number of transistors per chip on an integrated circuit would grow exponentially, roughly doubling every year. Although in practice the transistor count has doubled only about every 18 months, the prediction has held in spirit: classical computing technology has kept advancing at an exponential pace. Hence, it is an inescapable fact that such components will eventually shrink to the atomic scale, where behaviour is dominated by bizarre quantum effects [5]. To...
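The exponential growth described above is easy to sketch numerically. Below is a minimal Python illustration of the 18-month doubling rule; the starting count of 2,300 transistors (the order of magnitude of early-1970s microprocessors) is an illustrative assumption, not a figure taken from this essay.

```python
def transistors(years_elapsed, start_count=2_300, doubling_period_years=1.5):
    """Projected transistor count after `years_elapsed` years, assuming
    the count doubles every `doubling_period_years` (i.e. 18 months).
    The starting count is an illustrative assumption."""
    return start_count * 2 ** (years_elapsed / doubling_period_years)

# After one doubling period (18 months) the count has doubled:
print(transistors(1.5))                    # 4600.0

# Over 30 years, that compounds to 2**20 (about a million-fold) growth,
# which is why feature sizes are driven toward the atomic scale:
print(transistors(30) / transistors(0))    # 1048576.0
```

The same formula explains the essay's point: sustaining this growth forces transistor dimensions down exponentially as well, so the atomic scale, and with it quantum effects, is reached in a matter of decades, not centuries.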
