Discussion Question

1. Briefly describe Moore's law. What are the implications of this law? Are there any practical limitations to Moore's law?

Moore's Law is the observation that the number of transistors that can be placed inexpensively on a single integrated circuit doubles approximately every two years. It describes a long-term trend in the history of computing hardware.

Moore's law is a rule of thumb in the computer industry about the growth of computing power over time. Attributed to Gordon E. Moore, the co-founder of Intel, it states that computing power grows according to an empirical exponential law. Moore originally proposed a 12-month doubling period and later revised it to 24 months. Because of the compounding effect of repeated doubling, some have extrapolated that within 30 to 50 years computers will become more intelligent than human beings.
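Written as a formula, the doubling rule is N(t) = N0 * 2^(t/T), where T is the doubling period. The short sketch below projects transistor counts under the 24-month version; the starting count (roughly that of the 1971 Intel 4004) and the time spans are chosen purely for illustration.

```python
# Illustrative sketch of the doubling rule N(t) = N0 * 2^(t / T),
# where T is the doubling period in years. The starting figure below
# (about 2,300 transistors, roughly the Intel 4004) is used only as
# a convenient baseline for the projection.

def projected_count(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a quantity that doubles every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2_300
    for years in (10, 20, 30, 40):
        print(f"After {years:2d} years: ~{projected_count(start, years):,.0f} transistors")
```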

The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors, and even the number and size of pixels in digital cameras. All of these are improving at roughly exponential rates as well. This has dramatically increased the usefulness of digital electronics in nearly every segment of the world economy. Moore's law thus describes a driving force of technological and social change in the late 20th and early 21st centuries.

Transistors per integrated circuit. The most popular formulation is the doubling of the number of transistors on integrated circuits every two years. At the end of the 1970s, Moore's law became known as the limit for the number of transistors on the most complex chips. Recent trends show that this rate was maintained into 2007.

Density at minimum cost per transistor. This is the formulation given in Moore's 1965 paper. It is not just about the density of transistors that can be achieved, but about the density of transistors at which the cost per transistor is lowest. As more transistors are put on a chip, the cost to make each transistor decreases, but the chance that the chip will not work due to a defect increases. In 1965, Moore examined the density of transistors at which cost is minimized and observed that, as transistors were made smaller through advances in photolithography, this number would increase at "a rate of roughly a factor of two per year".
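A toy model can make the minimum-cost idea concrete: if the processing cost per chip is roughly fixed and the yield falls as transistor count grows, the cost per working transistor first falls and then rises, so there is a density at which it is lowest. The numbers and the simple exponential yield model below are illustrative assumptions, not Moore's actual 1965 data.

```python
import math

# Toy model of "density at minimum cost per transistor" (illustrative only).
# Assumptions: a chip costs a fixed amount to process, and yield drops
# exponentially with transistor count (a crude defect model).
CHIP_COST = 100.0          # fixed processing cost per chip (arbitrary units)
DEFECTS_PER_MILLION = 0.5  # assumed defect rate per million transistors

def cost_per_working_transistor(transistors: int) -> float:
    yield_fraction = math.exp(-DEFECTS_PER_MILLION * transistors / 1e6)
    return CHIP_COST / (transistors * yield_fraction)

# Sweep densities and report where the cost per working transistor is lowest.
best_cost, best_n = min(
    (cost_per_working_transistor(n), n)
    for n in range(100_000, 10_000_001, 100_000)
)
print(f"Lowest cost per transistor at ~{best_n:,} transistors per chip")
```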

Power consumption. The power consumption of computer nodes doubles roughly every 18 months.

Hard disk storage cost per unit of information. A similar law (sometimes called Kryder's Law) has held for hard disk storage cost per unit of information. The rate of progression in disk storage over the past decades has actually sped up more than once, corresponding to the utilization of error-correcting codes, the magnetoresistive effect, and the giant magnetoresistive effect. The current rate of increase in hard drive capacity is roughly similar to the rate of increase in transistor count, and recent trends show that this rate was maintained into 2007.

Network capacity. According to Gerry Butters, the former head of Lucent's Optical Networking Group at Bell Labs, there is another version, called Butters' Law of Photonics, a formulation that deliberately parallels Moore's law. Butters' law says that the amount of data coming out of an optical fiber doubles every nine months.

Thus, the cost of transmitting a bit over an optical network decreases by half every nine months. The availability of wavelength-division multiplexing (sometimes called "WDM") increased the capacity that could be placed on a single fiber by as much as a factor of 100. Optical networking and dense wavelength-division multiplexing (DWDM) are rapidly bringing down the cost of networking, and further progress seems assured. As a result, the wholesale price of data traffic collapsed in the dot-com bubble. Nielsen's Law says that the bandwidth available to users increases by 50% annually.
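To compare these rates, each rule can be converted to an annual growth factor and compounded. The sketch below does this for the three rates mentioned above over an illustrative 10-year span; the span and the framing as per-year multipliers are my own simplification.

```python
# Compare the growth rules mentioned above over a 10-year span (illustrative).
# Each rule is expressed as an annual growth multiplier.
RULES = {
    "Moore (2x every 24 months)":  2 ** (12 / 24),
    "Butters (2x every 9 months)": 2 ** (12 / 9),
    "Nielsen (+50% per year)":     1.5,
}

YEARS = 10
for name, annual_factor in RULES.items():
    print(f"{name}: x{annual_factor ** YEARS:,.0f} after {YEARS} years")
```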
