History of Logarithms
Logarithms were invented independently by John Napier, a Scotsman, and by Joost Burgi, a Swiss. Napier's logarithms were published in 1614; Burgi's were published in 1620. The objective of both men was to simplify mathematical calculations: the idea arose from a desire to reduce multiplication and division to the level of addition and subtraction. In this era of the cheap hand calculator that is no longer necessary, but it still serves as a useful way to introduce logarithms. Napier's approach was algebraic; Burgi's was geometric. The common system of logarithms is due to the combined effort of Napier and Henry Briggs in 1624. Natural logarithms first arose as more or less accidental variations of Napier's original logarithms, and their real significance was not recognized until later; the earliest natural logarithms occur in 1618.

It can't be said too often: a logarithm is nothing more than an exponent. The basic concept can be expressed as a chain of shortcuts:

Multiplication is a shortcut for addition: 3 x 5 means 5 + 5 + 5.
Exponents are a shortcut for multiplication: 4^3 means 4 x 4 x 4.
Logarithms are a shortcut for exponents: log10(100) = 2 because 10^2 = 100.

The present definition of the logarithm is the exponent, or power, to which a stated number, called the base, must be raised to yield a specific number. The logarithm of 100 to the base 10 is 2; this is written log10(100) = 2.

Before pocket calculators (only three decades ago, but in "student years" that's the age of dinosaurs), logs were indispensable. You needed logs to compute most powers and roots with fair accuracy, and even multiplying and dividing most numbers was easier with logs. Every decent algebra book had pages and pages of log tables at the back. The invention of logs in the early 1600s fueled the scientific revolution; back then scientists, astronomers especially, used to spend huge amounts of time crunching numbers on...
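The shortcut idea above, that adding two logarithms multiplies the underlying numbers, can be sketched in a few lines of Python. The numbers 37.5 and 48.2 are just illustrative, and `math.log10` stands in for the printed log table a 17th-century calculator would have used:

```python
import math

# Multiplying via logarithms: log10(a * b) = log10(a) + log10(b),
# so one addition (plus table lookups) replaces the multiplication.
a, b = 37.5, 48.2

log_sum = math.log10(a) + math.log10(b)  # "add the logs"
product = 10 ** log_sum                  # "take the antilog" of the sum

# The result agrees with direct multiplication up to rounding error.
assert math.isclose(product, a * b)
```

Historically, both the forward lookup and the final antilog step came from tables, so the only arithmetic done by hand was the addition in the middle.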