THE COMPUTER AGE AND THE INVENTOR
by Dr Farag Moussa ©
President of the International Federation of Inventors' Associations (IFIA) (e-mail: firstname.lastname@example.org)
Keynote speech given at the
International Invention Symposium
"How Invention & Innovation Open New Business"
(Hong Kong, November 27, 1998)
Different eras of political history are frequently identified with royal dynasties, or great wars and revolutions.
Eras in the history of art and architecture may be distinguished by styles such as Renaissance, Gothic, Impressionist or Surrealist, and so on.
Techniques, too, have marked different eras over the centuries: from the primitive tools of the Stone Age to the Industrial Age, marked by steam and electrical power and the invention of turbines and engines.
Today, we have entered a new era: the computer age – an age which owes everything to inventors.
Charles Babbage, an English mathematician, is considered to be the great-grandfather of the computer. Over 150 years ago, in 1840 to be exact, he invented a sophisticated calculating machine, and called it the "Analytical Engine." As with many inventions, his creation was far in advance of its time.
It took another 100 years before the first computers were built, and as you know, they were huge and incredibly heavy. Take, for instance, the famous Mark I. It was the world's first electro-mechanical computer and was used during World War II by the U.S. Navy. Among 20th-century systems, it could be likened to a battleship: 2.6 meters high, 16 meters wide, 2 meters deep, and weighing a massive 5 tons!
The machine – the hardware – could not develop without the software to match, of course. In this respect, two women mathematicians played key roles.
Ada Lovelace Byron, daughter of the poet Lord Byron, wrote in 1843 what today we'd call programs for Charles Babbage's "Analytical Engine." She was a pioneer and is considered to be the very first programmer in history. That's why, some 130 years later, the U.S. Department of Defense gave her forename – Ada – A-D-A – to one of the most important programming languages in the world. It is used not only by the U.S. Army, Navy and Air Force but also by big industry, universities, and other centers of research.
Grace Hopper, an American woman, invented in 1952 the very first compiler of all time – a program that translates a human-readable programming language into instructions a computer can execute. It was a sensational breakthrough which opened the door to automatic programming and thus, in a direct line, to contemporary personal computers (PCs).
Today, computers are at the center of thousands upon thousands of other inventions. They are the heartbeats of the modern world. Computers are everywhere – from kitchens to concrete mixers, from planes to pockets. They listen. They speak. They act. Never in world history has one invention had such an influence on humanity as a whole. Without the computer age, there would be no global awareness.
The Internet, in particular, has created a brand-new environment. A new culture has been born – free, rapid, and universal – in which people share their knowledge and expertise. Information and communication techniques have been turned upside down: distance has been eliminated, frontiers abolished. A tremendous interactive potential is burgeoning on our planet Earth today. Like it or not – no one can stop it!
I would like to mention something concerning the Internet. The inventors of the World Wide Web (WWW), which in 1990 revolutionized the contemporary computer world, did not become millionaires. The Briton Tim Berners-Lee and the Belgian Robert Cailliau, both researchers at the European Centre for Nuclear Research (CERN) in Geneva, did not make any money from their invention of the WWW. They refused to patent it, fearing that in so doing they would make use of the Web prohibitively expensive and prevent its spread worldwide. Thus, they passed up a fortune so that our world can...