Thermometers measure temperature by using materials that change in some measurable way when they are heated or cooled.
What is often considered the first modern thermometer, the mercury thermometer with a standardized scale, was invented by Daniel Gabriel Fahrenheit in 1714.
The Celsius temperature scale is also referred to as the "centigrade" scale; centigrade means "consisting of or divided into 100 degrees". The Celsius scale was invented in 1742 by the Swedish astronomer Anders Celsius.
Lord Kelvin took the process one step further with his invention of the Kelvin scale in 1848. The Kelvin scale is an absolute scale: it starts at absolute zero, the coldest temperature theoretically possible, and so can express the full range of hot and cold.
When you look at an ordinary outdoor bulb thermometer, you'll see a thin red or silver line that grows longer as it gets hotter and shorter in cold weather. The liquid in the tube is sometimes colored alcohol, but it can also be a liquid metal called mercury. Both mercury and alcohol expand when heated and contract when cooled. Inside the narrow glass tube of a thermometer, the liquid has nowhere to go but up when the temperature rises and down when it falls. Numbers placed alongside the glass tube mark the temperature when the line reaches that point.
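The relationship described above can be sketched numerically using the standard formula for volumetric thermal expansion, dV = V0 * beta * dT: the extra liquid volume is forced up the narrow bore, so a small temperature change produces a visible rise. The expansion coefficient and the bulb and bore dimensions below are illustrative assumptions, not values from the text.

```python
# Sketch of a liquid-in-glass thermometer using volumetric thermal
# expansion: dV = V0 * beta * dT. All numeric values are assumed for
# illustration (mercury's coefficient is roughly 1.8e-4 per deg C).

BETA_MERCURY = 1.8e-4  # volumetric expansion coefficient of mercury, 1/degC (approximate)

def column_rise_mm(bulb_volume_mm3, bore_area_mm2, delta_t_c, beta=BETA_MERCURY):
    """Return how far the liquid column rises (mm) for a temperature change."""
    expanded_volume = bulb_volume_mm3 * beta * delta_t_c  # extra liquid volume, mm^3
    return expanded_volume / bore_area_mm2                # pushed up the narrow bore

# Example: a 200 mm^3 bulb feeding a 0.005 mm^2 bore, warmed by 10 degC
rise = column_rise_mm(200.0, 0.005, 10.0)
print(f"column rises {rise:.1f} mm")  # 200 * 1.8e-4 * 10 / 0.005 = 72.0 mm
```

This also shows why the bore must be very narrow: the thinner the tube, the farther the same small volume change pushes the column, making the scale easier to read.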
The quartz thermometer, introduced by Hewlett-Packard in 1965, is a high-precision, high-accuracy temperature sensor. It measures temperature by measuring the frequency of a quartz crystal oscillator. The oscillator contains a specially cut crystal with a linear temperature coefficient of frequency, so measuring the temperature is essentially reduced to measuring the oscillator frequency. The high linearity makes it possible to achieve high accuracy over an important temperature range using only one convenient reference point for calibration: the triple point of water.
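Because the frequency varies linearly with temperature, converting a measured frequency back to a temperature is a one-line calculation. The sketch below assumes a simple linear model f = f0 + k*(T - T0) calibrated at the triple point of water; the reference frequency and sensitivity constants are illustrative assumptions, not Hewlett-Packard specifications.

```python
# Sketch of converting a quartz oscillator's frequency to temperature,
# assuming the linear frequency-temperature model described above.
# F0, T0, and K are illustrative calibration constants, not real specs.

F0 = 28_208_000.0  # oscillator frequency at the calibration point, Hz (assumed)
T0 = 0.01          # triple point of water, degC (the single calibration reference)
K = 35.4           # assumed linear sensitivity, Hz per degC

def temperature_c(frequency_hz, f0=F0, t0=T0, k=K):
    """Invert the linear model f = f0 + k*(T - t0) to recover temperature."""
    return t0 + (frequency_hz - f0) / k

# A reading 354 Hz above the reference frequency implies 10 degC of warming
print(temperature_c(F0 + 354.0))  # -> 10.01
```

The design point is that frequency can be counted digitally with very high resolution, which is what gives this inversion its precision.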