A voltmeter is an instrument used for measuring the electrical potential difference between two points in an electric circuit. Analog voltmeters move a pointer across a scale in proportion to the voltage; digital voltmeters give a numerical display of voltage by means of an analog-to-digital converter.

Voltmeters are made in a wide range of styles. Instruments permanently mounted in a panel are used to monitor generators or other fixed apparatus. Portable instruments, usually equipped to also measure current and resistance in the form of a multimeter, are standard test instruments in electrical and electronics work. Any measurement that can be converted to a voltage can be displayed on a suitably calibrated meter; examples include pressure, temperature, flow or level in a chemical process plant.

General-purpose analog voltmeters may have an accuracy of a few per cent of full scale, and are used with voltages from a fraction of a volt to several thousand volts. Digital meters can be made with high accuracy, typically better than 1%. Specially calibrated test instruments have higher accuracies, with laboratory instruments capable of measuring to a few parts per million. Meters using amplifiers can measure tiny voltages of microvolts or less.

Part of the problem of making an accurate voltmeter is calibration to check its accuracy. In laboratories, the Weston cell is used as a standard voltage for precision work; precision voltage references based on electronic circuits are also available. To ensure that a digital voltmeter's readings are within the manufacturer's specified tolerances, it should be periodically calibrated against a voltage standard such as the Weston cell.

Digital voltmeters necessarily have input amplifiers and, like vacuum-tube voltmeters, generally have a constant input resistance of 10 megohms regardless of the set measurement range.
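The 10-megohm input resistance mentioned above matters because a real meter draws some current from the circuit it probes, slightly pulling the measured voltage down. A minimal numerical sketch of this loading effect (the source-impedance values here are assumed examples, not from the text):

```python
def indicated_voltage(v_source, r_source, r_meter=10e6):
    """Voltage actually seen by a meter with input resistance r_meter
    connected across a source with open-circuit voltage v_source and
    output resistance r_source (simple voltage-divider loading model)."""
    return v_source * r_meter / (r_source + r_meter)

# A stiff (low-impedance) source is barely disturbed by the meter...
low_z = indicated_voltage(5.0, r_source=100.0)   # ~4.99995 V
# ...but a 1-megohm source loses about 9% of its voltage to meter loading.
high_z = indicated_voltage(5.0, r_source=1e6)    # ~4.545 V
```

This is why a high, constant input resistance is a selling point of digital and vacuum-tube voltmeters: the loading error stays small and predictable across ranges.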
A digital voltmeter, or DVM, is used to take highly accurate voltage measurements. These instruments measure the electrical potential difference between two conductors in a circuit. DVMs are electronic voltmeters and the preferred standard, as they offer several benefits over their analog counterparts.

Voltmeters measure the gain or loss of voltage between two points in a circuit. The leads are connected in parallel, one on each side of the element being tested: the positive terminal of the meter is connected on the side closest to the power supply, and the negative terminal on the far side of the element under test. The analog dial or digital display then shows the voltage measurement.

In this circuit, an Atmel AT89C51 microcontroller controls the ADC and the LCD display. An ADC0804 performs the analog-to-digital conversion. Port 2 of the AT89C51 is interfaced with the LCD data lines, and the LCD control lines are connected to Port 3: Register Select – P3.0, Read/Write – P3.1, Enable – P3.2. An 11.0592 MHz crystal provides the clock.

Theory behind the voltage divider circuit

The input voltage to the ADC must not exceed 5 V, while the maximum input voltage to the voltmeter is 15 V. The voltage divider is therefore designed as follows:

Vip = Vmax × R2 / (R1 + R2)

where Vmax is the maximum input voltage to the voltmeter, Vip is the input voltage to the ADC, and R1 and R2 are the resistances of the voltage divider circuit.
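The firmware's main job is to undo this scaling: the 8-bit ADC0804 reading (0–255 over a 5 V span) must be multiplied back up by the divider ratio to recover the actual input voltage. A host-side sketch of that conversion, assuming the R1 = 200 kΩ, R2 = 100 kΩ divider values used in this design (so a 3:1 ratio):

```python
# Model of the scaling the microcontroller firmware must perform.
ADC_FULL_SCALE_V = 5.0   # ADC0804 full-scale input, per this design
ADC_MAX_COUNT = 255      # 8-bit converter
DIVIDER_RATIO = 3.0      # (R1 + R2) / R2 = 300k / 100k

def adc_count_to_input_volts(count):
    """Convert an 8-bit ADC reading back to the voltage at the voltmeter input."""
    v_adc = count * ADC_FULL_SCALE_V / ADC_MAX_COUNT  # voltage at the ADC pin
    return v_adc * DIVIDER_RATIO                      # undo the divider

# Full-scale ADC reading corresponds to the 15 V maximum input:
# adc_count_to_input_volts(255) -> 15.0
```

The resolution of this voltmeter is therefore 15 V / 255 ≈ 59 mV per count, which the firmware would format for the LCD.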
Digital voltmeters (DVM)
The first digital voltmeter was invented and produced by Andrew Kay of Non-Linear Systems (and later founder of Kaypro) in 1954. Digital voltmeters are usually designed around a special type of analog-to-digital converter called an integrating converter. Voltmeter accuracy is affected by many factors, including temperature and supply voltage variations.
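A common integrating converter is the dual-slope type (the text names only "integrating converter"; dual-slope is one widely used example): the unknown voltage charges an integrator for a fixed time, then a known reference of opposite polarity discharges it while a counter runs. The final count is proportional to the input and independent of the integrator's RC value, which is what makes the design accurate and drift-tolerant. An idealized numerical sketch, with assumed reference and timing values:

```python
def dual_slope_count(v_in, v_ref=5.0, t_fixed_counts=1000):
    """Idealized dual-slope conversion: integrate v_in for a fixed number
    of clock counts ("run-up"), then count how many clock periods the
    reference takes to ramp the integrator back to zero ("run-down").
    Result: count = t_fixed_counts * v_in / v_ref, independent of RC."""
    charge = v_in * t_fixed_counts   # integrator output after run-up (arbitrary units)
    count = 0
    while charge > 0:                # run-down against the known reference
        charge -= v_ref
        count += 1
    return count

# dual_slope_count(2.5) -> 500   (half the reference -> half the full-scale count)
```

Because the same clock and the same integrator are used for both slopes, their tolerances cancel out; only the reference voltage must be accurate.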
Microcontroller Based Digital Voltmeter
Let us take R1 as 200 kΩ; then R2 must be 100 kΩ. The maximum current through the divider is approximately Imax = (Vmax − Vipmax)/R1, where Vmax = 15 V, Vipmax = 5 V and R1 = 200 kΩ, giving Imax ≈ 10 V / 200 kΩ = 50 µA.
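The divider design can be verified numerically — the chosen resistors should put exactly 5 V at the ADC when the voltmeter input is at its 15 V maximum:

```python
# Numeric check of the divider design: R1 = 200 kOhm, R2 = 100 kOhm.
R1, R2 = 200e3, 100e3
V_MAX = 15.0

v_adc = V_MAX * R2 / (R1 + R2)   # voltage presented to the ADC at full scale
i_max = (V_MAX - v_adc) / R1     # current drawn through the divider

# v_adc -> 5.0 V, i_max -> 50 microamps
```

A 50 µA maximum draw means the divider loads the measured circuit only lightly, while the 100 kΩ seen by the ADC input stays low enough for the ADC0804 to sample accurately.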