Tech Forum

October 2013

Analog vs. Digital Meter

I have an old Triplett analog multimeter and a new, no-name digital multimeter. A friend told me that, even though I can read out voltage to the second decimal place, my old Triplett is more accurate. I'm confused about the accuracy/precision difference. Can you clarify?

#101310
Katy Aricanli
Phoenix, AZ



Answers

Your question reflects common confusion among the interrelated terms accuracy, resolution, linearity, and offset as they apply to measuring instruments. The intended application ought also to be considered.


Your Triplett uses a moving-coil instrument having a pointer attached to the coil and a set of printed scales behind the pointer: The position of the pointer has a direct, analogous relationship to the amount of current flowing through the meter movement, hence the name "analog" in describing the instrument.  The value of the current, voltage, or resistance being measured is interpolated from the appropriate printed scale at the point directly under the pointer position.


The digital voltmeter samples the applied voltage and produces a digital reading proportional to that voltage; the digital value presented is derived from a binary counter/register. Conversion from the analog measurement to digital presentation is commonly effected by feeding the digital readout value back through a digital-to-analog converter and comparing the converter voltage output to the (scaled) voltage of the original sample; the readout count is advanced until the comparison is "equal" (see next), at which point the digital readout indicates — as best as possible — the voltage being measured. In this implementation, the internal system counts in binary and the least-significant counter bit is not displayed, whence the displayed value cannot be closer to the true value than ±½ the value of the least-significant digit in the display.
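

If it helps to see that counting scheme in concrete form, here is a minimal sketch in Python (not any meter's actual firmware; the 2000-count range, reference scaling, and single hidden bit are assumptions chosen only to illustrate the idea):

  # Toy model of the feedback conversion described above: an internal counter
  # (with one undisplayed least-significant bit) is advanced until a DAC of its
  # value reaches the sampled input, then the hidden bit is dropped for display.
  def counting_adc(v_in, full_scale=2.0, display_counts=2000):
      lsd = full_scale / display_counts      # value of one displayed digit step
      step = lsd / 2                         # the undisplayed least-significant bit
      count = 0
      while count * step < v_in and count < 2 * display_counts:
          count += 1                         # comparator not yet "equal": keep counting
      return round(count / 2) * lsd          # displayed value; error is within +/- 1/2 LSD

  print(f"{counting_adc(1.2345):.3f} V")     # displays 1.234 or 1.235 for a 1.2345 V input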


Both systems are affected by accuracy and linearity. The accuracy of the instrument expresses the limitation on its ability to indicate the true value of the quantity being measured. "Accuracy" is expressed as a fractional deviation from unity: For a voltmeter, for example, the error fraction is [1 - (Vindicated/Vtrue)]. Thus, if a DC voltage is truly 100 volts but the instrument indicates 101 volts (or 99 volts), then the instrument is in error by ±1/100 and its accuracy is said to be ±1%. Accuracy is affected by environmental conditions as well as by inherent inaccuracies in the instrument's component parts (internal voltage dividers, etc.).
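

As a quick worked example of that fraction (the 101- and 99-volt readings are the hypothetical ones above, not any particular meter's spec):

  # Fractional deviation from unity, per the expression above.
  def error_fraction(v_indicated, v_true):
      return 1 - (v_indicated / v_true)

  print(f"{error_fraction(101, 100):+.2%}")  # -1.00%: the meter reads 1% high
  print(f"{error_fraction(99, 100):+.2%}")   # +1.00%: the meter reads 1% low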


The linearity of the instrument expresses its ability to maintain its stated accuracy at any measurement value. Digital-to-analog converters have linearity issues and contribute this problem to digital measuring instruments. Analog instruments depend upon a linear relationship between pointer position and the restoring torque of the coil-position return spring.


Accuracy specifications commonly include the worst effects of linearity in the instrument for a measurement of any value within the specified measurement range, and within the stated environmental conditions.


Resolution is all too often confused with accuracy. Resolution relates to the ability of the observer to identify the measurement value being presented by the instrument. In an analog instrument, the resolution is the smallest value printed on the instrument scale (the least-significant "tick" on the scale). If your Triplett is like mine (a model 630PL), it has a voltage-measurement scale for the 0-10/0-50/0-250-volt ranges, and there are ten "ticks" between each numeric value: The resolution of the instrument depends, therefore, upon the voltage range in use, and is 0.2 volts on the 10-volt scale, 1 volt on the 50-volt scale, and 5 volts on the 250-volt scale. Stated another way, as there are a total of 50 ticks on the 10/50/250-volt scale, the instrument resolution is 1/50th (or 2%) of full-scale, and this is generally the way in which resolution is expressed for an analog instrument.
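

The per-range arithmetic is just the full-scale value divided by the number of ticks; a quick sketch, assuming the 50-tick scale described above:

  ticks = 50                            # total tick marks on the shared 10/50/250-volt scale
  for full_scale in (10, 50, 250):
      per_tick = full_scale / ticks
      print(f"{full_scale}-volt range: {per_tick} V per tick "
            f"({per_tick / full_scale:.0%} of full scale)")
  # 10-volt range: 0.2 V per tick (2% of full scale)
  # 50-volt range: 1.0 V per tick (2% of full scale)
  # 250-volt range: 5.0 V per tick (2% of full scale)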


In a digital instrument, the resolution is equal to the number of digit positions being displayed, and herein lies the confusion between resolution and accuracy. Let us consider a 4-digit display (= its resolution), having an accuracy of ±1%. If a DC voltage of 50 volts (true value) is measured, the display might read "50.37". For ±1% accuracy, the instrument reading must be within the interval 49.5 to 50.5 volts. The best that we can expect is three digits' accuracy, and thus a reading of "50.37" creates an expectation of 0.1% accuracy, when in reality we should interpret the reading as 50.4 volts — that is, the displayed value rounded to three digits. Resolution and accuracy are independent qualities.
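

Put in numbers (the 50-volt source and ±1% specification are the hypothetical figures above):

  true_value = 50.0      # voltage actually applied
  accuracy = 0.01        # +/- 1% accuracy specification

  low, high = true_value * (1 - accuracy), true_value * (1 + accuracy)
  print(f"a +/-1% meter may legitimately read anywhere from {low} V to {high} V")  # 49.5 to 50.5

  reading = 50.37        # one such four-digit reading
  print(f"interpret it as {round(reading, 1)} V")   # 50.4 V: only three digits are meaningful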


But this brings up the matter of application: Digital instruments use a continuous sequence of sampling and measurement to produce their readings. They are best used for steady-state conditions — e.g., a voltmeter on a bench power supply — else they will attempt to follow changing values and produce a blur of digits that is frequently impossible to read. I'm partial to analog instruments because I rarely need extreme resolution, but I like the inherent ability of the analog instrument to average over variations in the measurement value. For example, my old Prius has passed its warranty date and I'd like to tinker with the electrical system, at least to the point of inserting a zero-center ammeter measuring shunt into the high-voltage bus at the battery terminal. The current demand on the propulsion battery is rarely constant and can shift in value and even polarity from moment to moment. A moving-coil instrument is perfect for this, as it will give me a continuous indication of the magnitude and direction of current flow. This would never be possible using a digital instrument.


Finally, the matter of offset: Offset is a constant value difference between true value and indicated value. It is most important that your Triplett instrument be zeroed. With the pointer at rest and the selector switch in the OFF position, and with the instrument oriented in the position commonly used (either standing up or lying down), use the setscrew over the coil pivot point to adjust the pointer so that it lies directly over the "0" mark on the scale. When the instrument is used for resistance measurement, use the Ohms Adjust control with the test leads shorted together to ensure that the pointer lies over the "0" ohms mark on the scale.


Offsets are much less likely to occur in digital instruments because suitable compensations can be built into the design. The one exception that I can think of would be associated with high-input-impedance instruments (solid-state or vacuum-tube voltmeters, for example) in which electrochemical differences between measurement probes and the surfaces being probed might cause slight offset voltages for which internal compensation would not apply.


I hope the above discussion answers your question.


One of my frequent gripes is the marketing of the term "accurate". In my local hardware store there is a shelf containing outdoor thermometers. The advertising blurbs on the packages all state that the instrument has "guaranteed accuracy". "Accuracy" without a stated value is a meaningless term. It's easy to guarantee a meaningless statement. And of course, for the digital varieties of these instruments, the manufacturer is quite content to let the buyer equate resolution with accuracy. (The best way to buy a thermometer at the hardware store is to examine all of the specimens on the shelf of the model desired, and choose the one that best represents the group consensus, excluding those having markedly different readings.)

Peter Goodwin
Rockport, MA

This question reminds me of my days as a physics lab instructor when I was a graduate student. But I'll spare you the calculator and significant-digits issue.


In the case of your digital instrument, we have a similar situation: Just because the meter shows a voltage to two or three decimal places does not mean it can actually measure that accurately, or that it was calibrated to that level of accuracy.


Most meters come with specifications which tell us how accurate the measurement is likely to be. I have a DT-830B meter here in my desk at work, so I went to the web for the "user's manual for DT830B" and found one.


For resistance it says:
Range: 200 Ohms
Resolution: 0.1 Ohm
Accuracy: +/- 1.2% +/- 5D


For a reading of 100.0 Ohms, +/- 1.2% is +/- 1.2 Ohms, so when the meter reads 100.0 Ohms we can only know that the value is somewhere between 98.8 and 101.2 Ohms, plus the +/- 5D term.


Nowhere is +/- 5D explained. It should mean +/- 5 counts in the least significant digit (+/- 0.5 Ohm on this range), which widens the interval to 98.3 - 101.7 Ohms.


That being the case, if the meter reads 2.0 Ohms we have +/- 1.2% or 1.976 - 2.024 Ohms, but we can't see that since the meter only goes to #.#, BUT +/- 5D means the meter could show anything between 1.5 and 2.5 Ohms, which is +/- 25% of the measured value! You see, the +/- 5D term becomes more significant the smaller the measured value becomes, whereas the percentage term follows the measured value.
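

That "percent-of-reading plus counts" arithmetic is easy to wrap in a little helper; a sketch using the figures from the manual quoted above (1.2%, 5 digits, 0.1-Ohm resolution on the 200-Ohm range):

  # Interval allowed by a "+/- pct% of reading +/- N digits" specification.
  def reading_bounds(reading, pct, digits, resolution):
      spread = reading * pct / 100 + digits * resolution
      return reading - spread, reading + spread

  for r in (100.0, 2.0):
      lo, hi = reading_bounds(r, 1.2, 5, 0.1)
      print(f"reading {r:5.1f} Ohms: true value between {lo:.3f} and {hi:.3f} Ohms")
  # reading 100.0 Ohms: true value between 98.300 and 101.700 Ohms
  # reading   2.0 Ohms: true value between 1.476 and 2.524 Ohms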


To know which meter is better, you would have to find these specifications for each meter. But also know this: "Accuracies are guaranteed for 1 year, 23 degrees C +/- 5 degrees, less than 75% RH," or whatever your meter manufacturer claims.


It also depends on whether we can trust the standards the instrument was calibrated against. With an unknown meter, made who knows where, we don't know what standards they calibrate against, which is why I test anything I buy against my best meter.


Now, how would we create our own voltage and resistance standards?

Philip Karras
via email

The difference between accuracy and precision is important. I have read several technical articles lately which confuse the two terms. Accuracy is a measure of how close the indicated value is to the actual value.


For instance, a 100 ohm, 1% resistor must be between 99 and 101 ohms. If you measure it with a DVM and it reads 99.9 ohms, you have measured it with a precision of 0.1 ohms. If your DVM reads 200.53 ohms, you have measured it with a precision of 0.01 ohms, but the accuracy is terrible.


A good analog meter with a mirror scale can usually be read with a precision of three digits, for instance: using a 5V range you should be able to resolve within 0.01 volts. An average DVM, 3 1/2 digits, should be able to resolve within 1 millivolt on a similar range.


Both of these examples refer only to the precision of the device - not the accuracy. The accuracy is determined by other mechanisms. For the DVM it is both the A/D and its reference. For the analog meter, it is the linearity of the meter movement. For both devices, the accuracy is also determined by the other system components such as the resistors which form the voltage dividers.

Larry Cicchinelli
via email

Precision is a subjective term when applied to meters, especially when comparing analog (i.e., your Triplett with its moving-needle movement) and digital meters. The main differences between the two meters are these:

  1) Display readability (numbers are easier to read than a meter needle)
  2) Input impedance (The Triplett's is a few tens of kilohms to a couple hundred kilohms vs. a couple MEGOHMS for the digital meter - this directly affects the absolute accuracy of the voltages you're measuring)


For a typical digital meter with 3-1/2 digits (a leading 1 plus 3 whole digits), the precision is defined thusly for each range:

  200mV range = +/- 100 uV (microvolts)
  2V range       = +/- 1mV (millivolts)
  20V range     = +/- 10 mV
  200V range   = +/- 100 mV
  2000V range = +/- 1V

As you can calculate, the precision of the digital for all ranges will be "0.05%".


For the Triplett, your precision will be determined by the THICKNESS of the meter needle next to the scale, the SIZE of the scale and its gradations, and YOUR EYESIGHT, as follows (I'm guessing as to the available ranges here!):

  1.5V range  = +/- 0.01V (10 millivolts)
  15V range   = +/- 0.1V (100 millivolts)
  150V range    = +/- 1V
  1000V range  = +/- 10V

As you can calculate, the "precision" for the Triplett works out to about "0.67%" of full scale on the 1.5/15/150-volt ranges (and 1% on the 1000-volt range).
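

Both of those percentages come from the same calculation, the smallest readable step divided by the range's full-scale value; a quick sketch using two of the ranges assumed above:

  # Smallest readable step as a percentage of the range's full-scale value.
  def precision_pct(step, full_scale):
      return step / full_scale * 100

  print(f"{precision_pct(0.001, 2):.2f}%")   # 0.05% -- 3-1/2 digit DMM on its 2 V range
  print(f"{precision_pct(0.01, 1.5):.2f}%")  # 0.67% -- Triplett on its (guessed) 1.5 V range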


Therefore, with the figures above, it looks like the digital is more precise, measurement-wise, than the Triplett. HOWEVER, this doesn't take into account the environment you're measuring in, the quality of the test leads, the quality of the measured signal (i.e., "noise"), and other factors that can make the Triplett more precise than the digital, because analog meters generally aren't as susceptible to noise in the signal as digitals are. Again, with analog meters (i.e., the Triplett), if you don't/can't read the meter properly, any precision will be wasted on a poor measurement (e.g., using the AC scale for a DC measurement!). Therefore, it's up to the meter user to ensure things are done right.

Ken Simmons
Auburn, WA

First of all, few analog VOMs (volt-ohm-milliammeters) are as accurate/precise as even the cheapest DMM (digital multimeter). Accuracy refers to how close a meter's reading is to the actual voltage. Precision refers to how finely you can resolve this reading. The terms are not interchangeable. A DMM with 8-1/2 digits of readout may be precise, but isn't necessarily accurate if it is incorrectly calibrated. A DMM with 2-1/2 digits of readout may not be that precise, but could be more accurate than that poorly-calibrated 8-1/2 digit DMM. High precision allows you to track small changes in the quantity being measured. But for all practical purposes, it's difficult to find a high-precision DMM that is not also very accurate if it's been calibrated to specifications recently.


You may contact me through the N&V Forum and request an article I wrote comparing all sorts of meters: VOMs, VTVMs, TVMs, DMMs and differential voltmeters. It's a freebie in .pdf form that I'd be glad to e-mail to you.  If enough readers request it, I'll simply post it on the N&V Forum, so stay tuned!  The Forum is at http://forum.nutsvolts.com where you click on the "General Discussion" forum.

Dean Huster
via email

First, any time you measure a voltage relative to ground, current flows through the meter and produces an error inversely proportional to the resistance of the meter. The older analog meter has a higher resistance per measured volt and causes a smaller error: typically, an expensive analog meter may have 2 megohms per volt, while a cheap digital might have 20 kilohms per volt.
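

The size of that error is easy to estimate as a simple voltage divider. Here is a sketch applying the two sensitivities quoted above on a hypothetical 10-volt range, probing a point with a 100-kilohm source resistance (all figures purely illustrative):

  # Voltage the meter actually indicates once its own resistance loads the circuit.
  def loaded_reading(v_true, r_source, r_meter):
      return v_true * r_meter / (r_source + r_meter)

  v_true = 10.0       # open-circuit voltage of the point being probed
  r_source = 100e3    # source resistance of that point, ohms

  # 2 megohms/volt and 20 kilohms/volt, each on a 10-volt range:
  for r_meter in (2e6 * 10, 20e3 * 10):
      v = loaded_reading(v_true, r_source, r_meter)
      print(f"meter resistance {r_meter/1e6:4.1f} megohm: reads {v:.2f} V "
            f"({(v_true - v) / v_true:.1%} low)")
  # meter resistance 20.0 megohm: reads 9.95 V (0.5% low)
  # meter resistance  0.2 megohm: reads 6.67 V (33.3% low)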


Accuracy is how close the average of a large number of measurements comes to the true value; precision is getting the same measured value every time. The difference is kind of like looking at shots on a target: Accuracy is a pattern centered on the bullseye, even if it's spread out, while precision is a tight group somewhere on the target. Ideally, you want both, but it is easier to compensate for the difference between the center of the target (the real value) and a tight shot group (the measured value) than it is to compensate for an error of unknown size and unknown direction from center.

Jack Mowery
Amarillo, TX