Accuracy
The accuracy of a digital tester is defined as the difference between the reading and the true value of a quantity measured under reference conditions. Accuracy is specified in the format (±xx% rdg ±xx dgt). The first portion is a percentage error relative to the reading, meaning it is proportional to the input. The second portion is an error, in digits, that is constant regardless of the input. "Rdg" stands for reading and "dgt" stands for digits. Dgt indicates counts of the last significant digit of the digital display and is typically used to express the constant error component of a digital tester.
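To see how such a specification translates into an actual error band, here is a minimal sketch in Python. The meter values used (±0.5% rdg ±2 dgt, 1 mV resolution) are hypothetical, chosen only for illustration; a real instrument's datasheet gives the figures for each range.

    # Sketch: convert a (±% rdg ±dgt) spec into an error limit.
    # All instrument values below are hypothetical examples.

    def accuracy_band(reading, pct_of_reading, digits, resolution):
        """Return the ± error limit for a reading, given a spec of
        (±pct_of_reading% rdg ±digits dgt), where resolution is the
        value of one count in the last displayed digit."""
        proportional_error = reading * pct_of_reading / 100.0  # scales with input
        constant_error = digits * resolution                   # fixed, in counts
        return proportional_error + constant_error

    # Example: a 1.000 V reading on a meter specified at ±(0.5% rdg + 2 dgt)
    # with 1 mV resolution on the selected range.
    limit = accuracy_band(1.000, 0.5, 2, 0.001)
    print(f"True value lies within ±{limit:.3f} V")  # prints ±0.007 V

In this example the proportional term contributes 5 mV and the two-digit term contributes 2 mV, so the true value is guaranteed to lie within ±7 mV of the displayed 1.000 V.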