The accuracy of a digital tester is defined as the difference between its reading and the true value of the quantity being measured, under reference conditions. Accuracy is specified in the format (±xx% rdg ±xx dgt). The first term is an error expressed as a percentage of the reading, so it is proportional to the input. The second term is an error expressed in digits, and it is constant regardless of the input. "Rdg" stands for reading and "dgt" stands for digits; dgt denotes counts of the last significant digit of the digital display and is typically used to express the fixed error component of a digital tester.
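As a worked illustration, the two error terms can be combined into a total uncertainty band around a reading. The spec values below (±0.5% rdg ±3 dgt on a range whose last-digit resolution is 0.001 V) are hypothetical, chosen only to show the arithmetic:

```python
def accuracy_band(reading, pct_rdg, dgt, resolution):
    """Return the (low, high) bounds implied by a (±pct_rdg% rdg ±dgt dgt) spec.

    reading    -- displayed value
    pct_rdg    -- percentage-of-reading term (proportional to the input)
    dgt        -- count error in the last significant digit (constant)
    resolution -- value of one count of the last digit on the current range
    """
    # Proportional term: a fraction of the reading itself
    proportional = reading * pct_rdg / 100.0
    # Fixed term: a number of counts times the value of one count
    fixed = dgt * resolution
    total = proportional + fixed
    return reading - total, reading + total


# Hypothetical example: reading 5.000 V with a ±0.5% rdg ±3 dgt spec,
# where one count of the last digit equals 0.001 V.
low, high = accuracy_band(5.000, 0.5, 3, 0.001)
print(f"True value lies between {low:.3f} V and {high:.3f} V")
```

For this example the proportional term is 0.025 V and the fixed term is 0.003 V, so the true value lies within ±0.028 V of the 5.000 V reading. Note how the fixed ±3 dgt term dominates at small readings, which is why accuracy is worst near the bottom of a range.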