How to calculate error, tolerance & accuracy when calibrating a digital multimeter with a Fluke 5500A Calibrator.


I am looking for a manual calibration procedure (how to calibrate) for a digital multimeter using a Fluke 5500A Calibrator. Does anyone have experience calibrating these? I also need a sample calibration test result sheet. I currently have one calibration test result sheet, which is attached, with my doubted areas highlighted in yellow. I need a detailed explanation of how to calculate those values.

rvrnly:

This is your third question in a series of questions about calibrating digital multimeters. As I noted in my previous answers, Fluke has very good documentation on the subject. It appears that you have gotten some of that information, but are not understanding it fully.

Let me see if I can assist you a little further by defining the terms that appear to be confusing you.

ERROR – The difference between the measured value and the expected value. Example: if the calibrator is outputting 75.0 volts DC, the multimeter should measure 75.0 volts DC. If it actually measures 75.1 volts DC, then the error is 75.1 – 75.0 = 0.1 volt.
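A minimal sketch of that arithmetic in Python (the variable names and the 75 V example values are mine, taken from the example above, not from any Fluke procedure):

```python
# Absolute error of a single calibration point.
expected = 75.0   # calibrator output (V DC)
measured = 75.1   # multimeter reading (V DC)

error = measured - expected
print(f"Error: {error:+.1f} V")   # -> Error: +0.1 V
```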

TOLERANCE – Usually provided as a percentage of the expected value; it can be plus or minus. To compare a reading against it, express the error as a percentage of the expected value: Percent Error = (Measured Value – Expected Value) / Expected Value × 100%. In the above case, (75.1 – 75.0) / 75.0 = 0.13%.
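The same calculation as a Python sketch, again using the assumed 75 V example values:

```python
# Error expressed as a percentage of the expected value,
# matching the 0.13% figure above.
expected = 75.0   # calibrator output (V DC)
measured = 75.1   # multimeter reading (V DC)

percent_error = (measured - expected) / expected * 100.0
print(f"Percent error: {percent_error:.2f}%")   # -> Percent error: 0.13%
```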

Tolerance is a measure of accuracy. It is typically defined or specified by the manufacturer of the device in question, and a reading passes calibration if its error falls within that specification.
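To illustrate that pass/fail check, here is a sketch using the ±(% of reading + counts) format that many handheld DMM accuracy specifications follow. The 0.5% + 2 counts figure and the 0.1 V count resolution are assumptions for illustration only, not the specification of any particular meter; check your meter's manual for the real numbers.

```python
# Pass/fail check of one reading against an assumed accuracy spec
# of the common form +/-(% of reading + counts).
expected   = 75.0    # calibrator output (V DC)
measured   = 75.1    # multimeter reading (V DC)
pct_of_rdg = 0.5     # assumed spec: percent of reading
counts     = 2       # assumed spec: counts (least significant digits)
resolution = 0.1     # assumed value of one count on this range (V)

# Tolerance is the total allowed deviation from the expected value.
tolerance = expected * pct_of_rdg / 100.0 + counts * resolution
low, high = expected - tolerance, expected + tolerance

status = "PASS" if low <= measured <= high else "FAIL"
print(f"Limits: {low:.3f} .. {high:.3f} V -> {status}")
# -> Limits: 74.425 .. 75.575 V -> PASS
```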