The Accuracy of most Process Instruments is usually specified in % of Span, or simply % Span. The calibration Span is defined as Upper Range Value (URV) minus Lower Range Value (LRV). For Zero-based instruments (LRV = 0), % Span is also known as % of Full Scale (% FS). Note that some instruments may be specified in % of Reading, or % of Reading + % of Span, so read the specification carefully.
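To see why the distinction matters, here is a minimal Python sketch comparing the two specification styles for the same 1° F deviation, using the range and test point from the worked example below (the variable names are illustrative):

    # Same 1 F deviation expressed two ways, using the example values
    # from this article: -20 to 120 F range, 50 F test point.
    lrv, urv = -20.0, 120.0
    span = urv - lrv                       # 140 F calibration Span
    std = 50.0                             # Calibration Standard value
    deviation = 1.0                        # Instrument error in engineering units

    pct_span = deviation / span * 100      # about 0.71 % of Span
    pct_reading = deviation / std * 100    # 2.00 % of Reading

    print(f"{pct_span:.2f}% of Span vs {pct_reading:.2f}% of Reading")

The same error looks nearly three times larger when quoted in % of Reading at this test point, which is why comparing accuracy specifications without converting them to a common basis can be misleading.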
The equation for % Span is:
% Span = ((INST – STD) / Span) * 100
INST is the Instrument reading, or output, in engineering units.
STD is the value of the Calibration Standard (or Reference Standard) Instrument.
Span is the Instrument’s Upper Range Value – Lower Range Value (or simply the Upper Range Value for Zero-based ranges).
% Span should be calculated at every calibration test point from 0 to 100% of Span (3 points minimum; 5 or more points are better for checking linearity).
Note that the % Span error will be negative when the Instrument reading is less than the Standard value.
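For reference, here is a minimal Python sketch of the calculation; the function name and signature are illustrative, not taken from any particular tool:

    def percent_span(inst, std, lrv, urv):
        # inst: Instrument reading (or output) in engineering units
        # std:  Calibration Standard (Reference Standard) value
        # lrv, urv: Lower and Upper Range Values of the Instrument
        span = urv - lrv
        return (inst - std) / span * 100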
Example:
-20 to 120° F range
The Instrument reads 49° F when the Standard reads 50° F.
Calculate % Span error at 50° F (midscale):
Span = URV – LRV = 120° F – (-20° F) = 140° F
% Span = ((INST – STD) / Span) * 100 = ((49° F – 50° F) / 140° F) * 100 = -0.71%
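Plugging the example values into the percent_span sketch above reproduces the hand calculation:

    error = percent_span(inst=49.0, std=50.0, lrv=-20.0, urv=120.0)
    print(f"{error:.2f}% of Span")   # prints: -0.71% of Span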
Conclusion:
Error Calculations can be tedious; let E & I Tech CalReportTool do them for you.
See also: Reading Accuracy Specifications by Transcat for information about reading and comparing accuracy specifications.