What is calibration?
There are as many definitions of calibration as there are methods. According to ISA (the International Society of Automation), calibration is defined as “A test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions.”
The definition includes the capability to adjust the instrument to zero and to set the desired span.
Typically, calibration of an instrument is checked at several points throughout the calibration range of the instrument. The calibration range is defined as “the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.” The limits are defined by the zero and span values. The zero value is the lower end of the range. Span is defined as the algebraic difference between the upper and lower range values. The calibration range may differ from the instrument range, which refers to the capability of the instrument.
For example, an electronic pressure transmitter may have a nameplate instrument range of 0-to-750 pounds per square inch, gauge (psig) and an output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA. Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig and the zero output value is 4 mA. The input span is 300 psig and the output span is 16 mA.
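This zero-and-span relationship is just a linear scaling. As a minimal sketch of the arithmetic (the function and parameter names below are illustrative, not from any standard or library), the ideal output for any input in the calibration range can be computed like this:

```python
def expected_output_ma(pressure_psig, lrv=0.0, urv=300.0,
                       out_zero=4.0, out_span=16.0):
    """Ideal output of a linear 0-300 psig = 4-20 mA transmitter."""
    fraction = (pressure_psig - lrv) / (urv - lrv)  # position within input span
    return out_zero + fraction * out_span

for p in (0, 150, 300):                 # expect 4.0, 12.0, 20.0 mA
    print(f"{p:3d} psig -> {expected_output_ma(p):.1f} mA")
```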
CHARACTERISTICS OF A CALIBRATION:
Calibration Tolerance:
Every calibration should be performed to a specified tolerance. The terms tolerance and accuracy are often used incorrectly. The definitions for each are as follows:
Accuracy:
The ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent span or percent reading, respectively.
Tolerance:
Permissible deviation from a specified value; may be expressed in measurement units, percent of span, or percent of reading. By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated. Also, tolerances should be specified in the units measured for the calibration.
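To make the percent-span versus percent-reading distinction concrete, here is a small illustrative sketch (the function names and values are ours, not from any standard):

```python
def accuracy_percent_span(error, span):
    """Error expressed as a percentage of the full span."""
    return 100.0 * error / span

def accuracy_percent_reading(error, reading):
    """Error expressed as a percentage of the current reading."""
    return 100.0 * error / reading

# A 0.75 psig error on a 300 psig span is 0.25% of span,
# but 1.5% of a 50 psig reading.
print(accuracy_percent_span(0.75, 300))     # 0.25
print(accuracy_percent_reading(0.75, 50))   # 1.5
```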
For example, you are assigned to perform the calibration of the previously mentioned 0-to-300 psig pressure transmitter with a specified calibration tolerance of ±2 psig. The corresponding output tolerance would be:
2 psig ÷ 300 psig × 16 mA = 0.1067 mA
The calculated tolerance is rounded down to 0.10 mA, because rounding to 0.11 mA would exceed the calculated tolerance. It is recommended that both ±2 psig and ±0.10 mA tolerances appear on the calibration data sheet if the remote indications and output milliamp signal are recorded.
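The same conversion, including the round-down rule, can be expressed as a short sketch using the figures above (the function name is illustrative):

```python
import math

def output_tolerance_ma(tol_eng_units, input_span, output_span=16.0):
    """Convert an engineering-unit tolerance to mA, truncated *down*
    so the stated tolerance never exceeds the calculated one."""
    tol_ma = tol_eng_units / input_span * output_span   # 2/300*16 = 0.10667
    return math.floor(tol_ma * 100) / 100               # -> 0.10 mA

print(output_tolerance_ma(2.0, 300.0))  # 0.1
```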
Note that the manufacturer’s specified accuracy for this instrument may be ±0.25% full scale (FS). Calibration tolerances should not be assigned based on the manufacturer’s specification alone; they should be determined from a combination of factors. These factors include:
- Requirements of the process
- Capability of available test equipment
- Consistency with similar instruments at your facility
- Manufacturer’s specified tolerance
Example:
The process requires ±5°C; the available test equipment is capable of ±0.25°C; and the manufacturer’s stated accuracy is ±0.25°C. The specified calibration tolerance must fall between the process requirement and the manufacturer’s specified tolerance, and the test equipment must be capable of measuring to the tolerance needed. A calibration tolerance of ±1°C might be assigned for consistency with similar instruments and to meet the recommended accuracy ratio of 4:1.
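One way to sanity-check a proposed tolerance against the recommended 4:1 accuracy ratio is sketched below (the values are those of this example; the helper name is ours, and test-accuracy-ratio conventions vary between quality systems):

```python
def accuracy_ratio(calibration_tolerance, test_equipment_accuracy):
    """Test accuracy ratio (TAR): calibration tolerance vs. the
    accuracy of the standard used to check it."""
    return calibration_tolerance / test_equipment_accuracy

tol = 1.0    # proposed calibration tolerance, degC
std = 0.25   # test equipment accuracy, degC
ratio = accuracy_ratio(tol, std)
print(f"TAR = {ratio:.0f}:1")   # TAR = 4:1
assert ratio >= 4, "standard not accurate enough for this tolerance"
```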
Why is calibration important?
It makes sense that calibration is required for a new instrument. We want to make sure the instrument is providing an accurate indication or output signal when it is installed. But why can’t we just leave it alone as long as the instrument is operating properly and continues to provide the indication we expect? Instrument error can occur due to a variety of factors: drift, environment, electrical supply, addition of components to the output loop, process changes, and so on. Since a calibration is performed by comparing or applying a known signal to the instrument under test, errors are detected by performing a calibration. An error is the algebraic difference between the indication and the actual value of the measured variable.
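For illustration, a five-point as-found check against the ±2 psig tolerance from earlier might look like the sketch below (all applied and indicated values are made up):

```python
# As-found check: apply known inputs, record the indications, and
# compute error = indication - actual value at each point.
applied   = [0.0, 75.0, 150.0, 225.0, 300.0]   # known inputs, psig
indicated = [1.2, 76.0, 151.1, 226.3, 302.5]   # instrument readings, psig
tolerance = 2.0                                 # +/- psig

for actual, reading in zip(applied, indicated):
    error = reading - actual
    status = "PASS" if abs(error) <= tolerance else "FAIL"
    print(f"{actual:6.1f} psig: error {error:+.2f} psig  {status}")
```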
Typical errors that occur include:
- Span Error
- Zero Error
- Combined Zero and Span Error
- Linearization Error
Zero and span errors are corrected by performing a calibration. Most instruments are provided with a means of adjusting the zero and span of the instrument, along with instructions for performing this adjustment. The zero adjustment is used to produce a parallel shift of the input-output curve. The span adjustment is used to change the slope of the input-output curve. Linearization error may be corrected if the instrument has a linearization adjustment. If the magnitude of the nonlinear error is unacceptable and it cannot be adjusted, the instrument must be replaced.
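The parallel-shift and slope descriptions can be illustrated with a toy linear model (all coefficients below are invented for the example):

```python
# A linear instrument modeled as: output = span_gain * input + zero_offset.
# The zero adjustment shifts the whole curve up or down (changes zero_offset);
# the span adjustment changes its slope (changes span_gain).
IDEAL_GAIN = 16.0 / 300.0   # mA per psig for 0-300 psig = 4-20 mA
IDEAL_ZERO = 4.0            # mA at 0 psig

def output(pressure, span_gain, zero_offset):
    return span_gain * pressure + zero_offset

# As-found: combined zero and span error.
gain, zero = 0.0550, 4.3

zero = IDEAL_ZERO   # zero adjustment: parallel shift of the curve
gain = IDEAL_GAIN   # span adjustment: correct the slope

assert abs(output(150.0, gain, zero) - 12.0) < 1e-9   # midpoint restored
```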
To detect and correct instrument error, periodic calibrations are performed. Even if a periodic calibration reveals the instrument is perfect and no adjustment is required, we would not have known that unless we performed the calibration. And even if adjustments are not required for several consecutive calibrations, we will still perform the calibration check at the next scheduled due date. Periodic calibrations to specified tolerances using approved procedures are an important element of any quality system.
When should you calibrate your measuring device?
A measuring device should be calibrated:
- According to the recommendation of the manufacturer
- After any mechanical or electrical shock
- Periodically (annually, quarterly, monthly)

The hidden costs and risks associated with an uncalibrated measuring device could be much higher than the cost of calibration. Therefore, it is recommended that measuring instruments be calibrated regularly by a reputable company to ensure that errors associated with the measurements are within the acceptable range.