
Instrument Calibration
20 Most Common Interview Questions & Answers
1. What is calibration?
Calibration is the process of comparing the measurement values of a device under test (DUT) with those of a calibration standard of known accuracy. The goal is to detect, correlate, report, or eliminate by adjustment any variation in the accuracy of the instrument being calibrated.
2. Why is calibration important?
Calibration is crucial for ensuring the accuracy, reliability, and safety of processes. It ensures that measurements are consistent and trustworthy, which is vital for quality control, regulatory compliance (e.g., ISO 9001), process efficiency, and preventing equipment failure or accidents.
3. What is the difference between accuracy and precision?
- Accuracy: How close a measured value is to the true or accepted value.
- Precision: How close repeated measurements are to each other (repeatability). An instrument can be precise without being accurate.
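As a simple illustration (hypothetical readings, not part of the standard interview answer), accuracy relates to the offset of the mean reading from the true value, while precision relates to the spread of repeated readings:

```python
import statistics

true_value = 100.0  # known reference value (hypothetical units)

# Repeated readings from a hypothetical gauge: tightly grouped but biased high
readings = [102.1, 102.0, 102.2, 101.9, 102.1]

mean_reading = statistics.mean(readings)
accuracy_error = mean_reading - true_value     # systematic offset -> poor accuracy
precision_spread = statistics.stdev(readings)  # small spread -> good precision

print(f"Mean reading:   {mean_reading:.2f}")
print(f"Accuracy error: {accuracy_error:+.2f}")   # biased, but repeatable
print(f"Precision (1s): {precision_spread:.3f}")  # tight grouping
```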
4. What is a calibration standard?
A calibration standard is a highly accurate instrument or artifact that has a known and documented relationship to a national or international standard (like those from NIST or NPL). It is used as a reference to calibrate other, less accurate instruments.
5. What is traceability in calibration?
Traceability is an unbroken chain of comparisons that links a measurement from a field instrument back to a primary national or international standard. Each step in the chain includes information about the measurement uncertainty, ensuring the validity of the final calibration.
6. What is the difference between "As Found" and "As Left" data?
- As Found: The measurement data recorded before any adjustments are made to the instrument. This shows the instrument's performance since its last calibration.
- As Left: The data recorded after the instrument has been cleaned, repaired, or adjusted. This shows its condition at the end of the calibration process.
7. What is meant by "zero and span" adjustment?
This is a common two-point calibration method:
- Zero Adjustment: Corrects the instrument's reading at the lowest point of its measurement range.
- Span Adjustment: Corrects the instrument's reading at the highest point of its range. Adjusting the span affects the slope of the instrument's response curve.
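One way to picture zero and span is as the intercept and slope of a linear input-to-output mapping. The sketch below assumes a hypothetical 0-100 psi transmitter with a 4-20 mA output and shows the ideal output at any input; actual adjustment procedures vary by instrument.

```python
def ideal_output_ma(pressure_psi, lrv=0.0, urv=100.0, out_lo=4.0, out_hi=20.0):
    """Map a process input to the ideal 4-20 mA output of a linear transmitter.

    lrv/urv: lower and upper range values set by the zero and span
    (hypothetical 0-100 psi range).
    """
    fraction = (pressure_psi - lrv) / (urv - lrv)  # 0.0 at zero, 1.0 at full span
    return out_lo + fraction * (out_hi - out_lo)

# Zero point (0 psi) should read 4 mA; span point (100 psi) should read 20 mA
print(ideal_output_ma(0.0))    # 4.0
print(ideal_output_ma(50.0))   # 12.0
print(ideal_output_ma(100.0))  # 20.0
```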
8. What is hysteresis in an instrument?
Hysteresis is the difference in an instrument's output for the same input value, depending on whether the input was reached by increasing or decreasing from a previous value. It's typically checked during a multi-point calibration by taking readings both upscale and downscale.
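Hysteresis is usually quantified as the largest difference between the upscale and downscale readings at the same test point, often expressed as a percentage of span. A minimal sketch with hypothetical readings:

```python
test_points = [0, 25, 50, 75, 100]                 # % of span
upscale   = [4.00, 8.02, 12.05, 16.03, 20.00]      # mA, input increasing
downscale = [4.01, 8.06, 12.10, 16.05, 20.00]      # mA, input decreasing
span_ma = 20.0 - 4.0

hysteresis = [abs(u - d) for u, d in zip(upscale, downscale)]
worst = max(hysteresis)
print(f"Max hysteresis: {worst:.3f} mA "
      f"({worst / span_ma * 100:.2f}% of span)")   # worst-case point
```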
9. How do you determine the calibration interval for an instrument?
The interval depends on several factors: the manufacturer's recommendation, the instrument's criticality, its history of stability (drift), the harshness of the operating environment, and any regulatory requirements. It can be shortened if the instrument is found out of tolerance frequently or lengthened if it consistently holds its calibration.
10. What is a 3-point or 5-point calibration?
This refers to the number of test points used across the instrument's range.
- 3-Point: Typically checks at 0%, 50%, and 100% of the range.
- 5-Point: A more thorough check, typically at 0%, 25%, 50%, 75%, and 100%. This is better for identifying non-linearity issues.
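A quick sketch of how the 5-point test values can be generated and compared against the ideal output for a hypothetical 4-20 mA transmitter (the "As Found" readings are invented for illustration):

```python
points_pct = [0, 25, 50, 75, 100]                       # 5-point check
ideal_ma  = [4.0 + p / 100 * 16.0 for p in points_pct]  # ideal 4-20 mA outputs
found_ma  = [4.02, 8.05, 12.11, 16.06, 20.01]           # hypothetical "As Found" readings

for pct, ideal, found in zip(points_pct, ideal_ma, found_ma):
    error_pct_span = (found - ideal) / 16.0 * 100       # error as % of span
    print(f"{pct:>3}%: ideal {ideal:5.2f} mA, found {found:5.2f} mA, "
          f"error {error_pct_span:+.2f}% of span")
```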
11. What is measurement uncertainty?
Measurement uncertainty is a parameter that quantifies the doubt about the result of a measurement. It is a range within which the true value is believed to lie with a certain level of confidence. It accounts for all potential sources of error in the measurement process, including the standard, the technician, the environment, and the DUT itself.
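Independent uncertainty contributions are commonly combined by root-sum-of-squares and then multiplied by a coverage factor (k = 2 for roughly 95% confidence). The contributions below are hypothetical standard uncertainties, not values from any particular calibration.

```python
import math

# Hypothetical standard uncertainties (all in the same unit, e.g. % of span)
contributions = {
    "reference standard": 0.02,
    "resolution of DUT":  0.01,
    "repeatability":      0.015,
    "environment":        0.005,
}

combined = math.sqrt(sum(u ** 2 for u in contributions.values()))
expanded = 2 * combined                      # coverage factor k = 2 (~95 % confidence)
print(f"Combined standard uncertainty: {combined:.4f}")
print(f"Expanded uncertainty (k=2):    {expanded:.4f}")
```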
12. What information should be on a calibration certificate?
A certificate should include: the unique ID of the instrument calibrated, the date of calibration, the calibration results ("As Found" and "As Left"), the standards used and their traceability, the environmental conditions, the measurement uncertainty, and the signature of the technician who performed the work.
13. What is a TUR (Test Uncertainty Ratio)?
TUR is the ratio of the tolerance (accuracy specification) of the device under test to the uncertainty of the calibration standard. A common rule of thumb is to have a TUR of at least 4:1, meaning the standard is at least four times more accurate than the instrument being calibrated. This ensures the standard contributes minimally to the overall uncertainty.
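As a rough worked example (hypothetical tolerances), the ratio is simply the DUT tolerance divided by the standard's uncertainty, and the 4:1 rule is a pass/fail check on that ratio:

```python
dut_tolerance = 0.5      # +/- tolerance of the instrument under test (e.g. % of span)
std_uncertainty = 0.1    # +/- uncertainty of the calibration standard (same units)

tur = dut_tolerance / std_uncertainty
print(f"TUR = {tur:.1f}:1")                                    # 5.0:1 here
print("Meets 4:1 rule" if tur >= 4 else "Does NOT meet 4:1 rule")
```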
14. What is instrument drift?
Drift is the gradual change in an instrument's measurement over time, even when the process variable is constant. It is a primary reason why periodic recalibration is necessary. The "As Found" data from a calibration helps to quantify this drift.
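Drift is often estimated from successive "As Found" errors at the same test point; dividing by the elapsed time gives a drift rate that can justify shortening or lengthening the calibration interval. The history below is hypothetical.

```python
from datetime import date

# Hypothetical "As Found" error at the 50% test point from successive calibrations
history = [
    (date(2022, 1, 10), 0.02),   # error in % of span
    (date(2023, 1, 12), 0.09),
    (date(2024, 1, 15), 0.17),
]

first_date, first_err = history[0]
last_date, last_err = history[-1]
years = (last_date - first_date).days / 365.25
drift_per_year = (last_err - first_err) / years
print(f"Observed drift: {drift_per_year:.3f}% of span per year")
```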
15. What steps would you take if you find an instrument is significantly out of tolerance?
First, I would document the "As Found" readings. Then, I would notify the relevant supervisor or quality department. This is critical because products or processes controlled by that instrument since its last good calibration may be suspect. After notification, I would proceed with troubleshooting, adjustment, or repair, and then perform the "As Left" calibration.
16. What is linearity in the context of instrumentation?
Linearity is a measure of how well an instrument's output follows a straight line over its entire operating range. A non-linear instrument might be accurate at its zero and span points but inaccurate in the middle of its range. A 5-point calibration is good for checking linearity.
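Linearity can be checked by fitting a straight line through the test data and looking at the largest deviation from that line, usually expressed as a percentage of span. A minimal sketch with hypothetical 5-point data:

```python
# Hypothetical 5-point data: input in % of span, output in mA
inputs  = [0, 25, 50, 75, 100]
outputs = [4.00, 8.10, 12.25, 16.10, 20.00]   # slight bulge mid-range
span_ma = 20.0 - 4.0

# Least-squares straight-line fit (slope and intercept) without external libraries
n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
         / sum((x - mean_x) ** 2 for x in inputs))
intercept = mean_y - slope * mean_x

deviations = [abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs)]
print(f"Max deviation from best-fit line: {max(deviations) / span_ma * 100:.2f}% of span")
```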
17. What is a bench calibration versus a field calibration?
- Bench Calibration: The instrument is removed from the process and taken to a controlled environment (a lab or workshop) for calibration. This allows for more stable conditions and the use of more precise equipment.
- Field Calibration: The instrument is calibrated in place, while it is still installed in the process. This is faster but can be affected by environmental factors.
18. What is a HART communicator and how is it used in calibration?
A HART (Highway Addressable Remote Transducer) communicator is a handheld device used to configure, diagnose, and trim "smart" instruments. For calibration, it can be used to read the sensor's output digitally, re-range the transmitter, and perform digital zero and span trims, which are often more accurate than analog potentiometer adjustments.
19. What is a loop calibration?
A loop calibration checks the entire measurement circuit, from the primary sensor to the final control element or display in the control room. This involves simulating an input at the transmitter and verifying that the correct value is shown on the HMI or DCS. It validates the integrity of the entire control loop, including wiring and I/O cards.
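In a 4-20 mA loop, the value shown on the HMI should match the simulated input after the standard linear scaling; a quick sketch of that check, assuming a hypothetical 0-150 °C display range:

```python
def ma_to_engineering(ma, eng_lo=0.0, eng_hi=150.0):
    """Convert a 4-20 mA loop signal to engineering units (hypothetical 0-150 °C range)."""
    return eng_lo + (ma - 4.0) / 16.0 * (eng_hi - eng_lo)

# Simulate 50% of range at the transmitter (12 mA) and compare with the HMI reading
simulated_ma = 12.0
expected_display = ma_to_engineering(simulated_ma)   # 75.0 °C
hmi_reading = 75.4                                    # hypothetical value read from the DCS/HMI
print(f"Expected {expected_display:.1f} °C, HMI shows {hmi_reading:.1f} °C, "
      f"loop error {hmi_reading - expected_display:+.1f} °C")
```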
20. What safety precautions are essential during calibration?
Safety is paramount. Key precautions include following Lock-Out/Tag-Out (LOTO) procedures, wearing appropriate PPE, being aware of the process fluids and pressures, and communicating with control room operators before making any changes to a live instrument to avoid upsetting the process.