Pressure & temperature calibration notes
Electronic transmitters
The Importance of Calibration
Properly operating instruments are critical to plant safety and product quality. To calibrate a process instrument, the measurement signals present in the actual process installation must be simulated under controlled conditions.
Input and Output Measurement Standards
For an analog electronic pressure transmitter, a variable signal source provides pressure inputs at the same values used in the process, and a precision input standard measures the calibration input. A regulator is used to precisely control the input pressure. The range of the transmitter determines the input measurement standard. The output of the transmitter is connected in a series circuit that furnishes the 24 VDC transmitter power and measures the output signal from the transmitter. For an output signal range of 4-20 mA, the most appropriate output measurement standard is a milliammeter. The instrument should be mounted in the same position as it is installed in the process.
Connections
The input is connected so the precision gage indicates the pressure to the instrument under test. Locate the precision gage on a tee between the regulator and the transmitter. A bleed valve connected between the regulator and the transmitter releases pressure from the pneumatic input calibration circuit and allows the system to return to zero. The output of the transmitter is connected in series with the milliammeter and the power source.
Five Point Check
Calculate the input test points for the upscale and downscale check and the expected output values. The recommended five test points are typically 10%, 30%, 50%, 70%, and 90% of input span. Because 0% and 100% are extremes of the transmitter range, they are not recommended as test points.
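As a worked illustration, the test plan can be tabulated directly from the range values. The sketch below (Python) assumes a hypothetical 0-100 psi input range and the 4-20 mA output discussed above; substitute the actual range of the transmitter under test.

    # Minimal sketch: five-point test plan for a 4-20 mA pressure transmitter.
    # The 0-100 psi input range is a hypothetical example.
    input_lrv, input_urv = 0.0, 100.0     # input range, psi
    output_lrv, output_urv = 4.0, 20.0    # output range, mA

    input_span = input_urv - input_lrv
    output_span = output_urv - output_lrv

    for pct in (10, 30, 50, 70, 90):
        applied = input_lrv + input_span * pct / 100       # pressure to apply
        expected = output_lrv + output_span * pct / 100    # expected output
        print(f"{pct:3d}%: apply {applied:6.1f} psi, expect {expected:5.2f} mA")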
Accuracy of the Instrument
With the five point check complete, use the resulting values to determine if any errors are present and calculate the accuracy.
Accuracy = (Deviation / Span) * 100
Deviation = Expected Value - Actual Value
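Applied to recorded readings, the arithmetic looks like this; the milliamp values in the sketch below are fabricated for illustration only.

    # Minimal sketch: deviation and percent-of-span accuracy at each test point.
    # The (expected, actual) milliamp readings are fabricated for illustration.
    span = 20.0 - 4.0   # output span of a 4-20 mA transmitter, mA

    readings = [(5.60, 5.66), (8.80, 8.86), (12.00, 12.05),
                (15.20, 15.23), (18.40, 18.41)]

    for expected, actual in readings:
        deviation = expected - actual          # Deviation = Expected - Actual
        accuracy = deviation / span * 100      # Accuracy = (Deviation / Span) * 100
        print(f"expected {expected:5.2f} mA, actual {actual:5.2f} mA, "
              f"error {accuracy:+.2f}% of span")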
The zero shift is corrected first, since adjusting the span requires an accurate base point, or zero. Zero is typically adjusted at 10% input until a 10% output is produced. Next, adjust the span with 90% of input pressure applied: with the input at 90%, the span is adjusted to provide 90% of the output. Because zero and span adjustments interact, continue correcting until both are accurate.
Smart Pressure Transmitter Calibration
Configuration
Smart transmitters can communicate with a handheld configuration instrument. Smart transmitters are calibrated at the factory, but it is usually necessary to configure the transmitter to meet your process requirements.
Reranging
Configuration enables the user to select the appropriate range for a specific process within the transmitter's wide span. Reranging the pressure transmitter may be necessary to meet measurement requirements when the application of the transmitter changes or when a smart pressure transmitter is replaced. At the beginning of the configuration procedure it is usually necessary to enter information about the instrument and its function in the process. Once the upper and lower range values are entered and verified they can be downloaded directly to the transmitter.
Changing the Inputs
Enter the applicable measurement units by pressing the UNITS key on the interface device. Then press ENTER and PROCEED to confirm the units selected. Next, enter the upper and lower range values. Press the LRV key to select the lower range value, key in the value, and press ENTER. The interface device will display the lower range value, which is represented in the process by a 4 mA signal. The upper range value is set the same way, using the URV key; this value will be represented in the process by a 20 mA signal.
Differential pressure
Introduction
A differential pressure transmitter compares a high pressure value to a low pressure value.
Input and Output Measurement Standards
The differential pressure transmitter can be calibrated with a low pressure calibrator. A low pressure calibrator contains a regulator to adjust pressure output values to the transmitter, and a digital precision gage that can measure the calibrator output to the instrument under test. The output circuit for the calibration includes a milliammeter to monitor the output signal and a 24 VDC power source.
Connections
Connect the pressure source from the calibrator to the high pressure port on the transmitter. The transmitter low pressure port is vented to atmosphere. This allows differential pressure to be read directly without any calculations. The transmitter output is connected in a series circuit.
Five Point Check
Start the calibration with a five point upscale and downscale check. After you have collected the data, evaluate the readings to determine the instrument accuracy.
Accuracy of the Instrument
Correct the zero shift first by adjusting the input to 10% of the transmitter range. To correct the span error, first raise the input to 90% of the instrument's span. Then adjust the span until the appropriate milliamp output is indicated. Zero should be checked again after making a span adjustment because span and zero may interact. Continue rechecking and adjusting zero and span until the transmitter is calibrated within specifications.
Gages
Introduction
Pressure changes applied to the gage cause the elastic element to expand and contract. The movement of the element is translated into movement of the pointer through links, levers, and gears. Calibrating a pressure gage includes adjustment of these components until the gage reading accurately represents the input.
Input and Output Measurement Standards
The instrument under test (in this case, the gage) determines the calibration standards. First you need a source of pressure, best provided by a regulator. An input standard to measure the applied pressure is also needed; an appropriate input measurement standard for this calibration is a precision gage.
Connections
Use a tee to connect the precision gage to the source of pressure and the gage under test. Be sure the gage under test is mounted in the same orientation as it is in the process.
Five Point Check
Determine the five test points used for the upscale and downscale checks of the gage under test. With any link and lever instrument it is important that your entire upscale check be done in an upscale direction and your entire downscale check be done in a downscale direction.
Accuracy of the Instrument
The test results should be checked for accuracy within the manufacturer's specifications. If the results are outside those specifications, determine the type of errors present. On most motion balance instruments, adjust linearity first. Linearity is corrected at the midpoint of the range, so apply an input of 50% of span. Use a template to check the 90 Deg. angle, matching the linkage angle with the template. With linearity adjusted, position the pointer so the gage reads midscale; you may need to remove the pointer and reposition it on the shaft. Lower the input to 10% and adjust the zero so the gage reading equals the 10% input value. Then correct the span error: increase the input pressure to 90% and adjust the span until the gage reads this same 90% value. Repeat the zero and span adjustments until the readings at 10% and 90% are accurate. Because zero and span interact, rechecking is required for best results.
Thermocouples
Introduction
Thermocouples measure temperature and are used quite often in process controls.
How They Work
When wires of two different thermoelectrically homogeneous materials are joined at one end and placed in a temperature gradient, a thermoelectric voltage (EMF) is observed at the other end. The joined end is called the measuring junction. In the standard (ANSI) thermocouple color code, the red lead is negative, and the color of the other wire indicates the thermocouple type; on a J-type thermocouple, for example, the positive wire is white. Tables for each type of thermocouple list the voltages produced at various temperatures. Thermocouples should be checked whenever there are indications that the output is not accurate. It may also be necessary to check a thermocouple that will be used as a measurement standard.
Input and Output Measurement Standards
A temperature bath provides controlled temperatures for testing a sensor. A well in the temperature bath is used to hold the sensor during the accuracy check. Another well holds a measurement standard thermometer, which is used to confirm the actual bath temperature. A second measurement standard thermometer is used to read the ambient temperature at the reference junction. The sensor output signal and range determine the thermocouple output measurement standard; since the output is measured in millivolts, a millivolt meter is used to read the output.
Connections
Set up the temperature bath as a temperature input standard to the thermocouple. Select the output standard with the appropriate range for reading millivolts. Connect the red lead to the negative millivolt meter input and the white lead to the positive millivolt meter input.
Three Point Checks
Because no adjustments are possible, we can only check the calibration of a thermocouple sensor. This check is generally done at three input test values: ambient temperature, mid-range temperature, and the upper value of the application range. Recall that in a thermocouple it is the temperature difference between the reference junction and the measuring junction that produces the millivolt output. Before inserting the thermocouple into the bath, determine the ambient temperature, which represents the temperature at the reference junction of the thermocouple. When using look-up tables referenced to 0 Deg. C, you must compensate for ambient temperature: the millivolt value in the tables for the ambient temperature is added to the value from the sensor, and this compensated millivolt value is used to find the correct temperature in the tables.
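The compensation arithmetic can be sketched as follows. The table excerpt is a rounded type J illustration referenced to 0 Deg. C, and the ambient and measured values are hypothetical; use the full published tables for real work.

    # Minimal sketch: cold-junction compensation with 0 Deg. C-referenced tables.
    import bisect

    # Rounded type J table excerpt (Deg. C vs. mV), referenced to 0 Deg. C.
    T = [0.0, 20.0, 100.0, 200.0, 300.0]
    E = [0.000, 1.019, 5.269, 10.779, 16.327]

    def interp(xs, ys, x):
        """Linear interpolation within the table."""
        i = max(1, min(len(xs) - 1, bisect.bisect_left(xs, x)))
        return ys[i-1] + (ys[i] - ys[i-1]) * (x - xs[i-1]) / (xs[i] - xs[i-1])

    ambient_degC = 23.0   # reference-junction (ambient) temperature, hypothetical
    measured_mv = 9.90    # millivolt meter reading, hypothetical

    # Add the table EMF for ambient to the sensor reading, then convert to temperature.
    compensated_mv = measured_mv + interp(T, E, ambient_degC)
    bath_degC = interp(E, T, compensated_mv)
    print(f"compensated EMF {compensated_mv:.3f} mV -> bath {bath_degC:.1f} Deg. C")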
R.T.D.'s
Introduction
An accuracy check of a resistance temperature detector (RTD) is often necessary to validate the accuracy of a new RTD.
How They Work
There are three basic types of RTDs: 2-wire, 3-wire, and 4-wire units, all of which are wired in a bridge-configured circuit. An RTD is a metallic element whose resistance varies with temperature; by connecting it in one leg of a Wheatstone bridge, its resistance can be measured. 2-wire RTDs are susceptible to errors caused by changes in lead wire resistance. To compensate for these and other errors, 3-wire and 4-wire RTDs may be used when accuracy is required.
Input and Output Measurement Standards
The application range of the RTD determines the measurement standards to be used to check this instrument. An appropriate input standard would be a temperature bath. A standard thermometer must be placed in one of the wells to confirm the accuracy of the bath. RTDs do not require temperature compensation. A volt-ohmmeter or decade resistance box can be used as the output measurement standard, because they measure resistance. To check an RTD with a volt-ohmmeter, look-up tables that relate RTD resistance to temperature are needed.
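Where tables are not at hand, a platinum RTD reading can be cross-checked against the Callendar-Van Dusen equation using the standard IEC 60751 coefficients. The sketch below assumes a Pt100 element and a hypothetical ohmmeter reading.

    # Minimal sketch: Pt100 resistance-to-temperature conversion (above 0 Deg. C)
    # using the IEC 60751 Callendar-Van Dusen coefficients.
    import math

    R0 = 100.0        # ohms at 0 Deg. C (Pt100)
    A = 3.9083e-3
    B = -5.775e-7

    def pt_temperature(r_ohms):
        """Solve R = R0 * (1 + A*t + B*t**2) for t, in Deg. C."""
        return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohms / R0))) / (2.0 * B)

    measured = 119.40   # hypothetical reading at the mid-range test point
    print(f"{measured} ohms -> {pt_temperature(measured):.1f} Deg. C")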
Connections
The two red RTD leads are connected to the positive meter leads, and the black RTD lead is connected to the negative lead.
Three Point Check
Test RTDs at ambient temperature, mid-range temperature, and the high end of range temperature. RTDs cannot be calibrated; when a unit is not within the manufacturer's specifications, it must be replaced.
Thermocouple transmitters
Introduction
Calibration of temperature transmitters should be checked on a periodic basis.
Input and Output Measurement Standards
The temperature transmitter discussed here receives an input signal from a thermocouple. A millivolt input signal will be needed for calibration, so a millivolt source can be used for the input standard. A milliammeter can be used to measure the transmitter output. Use a standard thermometer to calculate the input signal compensation for the ambient temperature. Finally, a power supply for the transmitter is necessary.
To calibrate a temperature transmitter with a millivolt source as an input standard, you must compensate for any reference temperature other than 0 Deg. C (32 Deg. F).
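In practice the compensation means injecting the table EMF for the target temperature minus the table EMF for the reference temperature. A sketch, reusing the rounded type J excerpt from the thermocouple section (all values hypothetical):

    # Minimal sketch: millivolt value to inject when the reference junction is at ambient.
    import bisect

    # Rounded type J table excerpt referenced to 0 Deg. C (illustrative only).
    T = [0.0, 20.0, 100.0, 200.0, 300.0]
    E = [0.000, 1.019, 5.269, 10.779, 16.327]

    def interp(xs, ys, x):
        i = max(1, min(len(xs) - 1, bisect.bisect_left(xs, x)))
        return ys[i-1] + (ys[i] - ys[i-1]) * (x - xs[i-1]) / (xs[i] - xs[i-1])

    target_degC = 200.0    # temperature the transmitter should indicate, hypothetical
    ambient_degC = 24.0    # reference-junction temperature, hypothetical

    mv_to_inject = interp(T, E, target_degC) - interp(T, E, ambient_degC)
    print(f"set the millivolt source to {mv_to_inject:.3f} mV")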
Connections
To make the input connections, the location of the reference junction must first be determined. When thermocouple wires are used to connect the millivolt source to the transmitter, the reference junction is at the transmitter connection, so ambient temperature is measured in the transmitter housing. If copper wires are used, the reference junction is at the connection to the millivolt source, so measure ambient temperature at the millivolt source. Always observe the polarity of the leads: connect the negative output from the millivolt source to the negative transmitter terminal. Connect the milliammeter in series with the transmitter and the power supply.
Setting the Equipment
Adjust the millivolt source and the milliammeter to the proper values as required, then turn on the equipment and begin calibration.
Five Point Check
Perform a five point check to determine if the transmitter is accurate according to specifications.
Accuracy of the Instrument
Adjust the zero shift first, with an input value of 10%. With the zero properly set, a 10% input results in a 10% output. Adjust the span using a 90% input. Because zero and span may interact, check and readjust as required.
R.T.D. transmitters
A multifunction temperature calibrator is used to provide all the necessary input values and output measurements. The calibrator provides the RTD simulation for the transmitter inputs and a milliammeter to measure the transmitter outputs. RTD tables are not necessary because temperature can be input directly. Connect the RTD transmitter input and output terminals according to the diagram provided with the calibrator. The calibrator simulates the RTD resistance for the transmitter and displays the resulting milliamp output. Perform a five point upscale and downscale test. Zero is corrected at 10% input and adjusted for 10% output as shown on the milliammeter. Correct the span at 90%. Because zero and span interact, recheck and adjust as required.
Flow & Level Calibration Notes
Differential Pressure Transmitter Calibration
Introduction
In differential pressure flow measurement, as flow increases the differential pressure increases, and as flow decreases the differential pressure decreases. For example, an orifice plate is placed in a pipe to restrict the fluid flow. This restriction creates a pressure drop that can be converted to flow rate. The differential pressure transmitter measures the pressure drop created by measuring pressure at two different points, upstream and downstream. Differential pressure, then, is the difference between the higher pressure or upstream reading and the lower pressure or downstream reading. Differential Pressure = High Pressure - Low Pressure.
Input and Output Measurement Standards
Differential pressure is usually measured in inches of water.
Use a low pressure calibrator to both furnish and measure the input pressure.
A milliammeter is an appropriate output standard to measure the transmitter's output.
Differential Pressure Transmitter Connections
Connect the transmitter to the pressure calibrator as shown in the manufacturer's instructions for the calibrator. The air supply requirements for the calibrator are also found in the manufacturer's instructions. Connect the output from the pressure calibrator to the high pressure port on the transmitter to provide signal pressure. Vent the transmitter's low pressure port to atmosphere to provide a reference point for the differential pressure measurement. To measure the transmitter output, connect a milliammeter to the transmitter. Then connect a 24-volt power supply in series with the transmitter and milliammeter.
Differential Pressure Transmitter Five-Point Check
Typically, inputs at 10%, 30%, 50%, 70% and 90% of span are used as test points.
Check for Hysteresis. Hysteresis is the tendency of an instrument to give a different output for a given input, depending on whether the input resulted from an increase or decrease from the previous value.
Often the data from an instrument test is recorded on a calibration data sheet to help identify instrument errors.
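One simple way to quantify hysteresis from a calibration data sheet is to difference the upscale and downscale outputs at each test point. The readings in the sketch below are fabricated for illustration.

    # Minimal sketch: hysteresis as percent of span from up/down readings (fabricated data).
    span = 16.0   # output span of a 4-20 mA transmitter, mA

    points = (10, 30, 50, 70, 90)                     # percent of input span
    upscale = [5.62, 8.81, 12.02, 15.24, 18.43]       # mA, input increasing
    downscale = [5.70, 8.93, 12.12, 15.30, 18.44]     # mA, input decreasing

    for pct, up, down in zip(points, upscale, downscale):
        print(f"{pct:3d}%: hysteresis {(down - up) / span * 100:+.2f}% of span")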
Adjusting for Error Correction
Adjust the zero first, since span error is corrected only after an accurate zero is established. Zero is properly set when a 10% input produces a 10% output.
Adjust the span at 90%. Since zero and span frequently interact, after one of these errors has been corrected, the other may require readjustment.
Square Root Extractor
Flow rate, which may be represented by Q, is proportional to the square root of the pressure drop across a restriction: Q = square root of the Differential Pressure. Differential pressure transmitters may include an integral square root extractor, which provides a linear output signal. However, if a square root extractor is not part of the transmitter circuitry in the process, a separate square root extractor may be installed in the output signal loop.
Input and Output Measurement Standards
In a loop, a 4-20 mA output from a differential pressure transmitter provides an input to the square root extractor. So, in the calibration, a milliamp source would provide an appropriate input standard. The output measurement standard is also a milliammeter. To complete the setup, connect a power supply in series with a square root extractor and milliammeter. Manufacturer's instructions specify the input values and expected outputs.
The square root of the input determines the output.
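For a 4-20 mA loop, the expected extractor output at each input can be sketched as follows. This assumes the common live-zero convention, where the output is scaled on the square root of the input's fractional span; confirm the expected values against the manufacturer's instructions.

    # Minimal sketch: expected square root extractor output for a 4-20 mA input.
    import math

    def extractor_output_ma(input_ma):
        fraction = (input_ma - 4.0) / 16.0        # differential pressure, 0..1 of span
        return 4.0 + 16.0 * math.sqrt(fraction)   # flow signal, linear in flow rate

    for input_ma in (4.0, 8.0, 12.0, 16.0, 20.0):
        print(f"in {input_ma:5.1f} mA -> out {extractor_output_ma(input_ma):5.2f} mA")

Note, for example, that a 25% differential pressure input (8 mA) produces a 50% flow output (12 mA).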
Magnetic Flowmeter Calibration
Introduction
In measurement and control loops where the process flow is a conductive liquid, magnetic flowmeters can be used to measure flow. As fluid passes through the meter's magnetic field, the fluid acts as a conductor and a potential is induced across it. This potential varies directly with the fluid velocity.
Input and Output Standards
Disconnect the flow tube from the transmitter. A magnetic flowmeter calibrator simulates the signal provided by the electrodes in the flow tube. The operating voltage and frequency range of the calibrator must match those of the magnetic flowmeter. Select the maximum output signal using the calibrator range switch; the signal options include 5, 10, or 30 mV AC. The magnetic flowmeter calibrator has predetermined test points, so the percent output knob is used to set each output for a five-point check. Since output is in milliamps, a milliammeter is the appropriate output measurement standard for this calibration setup.
Five-Point Check
To begin the calibration of a magnetic flowmeter, calculate the input signal value. The input signal is equal to the upper range multiplied by the calibration factor and by the phase band factor. These values are indicated on the instrument's data plate.
Input Signal = Upper Range x Calibration Factor x Phase Band Factor
Record the output values at each test point, and from this data determine if the instrument is within manufacturer's specifications. The following formula tells if the range of error is within manufacturer's specifications:
Accuracy = (Deviation / Span) * 100
Deviation = Expected Value - Actual Value
Adjust zero at the lowest point in the instrument's range by turning the zero adjust screw until the output reading is correct. Then adjust span and, since zero and span often interact, verify both until no further adjustment is necessary.
To conclude the calibration, recheck the upscale and downscale readings to verify that the instrument is properly calibrated.
Vortex Shedding Flowmeter Calibration
Introduction
In the process line, flowing fluid strikes the bluff body and vortices are shed alternately on each side of the bluff body. The flow velocity determines the frequency at which the vortices are shed. The shedding of the vortices induces a resonance in the bluff body that is detected by the sensor. From the sensor, pulses are sent to the transmitter where the appropriate loop output is developed.
Input and Output Measurement Standards
Calibration of a vortex shedding flowmeter's transmitter requires an input standard that can simulate the electrical pulses counted by the transmitter. A frequency generator provides this input. For a more detailed explanation of a specific frequency generator's features, see the manufacturer's literature.
A milliammeter will display the output signal.
Settings and Connections
Before making the connections, determine the vortex shedding frequency. The vortex shedding frequency is usually provided by the manufacturer, but if it is not listed in the manufacturer's literature, calculate the frequency using this formula:
Vortex Shedding Frequency = RF x CF x ( URV/TIME)
where:
The vortex shedding frequency is expressed in pulses per second (PPS).
PPS = pulses per second, representing the alternate shedding of vortices on either side of the bluff body
RF = the reference factor, found on the transmitter's data plate and usually expressed in pulses per US gallon
CF = the conversion factor, a number found in the manufacturer's conversion table that converts the RF to actual volume or mass flow rate units
URV = the upper range value in US gallons per minute
TIME = the time increment in which the flow is measured
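A sketch of the calculation, with hypothetical plate and table values (take RF from the data plate and CF from the manufacturer's conversion table):

    # Minimal sketch: vortex shedding frequency for the frequency generator setting.
    RF = 75.0      # reference factor, pulses per US gallon (hypothetical)
    CF = 1.0       # conversion factor from the manufacturer's table (hypothetical)
    URV = 120.0    # upper range value, US gallons per minute (hypothetical)
    TIME = 60.0    # seconds per minute, since URV is per minute and PPS is per second

    pps = RF * CF * (URV / TIME)   # Vortex Shedding Frequency = RF x CF x (URV/TIME)
    print(f"set the frequency generator to {pps:.0f} PPS")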
Span jumpers may be required to calibrate this transmitter; refer to the manufacturer's instructions for the appropriate setting of the coarse span jumpers.
Once the PPS has been determined and span jumpers set in their proper positions, the frequency generator can be connected to the input terminals of the transmitter. The output side of the calibration loop is connected in series. After all the connections are made, set the fine span.
Adjustments and Accuracy
To set the fine span, the frequency generator is set to the upper range value of the transmitter. Adjust the fine span screw until 100% output is indicated on the milliammeter. Once the fine span adjustment has been completed, as indicated by a 20 mA reading on the output standard, adjust the zero.
Disconnect the frequency generator and connect the signal lead to the shield of the coaxial cable.
Zero is adjusted at the low range value, so following the fine span adjustment the generator must be set to the appropriate low-range value.
The zero is adjusted until the output signal matches the input signal at the lowest input value.
The span and zero should be adjusted and readjusted until both are correct.
Perform the Five-Point check to verify the instrument's range accuracy.
Mass Flowmeter Calibration
Introduction
A mass flowmeter measures flow rate in weight per unit time rather than volume. This measurement is independent of temperature and pressure.
Input and Output Measurement Standards
Although an input and an output standard are needed, smart mass flowmeters are digital instruments, so they also require a special communication device. This device, called an interface device or communicator, is used for configuration and calibration.
The interface device must be connected to the mass flowmeter. It can be connected at any point in the loop as long as it is in parallel with the signal lines.
When the device is turned on, a self-diagnostic program is performed. After the self-test is complete, press the process variable key to display current readings from the transmitter.
Keys have the following functions:
HELP = gives an explanation of the current display and the function keys
RESTART = initiates communication with the smart transmitter
PREVIOUS FUNCTION = returns user to last decision level
ALPHANUMERIC KEYS = enter information into the interface
Configuring Mass Flowmeters
Configuration defines the parameters of the transmitter's operation. To begin on-line configuration, press the "config" key on the interface device. Then enter information to identify the instrument.
The instrument parameters are checked and displayed by the interface device. Make any corrections to these displays by following the prompts on the interface device display. With the changes made, press the restart key to download the data to the transmitter. This completes the configuration. Now verify span and zero.
Correcting Errors
To establish the zero and span values, the flow tube must be full of process liquid with no flow. Because actual process fluid is used, this procedure is typically done on an installed transmitter.
Press the "format" key followed by the "proceed" key. Then press the "autozero" key.
The autozero automatically establishes the zero point based on the properties of the process fluid.
The resulting display will indicate that the mass flowmeter is properly calibrated.
Hydrostatic Level Calibration
Introduction
A level sensing device locates the interface between a liquid and a vapor or between two liquids, and transmits a signal representing this value to process measurement and control instruments. As the level in the tank changes, the output reading changes proportionally. Hydrostatic head pressure is used to measure fluid level: the head pressure is measured, and knowing the specific gravity of the liquid, the height can be calculated. Hydrostatic level gaging often uses a differential pressure transmitter to compensate for the atmospheric pressure on the liquid. The high pressure port senses the hydrostatic head pressure plus the atmospheric pressure on the fluid in the tank; the low pressure port senses only atmosphere. The difference between the two pressures can be converted to level.
In dip pipe applications, gas flows through a pipe that is submerged in the tank's liquid. A differential pressure transmitter measures the back pressure on the tube caused by an increase in the tank level. The high pressure port senses the pressure increase caused by the back pressure in the dip pipe. The low pressure port is vented to atmosphere.
The same calibration procedure applies for any differential pressure level measuring system.
Input and Output Measurement Standards and Connections
A low pressure calibrator is the input measurement standard. It provides and measures low pressure values as required for calibrating hydrostatic level systems. A low pressure calibrator contains a pressure readout and pressure regulator.
A milliammeter measures the transmitter's output. The milliammeter, power supply, and transmitter should be connected in series. For best calibration results, mount the transmitter in the same position as it is installed in the process. At the transmitter connect the source of pressure to the high pressure port and vent to atmosphere the low pressure port.
Five-Point Check
Determine the instrument's range and test points for calibration.
For the lower range value measured in inches of water, multiply the minimum height of the liquid in inches by the liquid's specific gravity. The upper range value is the maximum height of the liquid in inches multiplied by its specific gravity. The span, then, is the difference between these values. Perform the five-point upscale and downscale check.
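A sketch of the range calculation, with hypothetical heights and specific gravity:

    # Minimal sketch: hydrostatic level range and test points (hypothetical values).
    sg = 0.80                 # specific gravity of the process liquid
    min_height_in = 10.0      # liquid height at 0% level, inches
    max_height_in = 110.0     # liquid height at 100% level, inches

    lrv = min_height_in * sg  # lower range value, inches of water
    urv = max_height_in * sg  # upper range value, inches of water
    span = urv - lrv

    for pct in (10, 30, 50, 70, 90):
        print(f"{pct:3d}%: apply {lrv + span * pct / 100:6.1f} inH2O")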
Correct the zero at 10% of input span, adjusting zero until the output produced is 10% of the output span. Next, correct the span error, applying 90% input and adjusting the span until 90% output is produced.
Closed Tank Level Gauging
The procedure used in open tank applications is also used for closed tank applications. Closed tank applications must compensate for the static pressure in the vapor above the liquid. To accurately measure the head pressure of the liquid alone a reference leg is used. The reference leg is a pipe connecting the vapor space to the low side of the differential pressure transmitter. The reference leg must be either completely dry or completely filled with liquid.
Dry Reference Leg
The low pressure port receives the pressure of the vapor space. The high side receives vapor pressure in addition to the pressure from the liquid. The value measured by the transmitter represents only the pressure of the liquid because vapor pressure is applied to both the high and low sides of the transmitter. Calibrate with pressure to the transmitter's high pressure port, and vent the low pressure port to atmosphere. Adjust the transmitter's span for the specific gravity of the liquid in the tank. The low range is equal to the minimum level in inches, and the upper range value is equal to the maximum level in inches.
Wet Reference Leg
Often it is necessary to use a reference leg filled with liquid for gaging the level in closed tanks that contain volatile fluid. The column of fluid in the reference leg imposes additional hydrostatic pressure on the pressure side of the transmitter. This additional pressure must be compensated for to correctly gage level.
To determine the additional pressure that the reference leg will apply, take the height of the wet leg in inches and multiply it by the specific gravity of the fill fluid. The reference leg fill liquid may be different from the tank contents. Connect the low pressure calibrator to both ports of the transmitter. A regulator is used to add the hydrostatic pressure of the wet leg to the low side. Then adjust zero until 4 mA of output is produced. After zero is adjusted, perform a five-point check on the high side using a second regulator. In systems where the transmitter is mounted below the minimum measuring level, compensate for the additional static pressure by lowering the zero value. In systems where the transmitter is mounted above the minimum measuring level, compensate for the decreased static pressure by raising the zero value.
Calibrate the transmitter span first before compensating zero for transmitter height location.
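A sketch of the wet leg arithmetic, with hypothetical values:

    # Minimal sketch: hydrostatic pressure the wet reference leg adds to the low side.
    wet_leg_height_in = 120.0   # height of the filled reference leg, inches (hypothetical)
    fill_sg = 1.10              # specific gravity of the fill liquid (hypothetical)

    low_side_inH2O = wet_leg_height_in * fill_sg
    print(f"apply {low_side_inH2O:.1f} inH2O to the low pressure port, "
          f"then adjust zero until 4 mA is produced")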
Displacement Level Calibration
Introduction
Buoyant force acts on a displacer that is submerged in a liquid: the displacer's apparent weight is reduced by the weight of the fluid it displaces. This change in apparent weight is typically translated and converted to an instrument signal.
Input and Output Measurement Standards
One method is to use actual liquid level as the input for calibrating a displacement level transmitter. The most appropriate liquid for replicating process conditions is a safe liquid with the same specific gravity as the process fluid.
Connect a milliammeter as the output standard and a 24 V DC power supply in a series circuit with the transmitter.
Determine the setting for the calibration dial by multiplying the specific gravity of the liquid by the correction factor. Then, set the pointer to the compensated value.
Displacement level transmitters are classified as direct or reverse acting. With direct action, an increase in level increases the output signal, and a decrease in level decreases the output signal. With reverse action, an increase in level decreases the output signal, and a decrease in level increases the output signal.
Calibration
When the chamber is empty, the corresponding output should be 4 mA. If the milliammeter displays a value that is greater than or less than 4 mA, adjust the zero.
To correct span, fill the chamber to the upper range value, and turn the span adjustment until 20 mA is produced.
Linearity is not always adjustable on this type of transmitter; check the manufacturer's specifications.
Adjust both zero and span until the transmitter performs within specifications.
Liquid-Liquid Interface
The same sensing principle used to measure a liquid-vapor interface can be used to locate the interface between two liquids.
The heavier of the two liquids exerts more buoyant force, so as the lower phase rises or falls, the displacer travels with it.
To create calibration conditions that replicate the process, use liquids that have the same specific gravity as the process fluids. Check and adjust the zero by filling the chamber to 100% with the lighter phase.
Check span by filling the chamber 100% full of the heavier phase. Adjust span until 20 mA output is produced.
Check mid-range output and recheck zero and span.
Test and Calibration Instrumentation
Instrument calibration must be carried out to maintain and verify instrument accuracy. Test and calibration instruments are available for all the different instruments that typically measure flow, pressure, temperature, etc. They are also used for setting up final elements such as control valves. Test and calibration instruments are available in portable versions for site calibration or bench versions for the workshop. Calibration is the process of configuring an instrument to provide a result for a sample within an acceptable range. Eliminating or minimising factors that cause inaccurate measurements is a fundamental aspect of instrumentation design.
There are as many definitions of calibration as there are methods. According to ISA’s The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions”. The definition includes the capability to adjust the instrument to zero and to set the desired span. Usually calibration involves injecting an accurate signal from the calibrator into the instrument at 0%, 25%, 50%, 75% and 100% of range, and adjusting the instrument zero and span until the instrument is aligned with the calibrator over the specified range.
Test and Calibration Instrumentation
Thanks to Zedflo Australia for the following information.
Low Pressure Calibrations - When you're performing a low pressure calibration, there are some things you can do to make the job easier and better (more accurate) - From Martel Matters Newsletter.
Temperature Calibrations (Thermocouple Edition) - There are a few "gotchas" in temperature calibration when we're talking about T/Cs and RTDs. Each has its own problems; this article focuses on calibration of T/C instrumentation - From Martel Matters Newsletter.
Field Calibrators Make Everything Better - Users Are Opening Their Eyes to How Much Clearer Their Data Is - and How Much Better Their Processes Can Run - with Field Calibrators - From www.controlglobal.com.
Martel Process Calibrators Blog - Lots of Interesting Calibration Related posts.
Martel Calibrator Newsletter - Martel Matters is an e-newsletter from Martel Calibrators. It provides information on new and existing products and calibration "tips". Highlights from the Latest Newsletter include:
- Tips on performing Low Pressure Calibration
- Getting Rid Of Leaks
- Adding Volume To Your System
- Watching Out For Temperature Effects
- Considering Head Pressure Errors
“How to” Choose the Right batteries for your Calibrator - Most of our calibrators are designed to use standard replaceable Alkaline batteries. You know, the ones you can buy at the corner store. That’s a good choice because of the relatively high power density and mostly flat discharge curve of these cells. They start out with a high terminal voltage (1.5 V or more). A really strong point for these batteries is the long shelf life. If you don’t use your calibrator much, that would be a good reason to use Alkalines. New ones typically have a shelf life of 3 years or more.
Is It a Calibrator? - People actually in the calibration industry often hear things being called calibrators that obviously aren’t. The defining points for a calibrator are detailed here.
Gas Custody Transfer Calibration - Using Multivariable Temperature / Pressure Calibrators for Flowmeter Calibration - Gas custody transfer flow computers require special calibration to perform at optimum accuracy. In custody transfer applications where the buying and selling of commodities like natural gas is involved, calibration checks are performed frequently as a matter of fiduciary responsibility. For the purpose of this white paper, the use of gas custody transfer flow computers in the natural gas transmission industry is referenced.
Determining Accuracy and Stability for Digital Pressure Gauges - Like all calibration device manufacturers Martel Electronics spends a great deal of time determining what the accuracy specification is for the products we design. General convention is to publish one (1) year specifications as that is the generally accepted calibration interval for most calibration devices. Pressure measuring devices pose interesting challenges when trying to determine the one year specifications because not only must the long term stability of the electrical circuit be determined you must also determine the stability characteristics of the pressure sensor itself over time and combine those two numbers to arrive at a complete specification.
The Calibration of Smart Instrumentation - A Better Way - The assertion often arises that smart instruments don’t need calibration since they are very stable, reliable devices. This assertion ignores certain vital realities. These include: - Field instruments are exposed to harsh environments.
- Good practice calls for in situ calibration
- Safety Instrumented Systems require periodic calibration
- Custody transfer applications require periodic calibration by user demand
What to Look for in a Documenting Calibrator - A documenting calibration system can be as simple as a single multi-function calibrator with an easy to use software package or it can be a more sophisticated system with extensive database software, custom reporting and multiple brands or types of calibrators. There is a lot of choice in the marketplace, so it’s important to determine your needs and buy accordingly rather than having a system that dictates to you.
Martel LC-110H Loop Calibrator and HART Communications/Diagnostics - This white paper describes the basic functions of HART communications and the diagnostic capability of the Martel LC-110H mA (loop) calibrator.
Why Test and Calibration Equipment Is Essential - For optimal performance and efficiency, every car needs regular oil changes. This not only helps the engine run smoothly, it also increases the car’s lifetime, which goes a long way toward reducing the need to purchase a new car. An oil change is much like calibration: calibrating a piece of test equipment ensures that the equipment is fine-tuned and running efficiently, which is required for test equipment to provide accurate test results. So why is test and calibration equipment in Perth important in the face of advanced technology?
Why Temperature Compensation Really Matters for Pressure Measurement - Jon Sanders - Have you ever wondered how much impact environmental temperature has on your pressure sensors? Nearly every pressure sensor has some sort of environmental temperature specification on its data sheet. This technical note explains the impact of environmental temperature effects on pressure sensors, quantifying the impact and ways to minimize the impact.
The following calibration white papers are from Additel Corporation:
- Calibrating Differential Pressure Sensors - How to calibrate a DP sensor and understand some of the calibration challenges associated with them.
- Calibrating a Pressure Switch - Pressure switches are commonly used in the process industry for a wide range of applications. A pressure switch is a form of sensor that closes or opens an electrical contact when a certain pressure has been obtained either through a pressure rise or a pressure drop. Pressure switches are used to monitor, control, or provide a caution or warning for a pressure related process. The repeatability, accuracy, and functionality of a pressure switch often tie directly into the safety or efficiency of a process and thus it becomes important that pressure switches are verified and calibrated to ensure their proper function in the process - from Additel Corporation.
- Considerations for Hydraulic Pressure Calibrations - Jon Sanders - If you are doing high pressure, hydraulic calibrations there are a few things that you'll need to consider which will make your life a little easier and help you produce stable measurements. This application note focuses on considerations for pressure calibrations using a high pressure hydraulic pump to generate the pressure.
- Improved Methods for High Pressure Pneumatic Calibrations in the Field - Jon Sanders - Are you tired of dragging a nitrogen bottle and dead weight tester out to the field to perform pneumatic high pressure calibrations? Does it trouble you to use a hydraulic pump or dead weight tester for your gas gauges every time you have to go above 600 psi? This application note details the limitations to traditional methods and provides a solution to calibration of gas gauges up to 3,000 psi (200 bar) with a field-ready calibration tool.
- Understanding Accuracy Specifications for Digital Pressure Sensors - Percentage of Full Scale Versus Percentage of Reading - Jon Sanders - Specifications for digital pressure gauges can sometimes seem confusing or overwhelming, especially if you are unfamiliar with the terminology. Some pressure sensors will specify accuracy as a percent of full scale (FS) while others provide the specification as a percent of reading. So why are there different ways of specifying the accuracy of pressure sensors, and is percent of reading more accurate than percent of full scale or vice versa? This brief technical note will discuss the two differences and answer these questions.
The Following are from the very useful InstrumentationPortal.Com:
- Calibration Form - Calibration forms provide a list of actions to be performed during instrument calibration. Instrument calibration is required to make sure that an instrument will function properly prior to installation. Before shipping, the vendor has already calibrated the instrument after setting the range to a pre-determined value as requested by the end-user. It is common for the contractor to re-check the instrument by performing a bench calibration. However, some end-users prefer to install the instrument without undertaking this test. Following are some typical calibration forms for transmitters, gauges and control valves.
- Transmitter Calibration Form
- Control Valve Calibration Form
- Pressure Gauge Calibration Form
Calibration Principles - This excellent ISA document details how to define key terms relating to calibration and interpret the meaning of each, and how traceability requirements are established and maintained. It describes the following:
- The characteristics of a good control system technician
- The differences between bench calibration and field calibration
- The differences between loop calibration and individual instrument calibration, with the advantages and disadvantages of each
- The advantages and disadvantages of classifying instruments according to process importance - for example, critical, non-critical, reference only, OSHA, EPA, etc.
Flow and Level Calibration Notes - Thanks to INX Inc - These notes whilst being a little dated are still very useful.
Pressure & Temperature Calibration Notes - Thanks to INX Inc - These notes whilst being a little dated are still very useful.
Calibration Primer - From Omega.com - The most sophisticated industrial equipment will not be very useful unless it is calibrated. Through calibration, adjustments made to a piece of equipment ensure that it performs as expected-that it can be relied on to deliver predictable, accurate results that meet quality standards. This white paper from Omega Engineering explains what calibration is, why it is important, and how it works. NIST traceability is defined and discussed, and there is a step-by-step description of a basic calibration. This paper also discusses in-house vs. laboratory calibration, and it describes major types of calibration devices.
Some Notes on Device Calibration - From the University of Dublin - A comprehensive albeit a bit academic note.
Calibration - Calibration is the validation of specific measurement techniques and equipment. At the simplest level, calibration is a comparison between measurements-one of known magnitude or correctness made or set with one device and another measurement made in as similar a way as possible with a second device. This article from Wikipedia covers all the basics pretty well.
A Beginner’s Guide to Measurement - This 30 page Beginner’s Guide from the National Physical Laboratory explains the fundamental concepts and basic facts about measurement, and in particular accurate measurement. It includes brief accounts of the role of measurement in modern and historical societies and explains the SI system, its base units and their relation to other units. The various organisations involved in measurement are introduced and their roles in linking all measurements to the SI base units through traceability chains explained. It includes general guidance about practical issues that affect the making of measurements, gives the meanings of key measurement terms, and explains the significance of such fundamental concepts as measurement traceability and calibration.
Loop Calibrations - New, More Efficient Methods Improve Productivity - Ned Espy - The typical approach to calibration has been to regularly test instrumentation that affects successful control, safe operation, quality, or other relevant criteria. In most cases, scheduling is conservative, and methods at a particular site have evolved over time. Instrument technicians follow practices that were set up many years ago. It is not uncommon to hear, “this is the way we have always done it”. Measurement technology continues to improve and is getting more accurate. It is also getting more complex - why test a fieldbus transmitter with the same approach as a pneumatic transmitter? Performing the standard five-point, up-down test does not always apply to today’s more sophisticated applications - from the ISA and InTech.
The Following Technical Tips are from Advanced Instruments Inc:
- What Is Instrument Calibration and What Does It Do? - Instrument calibration is one of the primary processes used to maintain instrument accuracy. Calibration is the process of configuring an instrument to provide a result for a sample within an acceptable range. Eliminating or minimizing factors that cause inaccurate measurements is a fundamental aspect of instrumentation design.
- Why is Calibration Important? - How a properly performed calibration can improve product performance.
- What Factors Affect Calibration? - Once the benefits of a properly performed calibration are understood, it becomes evident that care must be taken during the process to prevent potential error sources from degrading the results. Several factors can occur during and after a calibration that can affect its result.
Calibration Terminology
Calibration Terminology - Details on calibration terms - from Beamex.
The following links are from Fluke:
- Glossary of Loop Calibration Terms
- Glossary of Pressure Calibration Terms
- Glossary of Temperature Calibration Terms
Calibrating Fieldbus Instruments
Calibrating Fieldbus Transmitters - Fieldbus is becoming more and more common in today’s instrumentation. But what is fieldbus and how does it differ from conventional instrumentation? Fieldbus transmitters must be calibrated as well, but how can it be done? Until now, no practical solutions have existed for calibrating fieldbus transmitters - from Beamex.
Calibration Frequency
Instrument Calibration - Glenn Carlson, Technical Support - Users frequently want to know how often they need to calibrate their In-Situ instrument. The most accurate answer to that question is “it depends”. This article addresses this - thanks to In-Situ Inc.
How Frequently should a Product be Calibrated? - The simple answer to this question, although not a very helpful one, is “when it needs it”. From a more practical standpoint, daily or periodically testing the control solutions of known values can provide a quantitative indication of instrument performance, which can be used to establish a history - from Advanced Instruments Inc.
How often should Calibrators be Calibrated? - This article discusses some of the things to be considered when specifying the calibration period and provides some general guidelines. The same guidelines that apply to a calibrator also apply to other measuring equipment in the traceability chain. These guidelines can even be used for process instrumentation - from Beamex.
How often should Instruments be Calibrated? - Plants can improve their efficiency and reduce costs by performing calibration history trend analysis. By doing it, a plant is able to define which instruments can be calibrated less frequently and which should be calibrated more frequently. Calibration history trend analysis is only possible with calibration software that provides this functionality - from Beamex.
Calibration Intervals, A Manufacturer’s Perspective - David Deaver - The analysis tools that are currently available for Calibration Intervals focus on setting intervals to achieve a desired reliability target. This paper suggests there is another perspective that these tools do not currently address; consequence cost or accumulated liability. A case is made that sometimes the reliability target is a secondary consideration to managing this consequence cost. The paper also addresses how manufacturers establish calibration intervals. The paper presents, and defends, the practice of using no analysis whatsoever in establishing the manufacturer's recommended calibration interval - from Fluke.
Calibration Advantages using Electronic Device Description Language (EDDL) Technology
The following white papers and articles are from the EDDL group:
- Calibration Trim - EDDL technology makes calibration easier thanks to user guidance such as wizards as well as know-how made available from the device manufacturer's experts. The result is lower cost of maintenance, and better performing devices.
- Intelligent Device Management Tutorial: Calibration - In the early days of smart transmitters the concept of remote range setting ("remote calibration") and re-ranging without applying input was revolutionary. It took years of education to be accepted and understood. Calibration can be carried out using a handheld communicator in the field, a laptop in the workshop, or from intelligent device management software as part of an asset management solution. Electronic Device Description Language (EDDL) is the technology used by device manufacturers to define how the system shall display the device information and functions to the technician. EDDL makes calibration of smart transmitters and other intelligent devices easier thanks to user guidance such as wizards and help, and unparalleled consistency of use.
- EDDL Solution for Field Tasks - Field communicators have existed for as long as intelligent devices. The early problem of plants having to grapple with many different communicators was solved in the mid-nineties by standard protocols like HART and Foundation fieldbus together with the Electronic Device Description Language (EDDL, formerly known simply as DDL), an integral part of both technologies. A single universal field communicator supports all instruments; an arsenal of many communicators is no longer required. Use of Windows software from the central control room has since become possible thanks to multiplexers and digital communication interfaces embedded in control system I/O modules. These systems may access data in devices using HART, FOUNDATION fieldbus, PROFIBUS, WirelessHART, or a combination of two or more of these. Although many maintenance tasks can be done remotely from centrally located device management software as part of an Asset Management Solution (AMS), several other tasks must still be carried out in the field right next to the device. Because of such field work, a compact field communicator is highly valued by maintenance technicians; a notebook computer is not ideal. EDDL is the only technology suitable for portable communicators because it works on the embedded operating systems used in such devices. The IEC 61804-3 graphical enhancements to EDDL now make field communicators easier to use and powerful enough for complex devices.
Calibration Trim Wizard using EDDL - Narrated by Emerson's Harish Jayaraman this video shows how calibration trim is simplified by wizards from software or handheld communicator from different vendors using EDDL - from YouTube.
Other General Test and Calibration Links
Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results - Barry N. Taylor and Chris E. Kuyatt - This is NIST Technical Note 1297 as it was published.
U. S. Army Corps of Engineers Process Instrument And Control Checklist - This is a very useful checklist which is designed to facilitate the performance evaluation of process instrumentation and control systems used to operate and monitor treatment processes and equipment.
On Site Flow Calibration is Painful but Necessary - by David W. Spitzer - thanks to ControlGlobal.com. Some new product introductions have raised doubt about what in-situ calibration for flowmeters is, and whether it can be duplicated with simulators and calibrators with expanded diagnostics.
Calibrating and Testing Control Components on your Heat Process - What, When and How Should I Calibrate? - Arthur Holland, Holland Technical Skills - an excellent explanation on the basics of calibration - from dcnz.
Calibrating Non Destructive Testing Instruments - NDT Resource Centre - Calibration refers to the act of evaluating and adjusting the precision and accuracy of measurement equipment. In ultrasonic testing, several forms of calibration must occur. First, the electronics of the equipment must be calibrated to ensure that they are performing as designed. This operation is usually performed by the equipment manufacturer and will not be discussed further in this material. It is also usually necessary for the operator to perform a "user calibration" of the equipment. This user calibration is necessary because most ultrasonic equipment can be reconfigured for use in a large variety of applications.
A Guide to Low Resistance Measurement - You have to register to obtain this handbook which gives an overview of low resistance measurement techniques, explains common causes of errors and how to avoid them. We have also included useful tables of wire and cable characteristics, temperature coefficients and various formulas to ensure you make the best possible choice when selecting your measuring instrument and measurement technique - from Cropico.
Guide To Low Resistance Testing - Understanding and Measuring Low Resistance to Ensure Electrical System Performance - The purpose of this booklet is to help the engineer, technician or operator to understand the rationale behind low resistance, how to make a low resistance, how to select the proper instrument for the testing application and how to interpret and use the results - from Megger and Tech Rentals.
Getting Down to Earth - A Practical Guide to Earth Resistance Testing - Earth resistance is measured in two ways for two important fields of use:
- Determining effectiveness of “ground” grids and connections that are used with electrical systems to protect personnel and equipment.
- Prospecting for good (low resistance) “ground” locations, or obtaining measured resistance values that can give specific information about what lies some distance below the earth’s surface (such as depth to bed rock). It is not the intent of this manual to go too deeply into the theory and mathematics of the subject. As noted in the references at the end, there are many excellent books and papers that cover these. Rather, the information herein is in simple language for easy understanding by the user in industry - from Megger and Weschler Instruments.
The Expression of Uncertainty and Confidence in Measurement - M3003 - The general requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate to a quality system, are technically competent and are able to generate technically valid results are contained within ISO/IEC 17025:2005. This international standard forms the basis for international laboratory accreditation and in cases of differences in interpretation remains the authoritative document at all times. M3003 is not intended as a prescriptive document, and does not set out to introduce additional requirements to those in ISO/IEC 17025:2005 but to provide amplification and guidance on the current requirements within the international standard. This 82 page document is certainly comprehensive - from www.ukas.com.
The Internet Resource for the International Temperature Scale of 1990 - Temperature Scale and General Temperature information for metrologists, scientists, calibration engineers and those with an interest in the temperature scale and its realisation.
Calibration of Test Equipment for Maintenance Purposes - The responsibility of an Approved Maintenance Organisation (AMO) is to provide, within its Maintenance Organisation Manual (Policy and Procedures Manual, or equivalent document), a list of all test equipment that must be calibrated and the process to track the calibration - from the Civil Aviation Safety Authority. Please note this is an archival copy and is not available from the source; however, the information contained in it is very useful. Should there be any issue with it being on this website, please advise.
The Following Papers are from BEAMEX:
- Automated Calibration Planning Lowers Costs - Calibration is an essential element of any instrumentation maintenance program. However, sometimes calibration operations can be long and time-consuming. By planning the process and adding the right tools, efficiency can be improved and costs lowered substantially.
- Traceable and Efficient Calibrations in the Process Industry - Today’s modern process plants, production processes and quality systems put new and tight requirements on the accuracy of process instruments and on process control. Quality systems, such as the ISO 9000 and ISO 14000 series of quality standards, call for systematic and well-documented calibrations, with regard to accuracy, repeatability, uncertainty, confidence levels etc.
- The Safest Way to Calibrate - An Introduction to Intrinsically Safe Calibrators - There are industrial environments where calibrations should not only be made accurately and efficiently, but also safely. When safety becomes a top priority issue in calibration, intrinsically safe calibrators enter into the picture.
The Following Papers are from Fluke:
- Many Technical Calibration Papers, including the following, can be found thanks to Fluke; you have to register, but it is worth it.
- A Poor Man's Resistance Bridge
- A Preliminary Assessment of the Effectiveness of 5700A Artifact Calibration
- A Traceability Technique for Complex Waveform Generators
- A Wheatstone Bridge for the Computer Age
- An Application of the Guide to Measurement Uncertainty
- An Assessment of Artifact Calibration Effectiveness for a Multifunction Calibrator
- Calibration Data Management: Meeting the Reporting Requirements of ISO/IEC FDIS 17025
- Future Developments in Oscilloscope Calibration
- Maintenance and Calibration of HART Field Instrumentation
- Why Calibrate Test Equipment? - You’re serious about your electrical test instruments. You buy top brands, and you expect them to be accurate. You know some people send their digital instruments to a metrology lab for calibration, and you wonder why. After all, these are all electronic - there’s no meter movement to go out of balance. What do those calibration folks do, anyhow - just change the battery? These are valid concerns, especially since you can’t use your instrument while it’s out for calibration. But let’s consider some other valid concerns. For example, what if an event rendered your instrument less accurate, or maybe even unsafe? What if you are working with tight tolerances and accurate measurement is key to proper operation of expensive processes or safety systems? What if you are trending data for maintenance purposes, and two meters used for the same measurement significantly disagree?
- Measurement Uncertainty - How does DMM Accuracy affect your next Measurement? - Measurement uncertainty is an estimate of the possible error in a measurement: an estimate of the range of values that contains the true value of the measured quantity, together with the probability (the confidence level) that the true value lies within that range. The sketch below shows how a typical DMM accuracy specification translates into such a range.
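As an illustration of the point above, here is a minimal sketch of how a DMM accuracy specification of the common "±(% of reading + counts)" form converts a reading into an uncertainty band. The spec figures and the function name are hypothetical, not taken from the Fluke paper:

    # Hypothetical DMM spec: +/-(0.05 % of reading + 2 counts).
    # One "count" is the value of the least-significant digit on the range.
    def dmm_uncertainty(reading, pct_of_reading, counts, resolution):
        """Return the +/- accuracy band, in the reading's units."""
        return reading * pct_of_reading / 100.0 + counts * resolution

    # Example: 5.000 V measured on a 6 V range with 1 mV resolution.
    u = dmm_uncertainty(5.000, 0.05, 2, 0.001)
    print(f"Reading: 5.000 V +/- {u:.4f} V")  # -> +/- 0.0045 V

Note that the counts term dominates near the bottom of a range, which is one reason range selection matters as much as the headline percentage.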
The following Calibration Links are from Dickson:
- Are All Metrology Labs Alike? - Short answer - NO!!! - In fact, to those of us in the industry who truly know what it takes to recalibrate instruments to objectively defined standards, a better question might be - Are you using the equivalent of a meat thermometer to validate conditions in your processing plant or laboratories?
- “Before” Calibrations Count More than Many Think - As most know, temperature and humidity dataloggers and chart recorders need to be recalibrated periodically to ensure their accuracy, and competent quality managers need to establish recalibration schedules that reflect due diligence in ensuring that temperatures and humidity are kept within acceptable, pre-defined tolerances. But "recalibration" can mean different things, and what could be termed "recalibration on the cheap" does NOT demonstrate the accuracy of your instruments or the data they have recorded.
- Monitoring Revisited - While most pharmaceutical quality managers realize the importance of temperature and humidity tracking to guarantee both quality and compliance, the way in which many go about it is adding hidden costs. Technology for temperature/humidity tracking continues to evolve, and there are numerous time-saving features in recent temperature/humidity data loggers that can make a difference. On one hand, some quality managers are doing too much to track temperature/humidity data, while on the other hand some are doing too little. For many, it’s timely to revisit temperature and humidity monitoring. Here are some key points to consider.
- Monitoring Temperature and Humidity - Monitoring temperature and/or humidity conditions is an essential ingredient of a wide range of quality assurance applications. There are, however, many common methodological errors in the way this task is approached that either compromise quality standards or add unnecessary time and expense to the monitoring task. Insufficient calibration of temperature and humidity monitoring instruments is high on the list of problematic areas. Mismatching technology to the monitoring task at hand is another. This article revisits technology trends in monitoring instrumentation, provides tips on calibration and discusses common methodological errors that quality managers should avoid.
- Temperature Calibration Frequently Asked Questions - Asks the questions: What is calibration? Why should I calibrate? Who says what is accurate? Recalibration: what and why?
Electrical Measurement Safety
Electrical Measurement Safety Program - Every day, an average of 9,000 U.S. workers suffer disabling injuries on the job. Anyone performing electrical measurements should understand the safety standards and be certain their tools meet code. This page from Fluke is an excellent Safety Resource.
Video - Electrical Measurement Safety - This hour-long session provides an awareness of electrical measurement hazards; a better understanding of the safety specifications for digital multimeters and testers; an understanding of the four installation measurement categories; and how to minimize and avoid electrical measurement hazards - You will need to register to see this video - from Fluke.
10 Mistakes People Make Working on Electrical Systems - Jim White - This list gets you thinking. We go through life making small mistake after small mistake and nothing happens, until we happen to get the wrong alignment of small mistakes and have an accident. Once the accident starts, we have no control over it, so the best thing to do is to avoid the small mistakes and tighten up the way we work - from Shermco Industries and Fluke.
Test and Calibration Standards
Standards Related to Temperature and Calibration - A list of ASTM standards.
Temperature Instrument Calibration
The following temperature calibration links are compliments of Isotech:
- A Review of Some of the Best Articles Written about Water and its Triple Point - Details of articles on this subject.
- Calibrating Thermometers - Dave Ayres and Anne Blundell - A thermometer without a traceable calibration route to recognised National Standards is fairly useless. Yet we all buy mass produced thermometers which are supplied without a calibration and use them. We all hope that the manufacturer has been conscientious and has at least carried out calibration checks on batch samples and has claimed a level of accuracy to the batch. But has the manufacturer used suitable standards for the calibration?
- Improved Sterilizer Tests - Dave Ayres and Dave Hill - Scottish Healthcare Supplies Sterilizer Test Group assessed various established methods of on-site temperature calibration and realised there might be shortcomings in commercially available "complete" systems. The guidelines require tests on sterilizer systems to be carried out within a system uncertainty of ±0.5°C, but the assessment showed that in many cases "complete" systems could produce a system uncertainty of ±1.0°C or worse. (Temperature monitoring in sterilizer systems is critical in ensuring that microbiological viability is eliminated from the product.)
- Industrial Measurements with very Short Immersion - J. P. Tavener, D. Southworth, D. Ayres, N. Davies - One major problem that keeps recurring is the request to calibrate, or in some other way evaluate, very short industrial temperature sensor assemblies - so short that the sensor does not attain the temperature of its surroundings. Two distinct methods are possible: in method one, the assembly is immersed in a comparison bath deeply enough to eliminate the stem conduction effect, even though this gives a different result from that achieved in situ; method two attempts to simulate the application in practice and so produce a stem conduction error similar to the one the assembly sees in service.
- Automating Temperature Calibration Baths with Simple Low Cost Image Acquisition - David J. Southworth - A low cost video camera, “Web Cam” is used in conjunction with a PC and Temperature Calibration Bath to automatically calibrate handheld digital thermometers which have no provision to be connected to an external computer.
- Stem Conduction and Light Piping in ITS-90 Fixed Point Cell Assemblies At A UKAS Laboratory - J. P. Tavener & A. Blundell - Standard Platinum Resistance Thermometers (SPRTs) with length-below-handle of only 480 mm are regularly submitted for calibration at ITS-90 fixed points from -200 °C to +660 °C. The length of the thermometer limits the maximum size of fixed point cell that can be used to calibrate the thermometers. Stem conduction effects have been measured at zinc and aluminium temperatures in resealable cells. These have been quantified and eliminated by adopting a cell design with a very small connection between cell and gas supply.
- Temperature Calibration: Depths of Immersion - John P. Tavener - Of all the sources of error and uncertainty in thermal calibration, by far the largest and least understood is the immersion of the unit under test and of the reference standard (a simple model of this effect is sketched after this list).
- Primary Laboratory Comparisons - The most accurate measurements made in a Primary Temperature Laboratory are during intercomparisons of ITS-90 fixed point cells, in particular inter-comparisons of water triple point cells. To assess the stability of the water triple point, a laboratory ideally needs to be able to measure differences of just one or two microdegrees. At the Northern Temperature Primary Laboratory (NTPL) we found the spread of results too large to give a satisfactory result. Consulting the literature, and in particular Tischler & Prado [3], we eventually developed a three-current technique from which we were able to calculate the zero-current resistance to within one or two microdegrees. This paper describes our method in detail (the general idea of zero-current extrapolation is sketched after this list).
- Gold/Platinum Thermocouple and the DC Measurement Requirements - John P. Tavener - This is an interesting article highlighting why thermocouples should not be used for high accuracy applications. Thermocouples are nasty, complicated, ill-understood things that measure temperature differences badly and should be avoided at all cost. Calibration of thermocouples - if possible at all - is a topic fraught with measurement problems. This author has avoided them as far as possible for the past 30 years and refuses to calibrate most types of thermocouple in Isotech’s UKAS Laboratory. A great friend of mine once said of a fellow scientist: “what a pity that such a great man should have devoted so much of his life to such an inferior thing”. ITS-90 removed them in favour of the Standard Platinum Resistance Thermometer. The limiting factor with all normal thermocouples is that one or both of the thermo-elements is an alloy, and an alloy cannot be produced that is homogeneous. This means that, because the EMF of a thermocouple is generated along the wire wherever a thermal gradient exists, if the thermal gradient is moved along the thermocouple the EMF will change (the underlying relation is sketched after this list). This limits the best thermocouple accuracy to about ±0.3°C, 2 sigma. If both thermo-elements were of pure metal this limitation would not exist, giving the possibility of more accurate thermocouple measurement.
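On the immersion question raised in the Depths of Immersion entry above: a commonly used rule-of-thumb model (standard in temperature metrology texts such as Nicholas and White, recommended below, and not quoted from the Isotech article) treats the immersion error as decaying exponentially with depth:

    \[
      \Delta T(L) \;\approx\; \left(T_{\mathrm{ambient}} - T_{\mathrm{bath}}\right) e^{-L/\ell}
    \]

Here L is the immersion depth and the characteristic length \ell is set by the sheath diameter and construction. Because the error only falls off exponentially, a very short assembly can never be immersed deeply enough to attain the temperature of its surroundings, which is precisely the problem the short-immersion paper above addresses.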
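On the NTPL three-current technique: the measuring current dissipates power in an SPRT and self-heats it roughly in proportion to I², so the measured resistance behaves approximately as R(I) = R0 + k·I². A minimal sketch of zero-current extrapolation on this assumption follows; the figures are invented, and the NTPL procedure itself may differ in detail:

    # Zero-current extrapolation for an SPRT: self-heating scales with
    # dissipated power (~ I^2), so fit R = R0 + k * I^2 and read off R0.
    # Illustrative values only - not data from the NTPL paper.
    import numpy as np

    currents = np.array([1.0, 2.0**0.5, 2.0])                     # mA
    resistances = np.array([25.5000020, 25.5000040, 25.5000080])  # ohm

    k, r0 = np.polyfit(currents**2, resistances, 1)  # linear in I^2
    print(f"Zero-current resistance R0 = {r0:.7f} ohm")
    print(f"Self-heating coefficient k = {k:.3e} ohm/mA^2")

With three currents the straight-line fit in I² is over-determined, so any departure from the expected linear behaviour shows up as a residual - a useful consistency check when the target is a microdegree-level result.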
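The inhomogeneity argument in the thermocouple entry above comes down to the standard Seebeck integral, stated here for clarity (it is implied by the article rather than quoted from it):

    \[
      E \;=\; \int_{0}^{L} S(x)\,\frac{dT}{dx}\,\mathrm{d}x
    \]

If the wire is homogeneous, the Seebeck coefficient S depends only on temperature and the integral reduces to a function of the two junction temperatures alone. In an inhomogeneous alloy wire S varies along the length, so moving the thermal gradient along the wire changes E even though the junction temperatures are unchanged - hence the ±0.3°C practical limit quoted above.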
Recommended Book - Traceable Temperatures - An Introduction to Temperature Measurement and Calibration - J.V. Nicholas and D.R. White, John Wiley & Sons, 2nd Edition. Download Chapter One: Measurement and Traceability. Purchase from Amazon.co.uk.
Useful Books on Instrument Calibration and Measurement
Calibration: A Technician's Guide - This comprehensive review of calibration provides an excellent foundation for understanding principles and applications of the most frequently performed tasks of a technician. Topics addressed include terminology, bench vs. field calibration, loop vs. individual instrument calibration, instrument classification systems, documentation, and specific calibration techniques for temperature, pressure, level, flow, final control, and analytical instrumentation. The book is designed as a structured learning tool with questions and answers in each chapter. An extensive appendix containing sample P&IDs, loop diagrams, spec sheets, sample calibration procedures, and conversion and reference tables serves as a very useful reference. If you calibrate instruments or supervise someone who does, then you need this book.
Quality Calibration Handbook - Developing and Managing a Calibration Program - Quality calibration systems are the very foundation for improving research and development (R&D), production, and quality assurance through accurate, reliable, and traceable calibration of test equipment. This book is about how to design, implement, maintain, and continuously improve a quality calibration system, with all the required documentation, traceability, and known uncertainty for each and every item of test equipment owned and used by any company, large or small. It will benefit companies that want to implement a program and also those that already have one in place. Calibration requirements vary across specific industries, but every organization can use the quality calibration system described in this book as a foundation for its personalized program. By using the quality calibration system outlined and demonstrated, any organization can put together its own version to meet its specific requirements and/or regulations.
Loop Checking: A Technician's Guide - In today’s competitive markets, manufacturers strive to continually improve manufacturing performance to meet their business needs and goals. As process control loops have a major impact on a plant’s financial performance, focusing on loop performance is critical. This technician’s guide defines loop checking in the broader scope of control loop performance in addition to the more traditional terms of the plant startup. It discusses general methods and practices that can be applied across many processes/industries. Featured topics include: loop checking basics, factory acceptance testing, wiring and loop checks, performance benchmarking, and sustaining performance.
Troubleshooting: A Technician's Guide - Troubleshooting loops and systems is something all technicians must do, but that few truly master. This newly revised edition draws on the author’s long experience as an instrument and electrical engineer and his maintenance expertise to provide a detailed look at the skills and knowledge required for troubleshooting. Interspersed with a wealth of practical detail and real-world examples are Mostia’s no-nonsense discussions of what a good troubleshooter needs to know. He provides an in-depth discussion of the basic logical framework that underlies all troubleshooting as well as advanced troubleshooting techniques. He also explores the causes of failures and the techniques that engineers and technicians use to trace them down. This new edition covers troubleshooting methods, both basic and advanced, hints and troubleshooting aids, troubleshooting safety, basic maintenance concepts, information about training, and developing troubleshooting skills. It also includes numerous examples of troubleshooting problems in mechanical systems, process connections, pneumatic systems, electrical systems, electronic systems, and valves. Mostia also explores test equipment, programmable electronic systems, communication circuits, transient problems, and software.
Start-Up: A Technician's Guide - When new plants or systems go online, the control systems technician (CST) faces special challenges. The author explores and explains the crucial role of a technician in this process. This book offers you a clear overview of typical start-up responsibilities. From the first team meeting to the last round of tuning and loop checking, Harris uses her extensive experience with process control plants to walk you through the issues and skills typically required. Each chapter includes self-study learning objectives, practice questions and exercises, answers, and listings of relevant standards. Written with the technician in mind, it is an application-oriented book that provides an overview of the scope of duties a technician must perform in real-world situations. Includes over 30 figures and tables; fully indexed.
Ultimate Calibration - 2nd Edition - This calibration handbook covering various topics about advanced calibration comprises 203 pages and 19 articles - Calibrators, calibration software and other related equipment have developed significantly during the past few decades, in spite of the fact that calibration of measurement devices as such has existed for several thousand years. Presently, the primary challenges of industrial metrology and calibration include how to simplify and streamline the entire calibration process, how to eliminate double work, how to reduce production down-time, and how to lower the risk of human errors. All of these challenges can be tackled by improving the level of system integration and automation. Calibration and calibrators can no longer be considered as isolated, stand-alone devices, systems or work processes within a company or production plant. Just like any other business function, calibration procedures need to be automated to a higher degree and integrated to achieve improvements in quality and efficiency. In this area, Beamex aims to be the benchmark in the industry. This book is the 2nd edition of Ultimate Calibration. The main changes to this edition include numerous new articles and a new grouping of the articles to make it easier to find related topics. The new topics covered in this edition mainly discuss paperless calibration, intelligent commissioning, temperature calibration and configuring, and calibration of smart instruments - from Beamex.