Flow & Level Calibration Notes

Differential Pressure Transmitter Calibration

Introduction

In a differential pressure transmitter, as flow increases the differential pressure increases, and as flow decreases the differential pressure decreases. For example, an orifice plate is placed in a pipe to restrict the fluid flow. This restriction creates a pressure drop that can be converted to flow rate. The differential pressure transmitter measures that pressure drop by measuring pressure at two points, upstream and downstream of the restriction. Differential pressure, then, is the difference between the higher pressure (upstream) reading and the lower pressure (downstream) reading: Differential Pressure = High Pressure - Low Pressure.
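As a minimal sketch of this relationship (the pressures below are hypothetical, not from any particular installation):

```python
# Differential Pressure = High Pressure - Low Pressure (hypothetical values).
def differential_pressure(high_side: float, low_side: float) -> float:
    """Return the differential pressure for upstream (high) and downstream (low) readings."""
    return high_side - low_side

# Example: 105.0 inH2O upstream and 100.0 inH2O downstream give 5.0 inH2O of differential.
print(differential_pressure(105.0, 100.0))  # 5.0
```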

Input and Output Measurement Standards

Differential pressure is usually measured in inches of water.

Use a low pressure calibrator to both furnish and measure the input pressure.

A milliammeter is an appropriate output standard to measure the transmitter's output.

Differential Pressure Transmitter Connections

Connect the transmitter to the pressure calibrator as shown in the manufacturer's instructions for the calibrator. The air supply requirements for the calibrator are also found in the manufacturer's instructions. Connect the output from the pressure calibrator to the high pressure port on the transmitter to provide signal pressure. Vent the transmitter's low pressure port to atmosphere to provide a reference point for the differential pressure measurement. To measure the transmitter output, connect a milliammeter to the transmitter. Then connect a 24-volt power supply in series with the transmitter and milliammeter.

Differential Pressure Transmitter Five-Point Check

Typically, inputs at 10%, 30%, 50%, 70% and 90% of span are used as test points.
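As a sketch of how these test points might be worked out, assuming a hypothetical 0-100 inches of water input range and a 4-20 mA output (neither value comes from the source):

```python
# Hypothetical five-point check: input range 0-100 inH2O, output range 4-20 mA (both assumed).
input_lrv, input_urv = 0.0, 100.0     # inches of water
output_lrv, output_urv = 4.0, 20.0    # milliamps

for pct in (0.10, 0.30, 0.50, 0.70, 0.90):
    applied = input_lrv + pct * (input_urv - input_lrv)       # pressure to apply with the calibrator
    expected = output_lrv + pct * (output_urv - output_lrv)   # expected milliammeter reading
    print(f"{pct:.0%}: apply {applied:5.1f} inH2O, expect {expected:5.1f} mA")
```

Record both upscale and downscale readings at each point so hysteresis can be evaluated.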

Check for Hysteresis. Hysteresis is the tendency of an instrument to give a different output for a given input, depending on whether the input resulted from an increase or decrease from the previous value.

Often the data from an instrument test is recorded on a calibration data sheet to help identify instrument errors.

Adjusting for Error Correction

Adjust the zero first, since span error is corrected only after an accurate zero is established. Zero is properly set when a 10% input produces a 10% output.

Adjust the span at 90%. Since zero and span frequently interact, after one of these errors has been corrected, the other may require readjustment.

Square Root Extractor

Flow rate, which may be represented by Q, is proportional to the square root of the pressure drop across a restriction: Q ∝ square root of the Differential Pressure. Differential pressure transmitters may include an integral square root extractor, which provides an output signal that is linear with flow. However, if a square root extractor is not part of the transmitter circuitry, a separate square root extractor may be installed in the output signal loop.

Input and Output Measurement Standards

In a loop, the 4-20 mA output from a differential pressure transmitter provides the input to the square root extractor, so in the calibration a milliamp source is an appropriate input standard. The output measurement standard is again a milliammeter. To complete the setup, connect a power supply in series with the square root extractor and the milliammeter. The manufacturer's instructions specify the input values and expected outputs.

The output is determined by the square root of the input; for example, a 50% input produces approximately a 70.7% output.
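The sketch below shows how an expected output could be estimated for a given input, assuming both the input and output signals are 4-20 mA (an assumption, not a statement about any particular extractor):

```python
import math

# Hypothetical square root extractor check: 4-20 mA input (differential pressure signal)
# and 4-20 mA output (flow signal); both ranges are assumed.
def expected_output_ma(input_ma: float) -> float:
    dp_fraction = (input_ma - 4.0) / 16.0     # input expressed as a fraction of span
    flow_fraction = math.sqrt(dp_fraction)    # flow varies with the square root of DP
    return 4.0 + 16.0 * flow_fraction         # rescale the flow fraction to 4-20 mA

# A 50% input (12 mA) should produce roughly a 70.7% output (about 15.3 mA).
print(round(expected_output_ma(12.0), 2))
```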

Magnetic Flowmeter Calibration

Introduction

In measurement and control loops where the process fluid is a conductive liquid, magnetic flowmeters can be used to measure flow. As the fluid passes through the meter's magnetic field, the fluid acts as a conductor and a potential is induced across the meter's electrodes. This potential varies directly with the fluid velocity.

Input and Output Standards

Disconnect the flow tube from the transmitter. A magnetic flowmeter calibrator simulates the signal provided by the electrodes in the flow tube. The operating voltage and frequency range of the calibrator must match those of the magnetic flowmeter. Select the maximum output signal using the calibrator's range switch; the signal options include 5, 10, or 30 mV AC. The magnetic flowmeter calibrator has predetermined test points, so the percent output knob is used to set each output for a five-point check. Since the transmitter's output is in milliamps, a milliammeter is the appropriate output measurement standard for this calibration setup.

Five-Point Check

To begin the calibration of a magnetic flowmeter, calculate the input signal value. The input signal is equal to the upper range multiplied by the calibration factor and by the phase band factor. These values are indicated on the instrument's data plate.

Input Signal = Upper Range x Calibration Factor x Phase Band Factor
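A sketch of this calculation with made-up values (the upper range, calibration factor, and phase band factor below are illustrative; use the figures from the instrument's data plate):

```python
# Hypothetical data-plate values for a magnetic flowmeter calibration.
upper_range = 10.0          # maximum calibrator output signal, mV AC (assumed)
calibration_factor = 0.98   # from the instrument's data plate (assumed)
phase_band_factor = 1.02    # from the instrument's data plate (assumed)

# Input Signal = Upper Range x Calibration Factor x Phase Band Factor
input_signal = upper_range * calibration_factor * phase_band_factor
print(f"Calibrator input signal: {input_signal:.3f} mV")   # 9.996 mV
```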

Record the output values at each test point, and from this data determine whether the instrument is within the manufacturer's specifications. The following formulas show whether the error is within the manufacturer's specifications:

Accuracy = (Deviation / Span ) * 100

Deviation = Expected Value - Actual Value
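As a sketch of how one test point might be evaluated (the readings and the comparison limit are hypothetical):

```python
# Hypothetical readings at the 50% test point of a 4-20 mA output.
expected_ma = 12.0    # expected output at 50% of span
actual_ma = 12.08     # value read from the milliammeter (assumed)
span_ma = 16.0        # 4-20 mA output span

deviation = expected_ma - actual_ma                   # Deviation = Expected Value - Actual Value
error_pct_of_span = (deviation / span_ma) * 100.0     # Accuracy = (Deviation / Span) * 100

print(f"Deviation: {deviation:+.2f} mA ({error_pct_of_span:+.2f}% of span)")
# Compare the result against the manufacturer's stated accuracy specification.
```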

Adjust zero at the lowest point in the instrument's range by turning the zero adjust screw until the output reading is correct. Then adjust span and, since zero and span often interact, verify both until no further adjustment is necessary.

To conclude the calibration, recheck the upscale and downscale readings to verify that the instrument is properly calibrated.

Vortex Shedding Flowmeter Calibration

Introduction

In the process line, flowing fluid strikes the bluff body and vortices are shed alternately on each side of the bluff body. The flow velocity determines the frequency at which the vortices are shed. The shedding of the vortices induces a resonance in the bluff body that is detected by the sensor. From the sensor, pulses are sent to the transmitter where the appropriate loop output is developed.

Input and Output Measurement Standards

Calibration of a vortex shedding flowmeter's transmitter requires an input standard that can simulate the electrical pulses counted by the transmitter. A frequency generator provides this input. For a more detailed explanation of a specific frequency generator's features, see the manufacturer's literature.

A milliammeter will display the output signal.

Settings and Connections

Before making the connections, determine the vortex shedding frequency. The vortex shedding frequency is usually provided by the manufacturer, but if it is not listed in the manufacturer's literature, calculate it using this formula:

Vortex Shedding Frequency = RF x CF x (URV / TIME)

where:

PPS = pulses per second; the vortex shedding frequency, which represents the alternate shedding of vortices on either side of the bluff body

RF = reference factor, found on the transmitter's data plate and usually expressed in pulses per US gallon

CF = conversion factor, a number found in the manufacturer's conversion table; the CF converts the RF to actual volume or mass flow rate units

URV = upper range value, in US gallons per minute

TIME = the time base, in seconds, for the flow rate units (60 when the upper range value is in gallons per minute)
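A sketch of this calculation with illustrative numbers (the RF, CF, and URV below are hypothetical; take the real values from the data plate and the manufacturer's conversion table):

```python
# Hypothetical values for the vortex shedding frequency calculation.
rf = 120.0        # reference factor, pulses per US gallon (assumed)
cf = 1.0          # conversion factor from the manufacturer's table (assumed)
urv = 300.0       # upper range value, US gallons per minute (assumed)
time_base = 60.0  # seconds per minute, since the URV is a per-minute rate

# Vortex Shedding Frequency = RF x CF x (URV / TIME)
pps = rf * cf * (urv / time_base)
print(f"Frequency at the upper range value: {pps:.1f} PPS")   # 600.0 PPS
```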

Span jumpers may be required to calibrate this transmitter; refer to the manufacturer's instructions for the appropriate setting of the coarse span jumpers.

Once the PPS has been determined and the span jumpers are set in their proper positions, the frequency generator can be connected to the input terminals of the transmitter. The output side of the calibration loop (milliammeter and power supply) is connected in series with the transmitter. After all the connections are made, set the fine span.

Adjustments and Accuracy

To set the fine span, set the frequency generator to the transmitter's upper range value. Adjust the fine span screw until 100% output is indicated on the milliammeter. Once the fine span adjustment is complete, as indicated by a 20 mA reading on the output standard, adjust the zero.

Disconnect the frequency generator and connect the signal lead to the shield of the coaxial cable.

Zero is adjusted at the low range value so the generator must be set to the appropriate setting following the fine span adjustment.

The zero is adjusted until the output signal matches the input signal at the lowest input value.

The span and zero should be adjusted and readjusted until both are correct.

Perform the five-point check to verify the instrument's range accuracy.
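A sketch of the expected readings for this check, assuming the upper-range frequency from the earlier example, a lower range value of zero flow, and a 4-20 mA output (all assumptions):

```python
# Hypothetical five-point check for a vortex shedding transmitter.
pps_at_urv = 600.0   # frequency at the upper range value (assumed, from the earlier sketch)

for pct in (0.10, 0.30, 0.50, 0.70, 0.90):
    generator_setting = pct * pps_at_urv   # frequency to set on the generator
    expected_ma = 4.0 + 16.0 * pct         # expected milliammeter reading
    print(f"{pct:.0%}: set {generator_setting:5.1f} PPS, expect {expected_ma:5.1f} mA")
```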

Mass Flowmeter Calibration

Introduction

A mass flowmeter measures flow rate in weight per unit time rather than volume. This measurement is independent of temperature and pressure.

Input and Output Measurement Standards

Although an input and an output standard are needed, smart mass flowmeters are digital instruments, so they also require a special communication device. This device, called an interface device or communicator, is used for configuration and calibration.

The interface device must be connected to the mass flowmeter. It can be connected at any point in the loop as long as it is in parallel with the signal lines.

When the device is turned on, a self-diagnostic program is performed. After the self-test is complete, press the process variable key to display current readings from the transmitter.

Keys have the following functions:

HELP = gives an explanation of the current display and the function keys

RESTART = initiates communication with the smart transmitter

PREVIOUS FUNCTION = returns user to last decision level

ALPHANUMERIC KEYS = enter information into the interface

Configuring Mass Flowmeters

Configuration defines the parameters of the transmitter's operation. To begin on-line configuration, press the "config" key on the interface device. Then enter information to identify the instrument.

The instrument parameters are checked and displayed by the interface device. Make any corrections to these displays by following the prompts on the interface device display. With the changes made, press the restart key to download the data to the transmitter. This completes the configuration. Now verify span and zero.

Correcting Errors

To establish the zero and span values, the flow tube must be full of process liquid with no flow. Because actual process fluid is used, this procedure is typically done on an installed transmitter.

Press the "format" key followed by the "proceed" key. Then press the "autozero" key.

The autozero function automatically establishes zero for the properties of the process fluid.

The resulting display will indicate that the mass flowmeter is properly calibrated.

Hydrostatic Level Calibration

Introduction

A level sensing device locates the interface between a liquid and a vapor or between two liquids, then transmits a signal representing this value to process measurement and control instruments. As the level in the tank changes, the output reading changes proportionally. Hydrostatic head pressure is used to measure fluid level: the head pressure is measured, and, knowing the specific gravity of the liquid, the height or level of the liquid can be calculated. Hydrostatic level gaging often uses a differential pressure transmitter to compensate for the atmospheric pressure on the liquid. The high pressure port senses both the hydrostatic head pressure and the atmospheric pressure on the fluid in the tank, while the low pressure port senses only atmosphere. The difference between the two pressures can be converted to level.
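As a sketch of the head-to-level relationship (the specific gravity and reading below are illustrative):

```python
# Hydrostatic head in inches of water = liquid height (inches) x specific gravity,
# so level can be recovered by dividing the measured head by the specific gravity.
def level_from_head(head_inh2o: float, specific_gravity: float) -> float:
    """Return liquid level in inches for a measured head pressure (hypothetical helper)."""
    return head_inh2o / specific_gravity

# Example: 96 inH2O of head over a liquid with specific gravity 0.8 is 120 inches of level.
print(level_from_head(96.0, 0.8))   # 120.0
```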

In dip pipe applications, gas flows through a pipe that is submerged in the tank's liquid. A differential pressure transmitter measures the back pressure on the tube caused by an increase in the tank level. The high pressure port senses the pressure increase caused by the back pressure in the dip pipe. The low pressure port is vented to atmosphere.

The same calibration procedure applies for any differential pressure level measuring system.

Input and Output Measurement Standards and Connections

A low pressure calibrator is the input measurement standard. It provides and measures low pressure values as required for calibrating hydrostatic level systems. A low pressure calibrator contains a pressure readout and pressure regulator.

A milliammeter measures the transmitter's output. The milliammeter, power supply, and transmitter should be connected in series. For best calibration results, mount the transmitter in the same position as it is installed in the process. At the transmitter, connect the pressure source to the high pressure port and vent the low pressure port to atmosphere.

Five-Point Check

Determine the instrument's range and test points for calibration.

The lower range value, in inches of water, is the minimum height of the liquid in inches multiplied by the liquid's specific gravity. The upper range value is the maximum height of the liquid in inches multiplied by its specific gravity. The span, then, is the difference between these values. Perform the five-point upscale and downscale check.
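A sketch of the range calculation for a hypothetical tank (minimum level 10 inches, maximum level 110 inches, specific gravity 0.9; none of these values come from the source):

```python
# Hypothetical tank data: convert liquid levels to a transmitter range in inches of water.
min_level_in, max_level_in = 10.0, 110.0   # minimum and maximum liquid heights (assumed)
sg = 0.9                                   # specific gravity of the liquid (assumed)

lrv = min_level_in * sg   # lower range value, inches of water
urv = max_level_in * sg   # upper range value, inches of water
span = urv - lrv          # calibration span

print(f"LRV = {lrv:.1f} inH2O, URV = {urv:.1f} inH2O, span = {span:.1f} inH2O")
```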

Correct the zero at 10% of input span, adjusting zero until the output produced is 10% of the output span. Next, correct the span error, applying 90% input and adjusting the span until 90% output is produced.

Closed Tank Level Gauging

The procedure used in open tank applications is also used for closed tank applications. Closed tank applications must compensate for the static pressure in the vapor above the liquid. To accurately measure the head pressure of the liquid alone a reference leg is used. The reference leg is a pipe connecting the vapor space to the low side of the differential pressure transmitter. The reference leg must be either completely dry or completely filled with liquid.

Dry Reference Leg

The low pressure port receives the pressure of the vapor space. The high side receives the vapor pressure in addition to the pressure from the liquid. Because the vapor pressure is applied to both the high and low sides of the transmitter, the value measured by the transmitter represents only the pressure of the liquid. Calibrate with pressure applied to the transmitter's high pressure port, and vent the low pressure port to atmosphere. Adjust the transmitter's span for the specific gravity of the liquid in the tank. The lower range value is equal to the minimum level in inches, and the upper range value is equal to the maximum level in inches.

Wet Reference Leg

Often it is necessary to use a reference leg filled with liquid when gaging the level in closed tanks that contain volatile fluid. The column of fluid in the reference leg imposes additional hydrostatic pressure on the low pressure side of the transmitter. This additional pressure must be compensated for to gage level correctly.

To determine the additional pressure that the reference leg will apply, take the height of the wet leg in inches and multiply it by the specific gravity of the fill fluid. The reference leg fill liquid may be different from the tank contents. Connect the low pressure calibrator to both ports of the transmitter. A regulator is used to add the hydrostatic pressure of the wet leg to the low side. Then adjust the zero until a 4 mA output is produced. After zero is adjusted, perform a five-point check on the high side using a second regulator. In systems where the transmitter is mounted below the minimum measuring level, compensate for the additional static pressure by lowering the zero value. In systems where the transmitter is mounted above the minimum measuring level, compensate for the decreased static pressure by raising the zero value.

Calibrate the transmitter span first before compensating zero for transmitter height location.
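A sketch of the wet leg pressure calculation (the leg height and fill fluid specific gravity are hypothetical):

```python
# Hypothetical wet reference leg: pressure to apply to the low side before zeroing.
wet_leg_height_in = 120.0   # height of the reference leg, inches (assumed)
fill_sg = 1.1               # specific gravity of the reference leg fill fluid (assumed)

wet_leg_head = wet_leg_height_in * fill_sg   # inches of water imposed on the low side
print(f"Apply {wet_leg_head:.1f} inH2O to the low pressure port, then adjust zero to 4 mA")
```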

Displacement Level Calibration

Introduction

Buoyant force acts on a displacer that is submerged in a liquid. The displacer is reduced in apparent weight by the weight of the fluid it displaces. This change in apparent weight is typically translated and converted to an instrument signal.

Input and Output Measurement Standards

One method is to use actual liquid level as the input for calibrating a displacement level transmitter. The most appropriate liquid for replicating process conditions is a safe liquid with the same specific gravity as the process fluid.

Connect a milliammeter as the output standard and a 24 V DC power supply in a series circuit with the transmitter.

Determine the setting for the calibration dial by multiplying the specific gravity of the liquid by the correction factor. Then, set the pointer to the compensated value.

Displacement level transmitters are classified as direct or reverse acting. With direct action, an increase in level increases the output signal, and a decrease in level decreases the output signal. With reverse action, an increase in level decreases the output signal, and a decrease in level increases the output signal.
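A sketch of the expected 4-20 mA outputs for direct and reverse acting transmitters (the level fraction used is illustrative):

```python
# Hypothetical output calculation for direct and reverse acting level transmitters.
def output_ma(level_fraction: float, direct_acting: bool = True) -> float:
    """level_fraction is 0.0 at the lower range value and 1.0 at the upper range value."""
    if direct_acting:
        return 4.0 + 16.0 * level_fraction   # output rises as level rises
    return 20.0 - 16.0 * level_fraction      # output falls as level rises

print(output_ma(0.25, direct_acting=True))    # 8.0 mA
print(output_ma(0.25, direct_acting=False))   # 16.0 mA
```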

Calibration

When the chamber is empty, the corresponding output should be 4 mA. If the milliammeter displays a value that is greater than or less than 4 mA, adjust the zero.

To correct span, fill the chamber to the upper range value, and turn the span adjustment until 20 mA is produced.

Linearity is not always adjustable on this type of transmitter; check the manufacturer's specifications.

Adjust both zero and span until the transmitter performs within specifications.

Liquid-Liquid Interface

The same sensing principle used to measure a liquid-vapor interface can be used to locate the interface between two liquids.

The heavier of the two liquids exerts more buoyant force, so as the lower phase rises or falls, the displacer travels with it.

To create calibration conditions that replicate the process, use liquids that have the same specific gravities as the process fluids. To check and adjust the zero, fill the chamber to 100% with the lighter fluid.

Check span by filling the chamber 100% full of the heavier phase. Adjust span until 20 mA output is produced.

Check mid-range output and recheck zero and span.
