Humidity Sensors for Industrial Applications
It could be argued that humidity plays a part in every industrial production process. The very fact that our atmosphere contains water vapour bears witness to this, if only because the end product is likely to be stored, and eventually used, in that atmosphere; the product's potential performance under varying conditions of humidity must therefore be known. The extent to which humidity plays a part in any given production process may vary, but in many cases it is essential that, at the very least, it is monitored and, in most cases, controlled. It may also be said that humidity is a more difficult property to define and measure than associated parameters such as temperature and pressure. Indeed, it is a truly analytical measurement in which the sensor must contact the process environment, in contrast to pressure and temperature sensors, which are invariably insulated from the process by a diaphragm and a thermowell respectively. This of course has implications for contamination and degradation of the sensor to varying degrees depending on the nature of the environment.
This paper reviews various humidity sensor technologies and their typical applications in the context of the measurement ranges to which they are best suited. The effects of contamination, highly significant in view of the analytical nature of the measurement, are briefly assessed. In conclusion, it is suggested that, if initial cost is not the prime consideration, the chilled mirror, optical dew point hygrometer offers the most accurate, repeatable and reliable method of humidity measurement with the widest possible range.
Humidity Measurement Applications in a Range of Industries
Table 1 shows an A to Z of industries where humidity measurement plays a part. Whilst the list is by no means exhaustive, it does serve to illustrate the extremely wide range of applications with which a supplier of humidity instrumentation may be confronted. Indeed, these applications cover six orders of magnitude when considered in terms of Parts Per Million (PPM) by volume of water vapour, equivalent to an overall range of -85 to +100°C dew point. It is of course very unlikely that one measurement technique can cover the entire range but, if initial cost is not the prime consideration, the chilled mirror, optical dew point hygrometer can probably be said to come closest to achieving this.
In practice, a variety of commercial and technical criteria will dictate which measurement technology is used for any particular application. Table 2 shows the most common humidity measurement parameters used within the industries referenced, depending on application, and Table 3 illustrates how certain sensor technologies are associated with specific industries as dictated by the commercial pressures and technical demands of the measurement. These aspects are themselves invariably influenced by the criticality of the measurement.
Two important points to note are that different units are used for different parts of the measurement range, and that the measurement units an industry uses are very often a good indicator of the type of sensor technology it should be employing. Humidity measurement determines the amount of water vapour present in a gas. This gas can be a mixture, such as air, or a pure gas, such as nitrogen or argon. While there are many measurement techniques, the most common parameters are Relative Humidity (RH), Dew/Frost Point (D/F PT) and Parts Per Million (PPM).
Relative Humidity Measurement (RH)
An RH measurement is the ratio of the partial pressure of water vapour present in the gas to the saturation vapour pressure of water at the same temperature. Thus, RH is a function of temperature. The measurement is expressed as a percentage.
The human body is sensitive to varying RH and experiences it as the contrast between a dry and a muggy summer day.
Dew/Frost Point Measurements (D/F PT)
Dewpoint is the temperature (above 0°C) at which the water vapour in a gas condenses to liquid water. Frost point is the temperature (below 0°C) at which the vapour crystallises to ice. D/F PT is a function of the pressure of the gas but is independent of temperature and is therefore defined as fundamental.
We can all observe the dew point phenomenon in our bathrooms. On a cold day, when the surface temperature of a mirror, or of a polished metal surface such as a tap, is below the dew point of the atmosphere, a dew or condensation layer will form on that surface.
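The pressure dependence noted above underlies a common industrial calculation: converting a dew point measured at line pressure to its value at atmospheric pressure. A minimal Python sketch follows, assuming one common set of Magnus-approximation coefficients (other coefficient sets exist) and ideal-gas behaviour:

```python
import math

# Magnus approximation constants over water (assumed values)
A, B, C = 6.112, 17.62, 243.12  # hPa, dimensionless, degC

def svp(t_c):
    """Saturation vapour pressure of water (hPa) at t_c degC."""
    return A * math.exp(B * t_c / (C + t_c))

def dew_point(e_hpa):
    """Invert the Magnus formula: dew point (degC) for vapour pressure e."""
    g = math.log(e_hpa / A)
    return C * g / (B - g)

def dew_point_at_pressure(td_c, p_from_bar, p_to_bar):
    """Dew point after an ideal-gas pressure change: the water vapour
    partial pressure scales in proportion to the total pressure."""
    e = svp(td_c) * (p_to_bar / p_from_bar)
    return dew_point(e)

# Example: gas with a +10 degC dew point at 7 bar line pressure,
# expanded to 1 bar (atmospheric); the dew point falls to about -16 degC
print(round(dew_point_at_pressure(10.0, 7.0, 1.0), 1))
```

Note that the round trip `dew_point(svp(t))` returns `t` exactly, a convenient self-check; below 0°C a coefficient set for ice would strictly be more appropriate.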
Parts Per Million (PPM)
Water vapour content expressed as a volume fraction (PPMv) or, if multiplied by the ratio of the molecular weight of water to that of air, as a weight fraction (PPMw).
This parameter is more difficult to visualise, as changes of this magnitude in the atmosphere are beyond the ability of the human body to detect. However, a practical industrial example is that of medical gases: gases such as nitrous oxide, carbon dioxide and oxygen, when used in surgical operations, should have a moisture content lower than 60 PPM and are regulated in this regard.
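The PPMv/PPMw relationship described above is a simple ratio of molecular weights. A sketch, assuming standard molecular-weight values for water and dry air:

```python
# Molecular weights (g/mol); the dry-air figure is an assumed standard value
MW_WATER = 18.015
MW_AIR = 28.965

def ppmv_to_ppmw(ppmv, mw_gas=MW_AIR):
    """PPM by weight = PPM by volume x (MW of water / MW of carrier gas)."""
    return ppmv * MW_WATER / mw_gas

def ppmw_to_ppmv(ppmw, mw_gas=MW_AIR):
    """Inverse conversion, back to PPM by volume."""
    return ppmw * mw_gas / MW_WATER

# The 60 PPM(v) medical-gas limit mentioned above, expressed by weight in air:
print(round(ppmv_to_ppmw(60.0), 1))  # 37.3
```

For a carrier gas other than air (e.g. nitrous oxide or argon), the appropriate molecular weight should be passed in place of the default.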
The Consideration of some Sensor types and their Application
As previously stated, the wide range of humidity measurement required by the various industries described in Table 1 precludes any one sensor technology from being suitable for all applications. In view of this, and the subject matter constraints on this paper, what follows is a summary of some of the sensor technologies typically used in the industries referenced.
Relative Humidity Measurement Techniques
Relative humidity measurements can be made by psychrometry, displacement, resistive, capacitive and liquid adsorption sensors. Some of these techniques are described below.
Wet Bulb/Dry Bulb Psychrometer
Psychrometry has long been a popular method for monitoring humidity, primarily due to its simplicity and inherent low cost. A typical industrial psychrometer consists of a pair of matched electrical thermometers, one of which is in a wetted condition. In operation, water evaporation cools the wetted thermometer, resulting in a measurable difference between it and the ambient, or dry bulb measurement. When the wet bulb reaches its maximum temperature depression, the humidity is determined by comparing the wet bulb/dry bulb temperatures on a psychrometric chart.
Whilst the psychrometer provides high accuracy at near saturation (100% RH) conditions and is simple to use and repair, its accuracy at lower relative humidities (below 20%) is poor and maintenance requirements are intensive. It cannot be used at temperatures below 0°C and, because the psychrometer is itself a source of moisture, it cannot be used in small, closed volumes.
Psychrometers are typically used to control climatic/environmental chambers.
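The psychrometric-chart lookup described above can be approximated numerically. A sketch using the Magnus formula and an assumed psychrometer coefficient (6.62e-4 per kelvin, typical of well-ventilated instruments but varying with design):

```python
import math

def svp(t_c):
    """Magnus saturation vapour pressure (hPa); coefficients are assumed values."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

# Psychrometer coefficient for a well-ventilated instrument; this value is
# an assumption and varies with the instrument's design and airflow
A_PSY = 6.62e-4  # 1/K

def rh_from_psychrometer(t_dry, t_wet, p_hpa=1013.25):
    """Approximate %RH from dry- and wet-bulb temperatures (degC),
    standing in for the psychrometric-chart lookup."""
    # actual vapour pressure from the wet-bulb depression
    e = svp(t_wet) - A_PSY * p_hpa * (t_dry - t_wet)
    return 100.0 * e / svp(t_dry)

# Example: 25 degC dry bulb with an 18 degC wet bulb reads about 50 %RH
print(round(rh_from_psychrometer(25.0, 18.0), 1))
```

With no wet-bulb depression the function returns exactly 100 %RH, consistent with the saturation condition described above.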
Displacement Sensor

Perhaps the oldest type of RH sensor still in common use is the displacement sensor. These devices use a strain gauge or other mechanism to measure the expansion or contraction of a material in proportion to changes in relative humidity. The most common materials in use are hair, nylon and cellulose. The advantages of this type of sensor are that it is inexpensive to manufacture and highly immune to contamination. The disadvantages are a tendency to drift over time and significant hysteresis effects.
Bulk Polymer Resistive Sensor
These electrical sensors provide a direct, secondary measurement of relative humidity. They consist of an insulating ceramic substrate on which a grid of interdigitated electrodes is deposited. These electrodes are coated with a humidity-sensitive salt embedded in a polymer resin. The resin is then covered by a protective coating that is permeable to water vapour. As water permeates the coating, the polymer is ionised and the ions become mobile within the resin. When the electrodes are excited with an alternating current, the impedance of the sensor is measured and used to derive the percent relative humidity (%RH).
By virtue of their structure, bulk polymer resistive sensors are relatively immune to surface contamination. Although surface build up does not affect the accuracy of the sensor, it does have an adverse effect on the response time. Due to the extremely high resistance at RH values of less than 20%, this sensor is generally better suited to the higher RH ranges.
Organic Polymer Capacitive Sensor

The capacitive sensor (organic polymer capacitive) is usually designed with parallel plates with porous electrodes, or with interdigitated fingers on a substrate. The dielectric material absorbs or desorbs water vapour from the environment with changes in humidity. The resultant change in the dielectric constant causes a capacitance variation which, in turn, provides an impedance that varies in relation to humidity. A dielectric constant change of approximately 30% corresponds to a 0-100% variation in RH.
The sensor material is made very thin to achieve a large signal change with humidity. This permits the water to enter and leave easily and also allows for fast drying and easy calibration of the sensor.
The measurement is made from a large base capacitance; even the 0% RH reading therefore corresponds to a finite, measurable capacitance.
This sensor type is ideally suited for use in high temperature environments because the temperature coefficient is low and the polymer dielectric can withstand high temperature. Capacitive sensors are also suitable for applications requiring a high degree of sensitivity at low humidity levels, where they will provide a relatively fast response. At RH values over 85% however, the sensor has a tendency to saturate and become non-linear.
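As an illustration of the roughly linear capacitance-to-RH relationship noted above (~30% change over the full range), the following sketch inverts a simple linear model; the base capacitance value is hypothetical:

```python
# Illustrative linear model of the polymer capacitive sensor: a ~30%
# capacitance change spans 0-100 %RH, from a finite base capacitance C0.
C0_PF = 180.0   # assumed dry (0 %RH) base capacitance, pF
SPAN = 0.30     # assumed fractional capacitance change over the full RH range

def rh_from_capacitance(c_pf):
    """Invert the linear model: %RH from a measured capacitance (pF)."""
    return (c_pf / C0_PF - 1.0) / SPAN * 100.0

# A reading of 207 pF corresponds to the mid-range:
print(round(rh_from_capacitance(207.0), 1))  # 50.0
```

In practice the response is not perfectly linear, particularly above 85 %RH where, as noted, the sensor tends to saturate; real instruments apply a calibrated correction curve.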
Typical applications for the polymer resistive and polymer capacitive sensors are:
- HVAC energy management.
- Computer room / Clean room control.
- Handheld devices.
- Environmental and meteorological monitoring.
Relative Humidity computed from Dew Point & Temperature
Alternatively, RH can be derived from a combined dew point and temperature measurement. For example, an optical dew point transmitter with a temperature measurement facility could be used to provide a high accuracy RH value. This would represent a relatively expensive ‘secondary’ output from a primary measurement.
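Such a derivation follows directly from the definition of RH: the vapour pressure at the dew point divided by the saturation vapour pressure at the air temperature. A sketch using assumed Magnus coefficients:

```python
import math

def svp(t_c):
    """Magnus saturation vapour pressure (hPa); coefficients are assumed values."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_from_dewpoint(t_air_c, t_dew_c):
    """%RH as the ratio of the vapour pressure at the dew point to the
    saturation vapour pressure at the air temperature."""
    return 100.0 * svp(t_dew_c) / svp(t_air_c)

# Example: air at 25 degC with a measured dew point of 15 degC is about 54 %RH
print(round(rh_from_dewpoint(25.0, 15.0), 1))
```

When the dew point equals the air temperature the result is exactly 100 %RH, as expected at saturation.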
Devices typically used for Dew/Frost Point (D/F PT) Measurements
The saturated salt lithium chloride sensor, the aluminium oxide sensor and the optical chilled mirror sensor are all used to measure D/F PT directly. These sensors provide a wide measurement range in terms of dew or frost point.
Saturated Salt Lithium Chloride Sensor
The saturated salt lithium chloride sensor has been one of the most widely used dew point sensors. Its popularity stems from its simplicity, low cost, durability, and the fact that it provides a fundamental measurement.
The sensor consists of a bobbin covered with an absorbent fabric and a bifilar winding of inert electrodes. The bobbin is coated with a dilute solution of lithium chloride. An alternating current is passed through the winding and the salt solution causing resistive heating. As the bobbin heats, water evaporates from the salt at a rate which is controlled by the vapour pressure of water in the surrounding air. As the bobbin begins to dry out, the resistance of the salt solution increases causing less current to flow through the winding and allowing the bobbin to cool. This heating and cooling of the bobbin reaches an equilibrium point where it neither takes on nor gives off water. This equilibrium temperature is directly proportional to the water vapour pressure or dew point of the surrounding air. This value is measured using a platinum resistance thermometer (PRT) and output directly as a D/F PT.
If a saturated salt sensor becomes contaminated, it can easily be cleaned and recharged with lithium chloride. The limitations of the technology are a relatively slow response time and a lower limit of the measurement range which is imposed by the nature of the lithium chloride. The sensor cannot be used to measure dew points when the vapour pressure of water is below the saturation vapour pressure of lithium chloride, which occurs at about 11% RH.
Saturated salt sensors are an attractive proposition when a low cost, rugged and moderately accurate sensor is required and a slow response can be tolerated. They are typically used for the following applications:
- Refrigeration controls
- Air line monitoring
- Pill coaters
For applications requiring greater accuracy and/or a wider range of measurement, condensation-type, electrolytic, or oxide sensors should be considered.
Aluminium Oxide Dew Point Sensors
The aluminium oxide dew point instrument and its derivatives, such as ceramic or silicon based sensors, are secondary measurement devices that infer the D/F PT value from the way in which the capacitance measurement is affected by the humidity environment in which it is situated. They are available in a variety of types, from low-cost, single point systems, including portable battery operated models, to multi-point, microprocessor based systems with the capability to compute and display humidity information in different parameters.
A typical aluminium oxide sensor is a capacitor, formed by depositing a layer of porous aluminium oxide on a conductive substrate and then coating the oxide with a thin film of gold. The conductive base and the gold layer form the capacitor’s electrodes. Water vapour penetrates the gold layer and is absorbed by the porous oxide. The number of water molecules absorbed determines the electrical impedance of the capacitor, which is in turn proportional to the water vapour pressure.
Oxide sensors are small in size and lend themselves to in-situ use. They are suitable for low frost point measurement (-100°C) and can operate over a relatively wide span encompassing high pressure applications. They can also be used to measure moisture in liquids and, due to low power usage, are suitable for intrinsically safe and explosion proof installations.
Aluminium oxide sensors are frequently used in the petrochemical and power industries where low dew points are to be monitored “in line” with economical multiple sensor arrangements.
The main disadvantage associated with these sensors is that they are secondary measurement devices and must be frequently recalibrated to accommodate ageing effects, hysteresis and contamination.
Chilled Mirror (Optical Condensation) Hygrometer
The chilled mirror hygrometer is widely considered to be the most precise method for dew point measurement. It provides a primary measurement, measuring, as its name suggests, the actual condensation point of the ambient gas, and can easily be made traceable to international calibration standards such as UKAS and NIST. The sensor contains a small metallic mirror, the surface of which is chilled until water condenses out of the sample gas onto the mirror surface.
The mirror is illuminated by a light source and the reflection is detected by a phototransistor. At the occurrence of condensation, the reflected light is scattered and, therefore, reduced at the detector. A control system keeps the temperature of the mirror at the point where a thin film of condensation is maintained. A PRT embedded in the mirror measures its temperature and therefore, the D/F PT temperature.
Accuracies of +/- 0.2°C are possible with chilled mirror hygrometry. Multiple stages of Peltier cooling, supplemented in some cases with additional air or water cooling, can provide an overall measurement range of -85 to almost +100°C dew point. Response times are fast and operation is relatively drift free. Considering also its inert construction and minimal maintenance requirements (the two features are intrinsically linked), the chilled mirror hygrometer is an excellent choice of sensor for demanding applications where the cost can be justified.
It is true to say that you will find an optical dew point hygrometer at the end of most calibration chains and the more robust designs are equally well suited to controlling a critical industrial process as they are to providing the reference standard in a calibration laboratory. Some systems have a fairly sophisticated method of addressing contamination but this issue will be dealt with in more depth within another section of this paper.
Typical applications for the optical condensation hygrometer:
- Medical air lines
- Liquid cooled electronics
- Cooled computers
- Heat treating furnaces
- Smelting furnaces
- Clean room controls
- Humidity calibration standards
- Engine test beds
Devices typically used for PPM Measurements
Electrolytic, piezo-resonance and multi-stage chilled mirror sensors are used to measure water vapour in the low PPM region. Sometimes process conditions (high temperature, pressure, corrosive gases) and/or the sensor technology in use will preclude an in-situ measurement, requiring a sample system instead. When making measurements in this range with a sample system, it is vital to ensure that all fittings are gas tight, that non-hygroscopic materials (e.g. stainless steel) are used and that, when initiating the measurement, adequate time is allowed for the system to dry down and equilibrate.
Electrolytic Hygrometer

The electrolytic hygrometer is usually used in dry gas measurements as it provides reliable performance for long periods in the low PPM range. Typically, the electrolytic sensor requires that the gas being measured be clean and not react with a phosphoric acid solution, although recent developments in sensor cell technology and sample conditioning systems allow more hostile applications, such as moisture in chlorine, to be addressed.
The electrolytic sensor utilises a cell coated with a thin film of phosphorous pentoxide (P2O5), which absorbs water from the gas under measurement. When an electrical current is applied to the electrodes, the water vapour absorbed by the P2O5 is dissociated into hydrogen and oxygen molecules. The amount of current required to dissociate the water is proportional to the number of water molecules present in the sample. This number, along with the flow rate and temperature is used to determine the parts per million by volume (PPMv) concentration of the water vapour.
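This relationship is Faraday's law of electrolysis: two electrons are transferred per water molecule, so the cell current fixes the molar rate of water for a known sample flow. A sketch of the PPMv calculation, assuming ideal-gas behaviour, complete electrolysis and flow referenced to 0°C and 1 atm:

```python
FARADAY = 96485.0   # Faraday constant, C/mol
VM_STP = 22414.0    # molar volume of an ideal gas at 0 degC, 1 atm (cm3/mol)

def ppmv_from_current(current_a, flow_sccm):
    """PPMv of water vapour from cell current (A) and sample flow (sccm).
    Assumes every water molecule in the sample is electrolysed."""
    water_mol_per_s = current_a / (2.0 * FARADAY)   # 2 electrons per molecule
    water_cm3_per_s = water_mol_per_s * VM_STP      # as an equivalent gas volume
    gas_cm3_per_s = flow_sccm / 60.0                # sccm -> cm3/s
    return water_cm3_per_s / gas_cm3_per_s * 1e6

# Example: ~143.5 uA at 100 sccm corresponds to roughly 10 PPMv
print(round(ppmv_from_current(143.5e-6, 100.0), 1))
```

This illustrates why, as noted below, the accuracy of the device depends on maintaining a controlled and monitored sample flow: the flow rate enters the calculation directly.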
The electrolytic sensor is used in very dry applications up to a maximum of 1000 PPMv and is well suited for use in industrial processes such as ultra-pure gas, specialist catalyst, fine chemical and integrated circuit production. In each of these cases, the success of the production process depends on the maintenance of inert blanket conditions. This means that very often a continuous supply of either nitrogen or argon is used to purge the production environment. As well as maintaining the purity of the gas, the water vapour content should also be kept very low, and the electrolytic hygrometer is ideally suited to providing dependable measurements in just such an environment.
Other typical applications for this sensor include:
- Ozone generators
- Dry air lines
- Nitrogen transfer systems
- Inert gas welding
In summary, whilst the electrolytic hygrometer can provide a primary, reliable measurement at low moisture, the accuracy of the device is dependent on maintaining a controlled and monitored sample flow. Applications must be selected carefully as certain gases will corrode and/or contaminate the sensor.
Piezo-Resonance Sensor

The piezo-resonance sensor operates on the principle of RH equilibrium: the sorption of water increases the mass of the crystal, which directly affects its resonant frequency.
The sensor has a humidity sensitive coating placed on a resonating crystal surface. The crystal resonant frequency changes as the humidity sensitive coating either absorbs or desorbs water vapour in response to changes in the ambient humidity levels. This resonant frequency is compared to a similar measurement in a dry gas or to a reference frequency that has been calibrated for % RH.
Optical Condensation Hygrometer with maximum cooling capability
As previously stated in the Dew/Frost Point measurement section, an optical condensation hygrometer with multiple stages of Peltier cooling, supplemented in some cases with either additional air or glycol/water cooling, can provide dew point measurements down to -85°C, which corresponds to less than 0.25 PPMv of water at 1 atmosphere pressure.
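The equivalence between a -85°C frost point and roughly 0.25 PPMv can be checked with a Magnus-type expression for saturation vapour pressure over ice (coefficients assumed from common practice):

```python
import math

def svp_ice(t_c):
    """Magnus-type saturation vapour pressure over ice (hPa);
    coefficients are assumed values from the literature."""
    return 6.112 * math.exp(22.46 * t_c / (272.62 + t_c))

def ppmv_from_frostpoint(t_fp_c, p_total_hpa=1013.25):
    """PPMv = (water vapour partial pressure / total pressure) x 1e6."""
    return svp_ice(t_fp_c) / p_total_hpa * 1e6

# The -85 degC lower limit quoted above, at 1 atmosphere: about 0.23 PPMv
print(round(ppmv_from_frostpoint(-85.0), 2))
```

The result agrees with the "less than 0.25 PPMv" figure quoted in the text.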
The Problems of Contamination
In order to understand the significance of the potential effects of contamination on a humidity sensor it is appropriate at this stage to refer back to a statement made within the introduction to this paper:
‘Humidity is a truly analytical measurement in which the sensor must contact the process environment, in contrast to pressure and temperature sensors, which are invariably insulated from the process by a diaphragm and a thermowell respectively. This of course has implications for contamination and degradation of the sensor to varying degrees depending on the nature of the environment’.
Very little contamination should exist when monitoring dew point or PPM levels in a pure gas stream; however, in most industrial processes there is a high potential for contamination, either by particulates borne directly in the process gas or by soluble contaminants contained within the very moisture it is necessary to measure.
All the sensors referenced in this paper are affected by both soluble and insoluble contaminants. Unfortunately, many of the sensors when contaminated will not appear to be so but will be seen to be providing a very logical measurement value to the humidity control system. Without periodic checking and recalibration the only evidence that the sensor has “gone to sleep” will be derived from the gradual appearance of inferior product in some form or other; therefore, with the majority of humidity sensors it is essential that periodic maintenance should include checks of response and accuracy.
This may be done with humidity calibration systems, which include saturated and unsaturated salts, relative humidity and dew point generators.
Two approaches are adopted to try to accommodate contamination effects. One approach is to devise a sensor in which the detrimental effect of contamination is reduced, thereby prolonging the active life of the sensor. This may be inherent in the sensor design itself (this is the concept behind the bulk polymer resistive RH sensor) or may be effected by introducing some form of filter or sheath into the system. The more physical barriers placed between the sensor and the environment, however, the more problems are encountered in trying to make a viable and accurate measurement. Once contaminated and blocked, a filter may create an unrepresentative microenvironment between itself and the sensor. The measurement is therefore limited in terms of accuracy and response time, and the filter will only intercept particulate contamination. The second approach is to accept that contamination will take place and devise a way in which it can be monitored and, if possible, compensated for.
One measurement technique that falls into the latter category is the optical dew point hygrometer, which can incorporate a self-checking feature within the electronic control unit, operated either manually or automatically (in the case of the most sophisticated designs). Because the optical control system continually views the mirror's surface, and therefore the dew or frost layer formed upon it, the hygrometer provides a continuous, live measurement of the dewpoint or humidity value and will react to contamination deposited on the mirror, whether as solid particulate or as salts contained within the water vapour being monitored.
When the dewpoint sensor is first put into operation and the mirror is clean, a perfect layer of condensation may be maintained on its surface, and high accuracy and repeatability will result. As the sensor continues to operate, however, sometimes for weeks or even months, contaminants gradually drop out of the sample stream onto the mirror. These contaminants can cause two types of error, as follows.
Solid Particulate Contaminants
Just as a dew layer decreases the quantity of light reflected from the mirror to the light detector, so too does the increasing build-up of non-water-soluble contamination. If this were allowed to continue indefinitely, the system would go out of control and read out a large dewpoint error. Before this occurs, the mirror must be cleaned; in all industrial applications for dewpoint sensors it is recommended that the sensor mirror be cleaned before process measurement commences.
Water Soluble Contaminants
Water soluble contaminants, usually in the form of natural salts, often occur in the sample. These salts go into solution with the pure water on the mirror surface and lower its vapour pressure. This can result in an excess build-up of water on the mirror (deliquescence) at the true dewpoint. The servo control loop detects the resulting loss of received light and raises the mirror temperature to compensate, i.e. it evaporates some of the excess water. A positive error of several degrees may result. This phenomenon is called the Raoult effect, since it is described by Raoult's Law.
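The magnitude of this positive error can be estimated from Raoult's Law: the servo settles at the temperature where the depressed vapour pressure of the solution matches the ambient vapour pressure. A sketch with an assumed water mole fraction of 0.95 (a hypothetical contamination level) and assumed Magnus coefficients:

```python
import math

def svp(t_c):
    """Magnus saturation vapour pressure over water (hPa); coefficients assumed."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def dew_point(e_hpa):
    """Invert the Magnus formula: dew point (degC) for vapour pressure e."""
    g = math.log(e_hpa / 6.112)
    return 243.12 * g / (17.62 - g)

def raoult_dewpoint_error(true_td_c, x_water=0.95):
    """Positive dewpoint error when the mirror condensate is a salt solution.
    Raoult's Law: the solution's vapour pressure is x_water times that of
    pure water, so the servo settles at the higher temperature T' where
    x_water * svp(T') equals the ambient vapour pressure."""
    e_ambient = svp(true_td_c)
    t_indicated = dew_point(e_ambient / x_water)
    return t_indicated - true_td_c

# Example: a 5% vapour-pressure depression at a true dewpoint of 20 degC
# produces a positive error approaching one degree
print(round(raoult_dewpoint_error(20.0), 2))
```

Heavier contamination (a lower water mole fraction) drives the error towards the "several degrees" quoted in the text.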
Several contaminant error correction techniques have been developed over time for the optical dewpoint hygrometer. Early systems simply used a manual balance technique, which was then developed into an automatic balance control (ABC). Later, twin beam/twin mirror and continuous balance systems evolved. All these methods involve re-balancing or offsetting the optical sensor bridge to compensate for the accumulated contamination error on the mirror. These techniques are effective in correcting only the particulate contamination described above; they will not correct contamination errors arising through the Raoult effect, because the automatic servo loop cannot differentiate between an excessively thick dew layer due to an increase in actual dewpoint and an excessively thick dew layer caused by salt contamination on the mirror. In both cases the servo loop will make a positive temperature correction and evaporate some of the dew layer; it will correct in the first case but will cause a readout error, potentially of several degrees, in the second.
An error correction technique called PACER (Programmable Automatic Contaminant Error Reduction) was developed by General Eastern as an effective way of reducing errors due to the Raoult effect. The PACER correction cycle starts with a coalescence period; that is, the mirror is intentionally cooled below the dewpoint of the sample, condensing out a large amount of water. This excess water dissolves any water soluble contamination. The mirror is then heated exactly as with an automatic balance system. The large puddles of water gradually evaporate, carrying increasingly heavy concentrations of contaminants until finally, when all the water has evaporated, dry islands of crystallised contaminants are left on the mirror.
Now 80 to 85% of the surface is clean and reflective where before the entire surface was covered with contaminant. The system then proceeds to grow a new dew layer in the clean areas on the mirror surface and a further period of error free operation follows.
Eventually, of course, the system will have to be shut down for cleaning, but when that point is reached the instrument will advise the user by a visual alarm or by electronic means on a control panel. During the periodic PACER cycle, which typically lasts a few minutes, the analogue output and digital display remain at the dewpoint level prevailing immediately before the cycle; therefore, the actual process control value is maintained. At the end of the PACER cycle, real time dewpoint information is once again displayed and a bumpless transfer occurs to any new dew or frost point value measured.
Another type of optical device that merits reference at this point is the Cycling Chilled Mirror (CCM) hygrometer. This hygrometer employs a different method of addressing contamination in that it has no error reduction circuitry as such but simply limits the amount of time the mirror surface spends in the wet state. Cooling of the mirror to the prevailing dew point is performed on a cyclic basis. Once the dew point has been detected and reported, the mirror heats to a temperature slightly above ambient and then ‘waits’ in the dry condition. The measurement cycle then repeats. As the mirror is wet for a relatively small amount of time, the potential for contaminants to fall out of the gas stream into the dew is reduced.
In summary, all humidity sensors are affected by the environment they are monitoring, which can lead to contamination and eventual insensitivity to a changing process humidity condition. The General Eastern PACER cycle, automatic balance control or manual balance adjustment available with chilled mirror optical dewpoint hygrometers not only enables the process operator to check whether the sensor is becoming contaminated but also allows adjustment, automatic or manual, to compensate up to the point at which maintenance in the form of mirror cleaning is required. As a result, optical dewpoint hygrometers can generally operate continuously and unattended for longer periods than most other humidity measurement systems and provide what is probably the most accurate, repeatable and reliable humidity measurement available for process industry monitoring, particularly in heavily contaminated atmospheres. Some devices, such as the Cycling Chilled Mirror (CCM) hygrometer, simply attempt to limit the effects of contamination by reducing the amount of time the mirror spends in the ‘wet’ state. The sensor cools to the prevailing dew point on a cyclic basis and therefore reduces the potential for contaminants to be deposited in the dew on the mirror surface. It should be recognised, however, that this is not a ‘live’ dew point measurement technology and it is not suitable for applications where process conditions are subject to rapid change.
Having reviewed a wide range of industrial humidity measurement sensors, it is clear that no one measurement technique is suitable for all applications; moreover, whatever technique is used, the process environment will eventually contaminate the humidity sensor. The question which normally arises is: at what point did the sensor become so contaminated that it was no longer able to give a reliable and accurate measurement?
The phrase often heard in the humidity measurement industry is that the sensor has “gone to sleep”, i.e. it appears to be measuring a very logical humidity value but has become totally insensitive to process humidity changes. In most cases, only removal of the sensor from the process for periodic checking and recalibration can overcome this problem. Sometimes the sensor is so badly contaminated that it has to be replaced.
The one measurement technique, however, which can overcome this problem to a great degree is the chilled mirror optical dewpoint hygrometer. If initial cost is not the governing factor in any given humidity measurement application, then the chilled mirror optical dewpoint hygrometer would appear to provide the most versatile method of humidity measurement, having built-in features which allow it to monitor the degree of contamination occurring and adjust its performance to compensate. When this adjustment, which can be manual or automatic, reaches its limit, the instrument will advise the user by an operational indication or an alarm, thus allowing in-situ cleaning of the sensor and re-standardisation to put it back into operation.
The optical chilled mirror hygrometer, therefore, not only provides what is considered to be the most accurate method of measurement with a wide measurement range, facilitated by using sensors with varying depression capabilities, but also the most repeatable and reliable measurement due to its self-checking capability. Being a primary measurement technique, it is also accepted as a traceable, continuous, on-line measurement, which is particularly relevant where traceable production quality is required. It is, perhaps, only the relatively high initial cost of this type of hygrometer that prevents it from being used more widely to solve industrial humidity measurement problems.