
Thursday, April 19, 2018

Wireless Process Control Instrumentation

Cost cutting is a fact of life for every industry. Whether the goal is more efficient operations or compliance with current regulations, the need to build a better mousetrap is always present.

A very promising cost-cutting technology is wireless instrumentation. Wireless provides a compelling argument to change when you consider installation and overall cost effectiveness. Even more so when the application is located in a harsh environment, or where toxic or combustible situations exist. These robust devices provide critical performance data around the clock in the most inhospitable place in the plant, and operate through rain, wind, high temperatures and high humidity.

Untethered by cables and hard-wiring, wireless instrumentation is easier to deploy and monitor. Wireless transmitters are available for monitoring virtually all process variables, such as pressure, temperature, level, flow, density, and acoustics. Networks of up to 100 field devices (at 900 MHz) can be created and then monitored by a single base radio or access point, with a typical communication range of over 1/2 mile. Because these devices communicate over the industry-standard Modbus protocol, compatibility between device manufacturers is ensured.
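Since Modbus carries process values as 16-bit holding registers, a floating-point reading from a transmitter typically spans a pair of registers. As a minimal sketch (plain Python, no Modbus library; the big-endian word order is an assumption, since some devices swap the words), here is how a register pair decodes into an engineering value:

```python
import struct

def registers_to_float(high_word, low_word):
    """Decode two 16-bit Modbus holding registers into an IEEE-754 float.
    Assumes big-endian word order (high word first); some devices swap words."""
    raw = struct.pack(">HH", high_word, low_word)
    return struct.unpack(">f", raw)[0]

# Example: the register pair 0x42C8, 0x0000 decodes to 100.0
print(registers_to_float(0x42C8, 0x0000))
```

In practice the register pair would come from a read of the device's holding registers; consult the transmitter's register map for the actual addresses and word order.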
Wireless Instrumentation (Accutech and Foxboro)

The most obvious reason for choosing wireless over hard-wiring is the cost savings associated with running wires and cables. Savings estimates as high as 70% can be realized by deploying wireless field devices compared to the same application using cables. Additional savings are realized when you consider that these devices run on batteries and that expanding a network costs little more than the price of the new device.

Wireless instruments also provide significant benefits in safety and compliance by keeping personnel out of hazardous areas. Areas that would require occasional human visitation can be safely monitored through remote monitoring.

So, what's the hold up? If the benefits are so clear, and the argument is so strong, why is there still reluctance to embrace wireless technology?

There are three main concerns:

Reliability
Wireless instrumentation must provide the same reliability (real and perceived) as traditional wired units. Every engineer, operator, and maintenance person knows wires. Troubleshooting wires is easy, and understanding wire failures is basic: the wire is either cut or shorted. With wireless, however, air is the communication medium and radio signals replace wires. Radio signals present more complicated failure modes than wires; signal strength, signal reflection, and interference are all possible impediments to a reliable link.

The good news is that radio frequency design is continuously improving, and the use of new and advanced technologies, such as frequency-hopping radios and high-gain antennas, is enabling wireless devices to create highly reliable links.
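To make "reliable link" concrete, radio engineers compare predicted received signal strength against receiver sensitivity in a link budget. A minimal sketch, using the standard free-space path loss formula and illustrative (not vendor-specific) numbers for a 900 MHz device:

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def link_margin_db(tx_dbm, tx_gain_dbi, rx_gain_dbi,
                   distance_km, freq_mhz, sensitivity_dbm):
    """Margin (dB) between predicted received power and receiver sensitivity.
    A positive margin suggests a workable link; real sites add extra fade
    margin for obstructions, reflections, and interference."""
    received = (tx_dbm + tx_gain_dbi + rx_gain_dbi
                - free_space_path_loss_db(distance_km, freq_mhz))
    return received - sensitivity_dbm

# Illustrative: 21 dBm transmitter, modest antennas, ~1/2 mile (0.8 km),
# base radio sensitivity of -97 dBm
print(round(link_margin_db(21, 2, 6, 0.8, 900, -97), 1))
```

Free-space loss is a best case; walls, terrain, and equipment between the devices reduce the margin, which is why antenna height and receiver sensitivity matter so much in plant deployments.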

Adapting to Existing Infrastructure
Wireless instrumentation networks have to adapt to the existing environment and the placement of structures and equipment. Most times it's just not practical to relocate equipment just to create a reliable wireless link. This can make it challenging to find the optimum location for a base radio or access point that is capable of providing a reliable communication link to your wireless instruments. Furthermore, accommodating the best strategy for one wireless device could negatively affect links with other devices on the same network.

The challenges of adaptability are being overcome through the use of lower frequency bands (such as 900 MHz), which provide longer range, pass through walls more readily, and offer more saturating coverage. Other ways to address adaptability concerns include external, high-gain antennas mounted as high as physically possible, and base radios with improved receiving sensitivity.

Integration with Existing Communications
Engineers, operators, and maintenance crews are challenged by integrating wireless instrumentation networks with other, existing field communications systems. Having to manage and troubleshoot multiple networks adds complexity to existing systems. This creates a conflict between the financial argument for adopting wireless instrumentation and the possible cost of increasing the data-gathering capabilities of an existing system. For instance, a SCADA system needs to handle the additional data input from wireless devices but may not have the capacity. Adding that data capacity to the SCADA system can be expensive, and therefore offset the wiring and cabling savings.

The financial argument for industry to adopt wireless instrumentation networks is persuasive, but its acceptance in the process control industry is slow. Reliability, adaptability, and integration are all challenges that must be overcome before widespread adoption occurs. Eventually though, the reality of dramatically reduced deployment and maintenance costs, increased safety, and improved environmental compliance will tip the scale and drive wireless as the standard deployment method.

Always consult with an experienced applications engineer before specifying or installing wireless instrumentation. Their experience and knowledge will save you time, cost, and provide another level of safety and security.

Monday, March 26, 2018

Process Instrumentation and Noise

Protect process instrumentation from electrical noise.
Instrument noise, and how to eliminate it, is an important consideration in process control instrumentation. Noise represents variations in a process variable measurement that are not reflective of actual changes occurring in the process variable. Typically, electrical sources such as high-voltage wiring, electric motors, relays, contactors, and radio transmitters are the primary causes of instrument noise.

No matter the cause of the noise, the measurement signal is distorted and does not reflect the true state of the process at a given time. Noise degrades the accuracy and precision of process measurements and can also contribute to errors in the control system, since controller output can reflect the noise affecting a process variable.

Grounding reduces noise stemming from electrical systems. Shielded cabling, separating signal cabling from other wiring, and repairing or replacing faulty sensors also reduce noise. Low-pass filters compensate for the noise that remains, and much of the instrumentation used in process systems incorporates noise-damping features automatically. Determining the best filter to use depends heavily on the cut-off frequency, alpha value, or time constant.

The ideal low-pass filter would eliminate all frequencies above the cut-off frequency while leaving every frequency below it unaffected. However, this ideal filter is only achievable mathematically; real applications must approximate it. Practical filters compute a finite impulse response and must also delay the signal slightly. To achieve better filter accuracy, a longer delay is needed so that the filter computation “sees” a bit further into the future. Calibrating these filters relies heavily on the desired accuracy of the process, while also taking specific steps to best fit the filter to a particular process.
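In practice, much process instrumentation applies a simple first-order (exponential) low-pass filter rather than an ideal one. A minimal sketch, where alpha plays the role of the tuning constant mentioned above (the sample values are illustrative):

```python
def low_pass(samples, alpha):
    """First-order exponential low-pass filter.
    alpha near 1.0 -> light smoothing, little lag;
    alpha near 0.0 -> heavy smoothing, more lag (added dead time)."""
    y = samples[0]          # seed the filter with the first reading
    out = []
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out

noisy = [50.0, 52.1, 49.2, 51.8, 48.9, 50.4]   # noisy readings around 50
print(low_pass(noisy, 0.3))
```

Each filtered value is a weighted blend of the new reading and the previous filtered value, which is exactly the smoothing-versus-lag trade-off the text describes: smaller alpha smooths more but delays the signal more.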

Noise is important to mitigate because noise observed while measuring the process variable can produce “chatter” in the final control element of a process. This chatter increases wear on mechanical control elements, such as valves, and generates additional cost for the process as a whole. Filtering has its own cost: the filtered signal lags behind the dynamic response of the unfiltered signal because filters add dead time, delaying the sensing of the true process state. The solution is to find a midpoint between signal smoothing and information delay, eliminating noise without unduly slowing the delivery of process information.

For questions about any process control application or challenge, visit or call (800) 892-2769

Sunday, August 13, 2017

The Basics of Process Control Instrument Calibration

Process Control Instrument Calibration
Calibration is an essential part of keeping process measurement instrumentation delivering reliable and actionable information. All instruments utilized in process control are dependent on variables which translate from input to output. Calibration ensures the instrument is properly detecting and processing the input so that the output accurately represents a process condition. Typically, calibration involves the technician simulating an environmental condition and applying it to the measurement instrument. An input with a known quantity is introduced to the instrument, at which point the technician observes how the instrument responds, comparing instrument output to the known input signal.

Even if instruments are designed to withstand harsh physical conditions and last for long periods of time, routine calibration as defined by manufacturer, industry, and operator standards is necessary to periodically validate measurement performance. Information provided by measurement instruments is used for process control and decision making, so a difference between an instrument's output signal and the actual process condition can impact process output, overall facility performance, and safety.

Instrument Calibration Lab
In all cases, the operation of a measurement instrument should be referenced, or traceable, to a universally recognized and verified measurement standard. Maintaining the reference path between a field instrument and a recognized physical standard requires careful attention to detail and uncompromising adherence to procedure.

Instrument ranging applies a series of simulated input conditions to an instrument and verifies that the relationship between input and output stays within a specified tolerance across the entire range of input values. Calibration and ranging differ in that calibration focuses on whether the instrument is sensing the input variable accurately, whereas ranging focuses on the instrument's input-to-output relationship. The difference is important because re-ranging and re-calibration are distinct procedures.

In order to calibrate an instrument correctly, a reference point is necessary. In some cases, the reference point can be produced by a portable instrument, allowing in-place calibration of a transmitter or sensor. In other cases, precisely manufactured or engineered standards exist that can be used for bench calibration. Documentation of each operation, verifying that proper procedure was followed and calibration values recorded, should be maintained on file for inspection.
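Reduced to arithmetic, a bench check compares observed outputs against applied reference inputs and flags any point whose error exceeds a tolerance expressed as percent of span. A minimal sketch (the five-point pattern, 0-250 PSI range, and 0.25% tolerance are illustrative, not a prescribed procedure):

```python
def check_calibration(points, lrv, urv, tol_pct_span):
    """points: list of (applied_input, observed_output) in engineering units.
    lrv/urv: lower and upper range values; span = urv - lrv.
    Returns (all_passed, per-point error as percent of span)."""
    span = urv - lrv
    errors = [abs(observed - applied) / span * 100.0
              for applied, observed in points]
    return all(e <= tol_pct_span for e in errors), errors

# Five-point check of a 0-250 PSI transmitter against a 0.25%-of-span tolerance
points = [(0.0, 0.2), (62.5, 62.9), (125.0, 124.8),
          (187.5, 187.6), (250.0, 249.9)]
passed, errors = check_calibration(points, 0.0, 250.0, 0.25)
print(passed, [round(e, 2) for e in errors])
```

Recording the applied/observed pairs and computed errors is exactly the documentation the paragraph above calls for keeping on file.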

As measurement instruments age, they become more susceptible to drift and declining stability. Calibration should be a required step whenever maintenance is performed, so that each instrument continues to perform against its recorded calibration data and all the instruments in a system function together as a process control unit.

Typical calibration timetables vary depending on specifics related to equipment and use. Generally, calibration is performed at predetermined time intervals, with notable changes in instrument performance also serving as a reliable indicator that an instrument may need a tune-up. A typical recalibration for analog and smart instruments is the zero and span adjustment, where the zero and span values define the instrument's specific range. Accuracy at specific input value points may also be checked, if deemed significant.
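Zero and span fully determine the linear scaling of an analog signal. For a 4-20 mA transmitter, a minimal sketch of that relationship (the 0-250 PSI range is illustrative):

```python
def ma_to_engineering(ma, lrv, urv):
    """Map a 4-20 mA signal onto the calibrated range.
    lrv = zero (the reading at 4 mA); urv - lrv = span."""
    return lrv + (ma - 4.0) / 16.0 * (urv - lrv)

def engineering_to_ma(value, lrv, urv):
    """Inverse mapping: engineering units back to loop current."""
    return 4.0 + (value - lrv) / (urv - lrv) * 16.0

print(ma_to_engineering(12.0, 0.0, 250.0))   # mid-scale
```

A zero adjustment shifts lrv; a span adjustment changes urv - lrv. Re-ranging an instrument is precisely a change to these two values.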

The management of calibration and maintenance operations for process measurement instrumentation is a significant factor in facility and process operation. It can be performed with properly trained and equipped in-house personnel, or with the engagement of highly qualified subcontractors. Calibration operations can be a significant cost center, with benefits accruing from increases in efficiency gained through the use of better calibration instrumentation that reduces task time.

Monday, May 15, 2017

Process Instrument Calibration and Repair

The Mead O’Brien Instrument Shop is fully equipped to handle your instrument calibration and repair needs. Whether it's repair, calibration, or certification services, Mead O’Brien can handle the job. Our technicians are factory trained and certified and can repair and re-calibrate virtually any pressure and temperature transmitter, pressure gauge, pressure switch, thermometer, RTD, or thermocouple.

Monday, May 8, 2017

Industrial Pressure Switches

Industrial Pressure Switch (Ashcroft)
A pressure switch is a device that detects the presence of fluid pressure. Pressure switches use a variety of sensing elements such as diaphragms, bellows, bourdon tubes, or pistons. The movement of these sensors, caused by pressure fluctuation, is transferred to a set of electrical contacts to open or close a circuit.

Normal status of a switch is its resting state, without stimulation. A pressure switch will be in its “normal” status when it senses low or minimum pressure. For a pressure switch, “normal” status is any fluid pressure below the trip threshold of the switch.

One of the earliest and most common designs of pressure switch was the bourdon tube pressure sensor with mercury switch. When pressure is applied, the bourdon tube flexes enough to tilt the glass bulb of the mercury switch so that the mercury flows over the electrical contacts, thus completing the circuit. Many of these pressure switches were sold on steam boilers. While they became a de facto standard, they were sensitive to vibration and to breakage of the mercury bulb.
Pressure Switch Symbols

Another common design uses micro-type electrical switches with force-balanced pressure sensors. The force provided by the pressure-sensing element against a mechanical spring is balanced until one overcomes the other. The tension on the spring may be adjusted to set the trip point, thus providing an adjustable setpoint.

One important criterion of any pressure switch is the deadband, or reset pressure differential. This setting determines the amount of pressure change required to reset the switch to its normal state after it has tripped. The deadband of a pressure switch should not be confused with a differential pressure switch, which actually measures the difference in pressure between two separate pressure ports.
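The deadband behavior amounts to simple hysteresis. A minimal sketch for a high-pressure trip (the setpoint and deadband values are illustrative):

```python
class PressureSwitch:
    """Pressure switch with a trip setpoint and deadband (reset differential).
    Trips at or above the setpoint; resets only after pressure falls to
    setpoint - deadband, preventing chatter around the trip point."""
    def __init__(self, setpoint, deadband):
        self.setpoint = setpoint
        self.deadband = deadband
        self.tripped = False

    def update(self, pressure):
        if not self.tripped and pressure >= self.setpoint:
            self.tripped = True
        elif self.tripped and pressure <= self.setpoint - self.deadband:
            self.tripped = False
        return self.tripped

sw = PressureSwitch(setpoint=100.0, deadband=10.0)
for p in (95.0, 101.0, 93.0, 89.0):
    print(p, sw.update(p))
```

Notice that at 93 PSI the switch stays tripped even though pressure is back below the setpoint; without the deadband, a noisy signal hovering near 100 PSI would cycle the contacts rapidly.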

When selecting pressure switches, you must consider the electrical requirements (volts, amps, AC or DC), the area classification (hazardous, non-hazardous, general purpose, water-tight), the pressure sensing range, the body materials that will be exposed to ambient contaminants, and the wetted materials (parts that are exposed to the process media).

Thursday, September 29, 2016

Understanding Differential Pressure or Delta-P

Differential pressure or Delta-P
Filters and strainers are commonly positioned to capture solids and particulate. The filter obstructs flow through the pipe, lowering the pressure on the downstream side; the effect varies with the filter's construction. Filter media is the material that removes impurities: the smaller its pores, the greater the friction, and higher friction means a greater pressure drop. Contaminants and particulates that build up in the filter reduce flow through the media. As the filter becomes clogged, the downstream pressure drops further, resulting in an increased differential pressure, also referred to as Delta-P. Saturated filters may also begin to shed captured particles.

With the filter no longer functioning properly, contaminants can escape into the process, which is why proper monitoring of pressure drop is crucial. So how can we measure the Delta-P? By placing taps both before and after the filter, a differential pressure measuring instrument can be connected to detect the high-side and low-side pressures; the instrument then reports the difference between the two. The saturation point is indicated when the Delta-P value reaches a predetermined threshold. This value is derived from a calculation that factors in the flow rate, fluid viscosity, and filter characteristics.
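Reduced to arithmetic, the monitoring step is a subtraction and a comparison. A minimal sketch (the threshold would come from the flow/viscosity/filter calculation described above; the numbers here are illustrative):

```python
def delta_p(upstream_psi, downstream_psi):
    """Differential pressure (Delta-P) across the filter."""
    return upstream_psi - downstream_psi

def filter_saturated(upstream_psi, downstream_psi, threshold_psi):
    """True when Delta-P has reached the predetermined service threshold."""
    return delta_p(upstream_psi, downstream_psi) >= threshold_psi

print(delta_p(60.0, 48.5))                  # Delta-P in PSI
print(filter_saturated(60.0, 48.5, 10.0))   # past a 10 PSI threshold?
```

A plant would typically wire this comparison into an alarm or maintenance work order rather than a print statement, but the logic is the same.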

When specifying a differential pressure instrument, there are two important factors to consider. The first is the DP range, based on the maximum difference in pressure the restriction is likely to produce. The second is the instrument's ability to contain the line (static) pressure.

For more information on pressure measurement, call Mead O'Brien at (800) 892-2769 or visit

Here is a great video, courtesy of Ashcroft, that provides an excellent visual understanding of differential pressure.

Thursday, June 30, 2016

A Guide to Instrumentation for Ethanol Fuel Production

Ethanol plant
Ethanol, the common name for ethyl alcohol, is fuel grade alcohol that is produced through the fermentation of simple carbohydrates by yeasts. Fueled by growing environmental, economic, and national security concerns, U.S. ethanol production capacity has nearly doubled in the past six years, and the Renewable Fuel Association (RFA) projects another doubling of the industry by 2012. Ethanol can be made from renewable feedstocks such as grain sorghum, wheat, barley, potatoes, and sugar cane. In the United States, the majority of ethanol is produced from corn.

The two main processes to produce ethanol from corn are wet milling and dry milling.
Foxboro transmitter

Wet milling is more versatile as it produces a greater variety of products, including starch, corn syrup, and sucralose (such as Splenda®). However, with this versatility come higher costs in mill design, building, and operation. If ethanol is the primary product produced, dry mills offer the advantages of lower construction and operations costs, with improved production efficiency. Of the more than 70 U.S. ethanol plants currently being built, only a few are wet mills.

The efficiency of ethanol production has come a long way during the last 20 years. As more large-scale facilities come on line, ethanol producers are faced with the growing challenge of finding innovative ways to maintain profitability while this market matures. An increasingly accepted solution is process automation to assist ethanol producers in controlling product quality, output, and costs. Because sensing and analytical instrumentation represents what is essentially the eyes and ears of any automation system, careful evaluation of instrumentation at the design phase can reduce both equipment and operating costs significantly, while improving overall manufacturing effectiveness.

The following document, courtesy of Foxboro, provides a good overview of instrumentation and the production of ethanol.

Sunday, March 20, 2016

Types of Pressure Measurements Used in Process Control

Pressure gauge
(courtesy of Ashcroft)
Pressure, the measure of a force applied over a specified area, is a straightforward concept. However, depending on the application, there are many different ways of interpreting the force measurement.

As with any type of measurement, results need to be expressed in a defined and clear way to allow everyone to interpret and apply those results correctly. Accurate measurements and good measurement practices are essential in industrial automation and process environments, as they have a direct effect on the success of the desired outcome.

When measuring pressure, there are multiple units of measurement in common use. Most of these units can be combined with SI prefixes such as kilo and mega.

This white paper (courtesy of Turck) will identify the various units of pressure measurement, while discussing when and why certain pressure measurements are used in specific applications.

Wednesday, January 27, 2016

Pneumatic Instruments

Pneumatic transmitters
(courtesy of Foxboro)
Air pressure may be used as an alternative signaling medium to electricity. Imagine a pressure transmitter designed to output a variable air pressure according to its calibration rather than a variable electric current. Such a transmitter would have to be supplied with a source of constant-pressure compressed air instead of an electric voltage, and the resulting output signal would be conveyed to the indicator via tubing instead of wires:

The indicator in this case would be a special pressure gauge, calibrated to read in units of process pressure although actuated by the pressure of clean compressed air from the transmitter instead of directly by process fluid. The most common range of air pressure for industrial pneumatic instruments is 3 to 15 PSI. An output pressure of 3 PSI represents the low end of the process measurement scale and an output pressure of 15 PSI represents the high end of the measurement scale. For a transmitter calibrated to a range of 0 to 250 PSI, for example, a lack of process pressure would result in the transmitter outputting a 3 PSI air signal and full process pressure would result in an air signal of 15 PSI. The face of this special “receiver” gauge would be labeled from 0 to 250 PSI, while the actual mechanism would operate on the 3 to 15 PSI range output by the transmitter. As with the 4-20 mA loop, the end-user need not know how the information gets transmitted from the process to the indicator. The 3-15 PSI signal medium is once again transparent to the operator.

Typically, a 3 PSI pressure value represents 0% of scale, a 15 PSI pressure value represents 100% of scale, and any pressure value between 3 and 15 PSI represents a commensurate percentage between 0% and 100%. The following table shows the corresponding pressure and percentage values for each 25% increment between 0% and 100%. Every instrument technician tasked with maintaining 3-15 PSI pneumatic instruments commits these values to memory, because they are referenced so often:

0% = 3 PSI
25% = 6 PSI
50% = 9 PSI
75% = 12 PSI
100% = 15 PSI
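The 3-15 PSI scale maps linearly to percent of scale just as a 4-20 mA loop does. A quick sketch of the conversion:

```python
def percent_to_psi(percent):
    """Map percent of scale onto the 3-15 PSI pneumatic signal range."""
    return 3.0 + percent / 100.0 * 12.0

def psi_to_percent(psi):
    """Inverse mapping: pneumatic signal back to percent of scale."""
    return (psi - 3.0) / 12.0 * 100.0

for pct in (0, 25, 50, 75, 100):
    print(pct, percent_to_psi(pct))   # 3, 6, 9, 12, 15 PSI
```

The 3 PSI "live zero," like 4 mA in a current loop, lets a technician distinguish a genuine 0% reading from a dead air supply or a plugged line.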

Using the Foxboro model 13A pneumatic differential pressure transmitter as an example, the video below highlights the major design elements of pneumatic transmitters, including an overview of "maximum working pressure" versus "maximum measurement range" pressure.

The Foxboro model 13A pneumatic d/p cell transmitters measure differential pressure and transmit a proportional pneumatic output signal.

The information above is attributed to Tony Kuphaldt and is licensed under the Creative Commons Attribution 3.0 license.

Wednesday, November 4, 2015

Automation Competency Model Helps Guide Future Technical Workforce

Author, Stephen R. Huffman, Vice President, Marketing and Business Development, at Mead O’Brien, Inc.
Eight years ago, the Automation Federation (AF) delegation told an audience at the Employment and Training Administration (ETA) about the people practicing automation careers in industry. Not long before our visit, the ETA, part of the U.S. Department of Labor (DOL), had worked with the National Institute of Standards and Technology (NIST) to develop a “competency model” framework based on the needs of advanced manufacturing. The ETA was eager to engage AF and ISA to use our tiered framework to develop a competency model for the automation profession.

After developing the preliminary model, hosting subject-matter expert (SME) meetings facilitated by the DOL to finalize our work, and then testing the model with several automation managers against their own criteria for validity, we rolled out the Automation Competency Model (ACM) to educators, government, and industry in 2008. Since then, it has been a tool for educators and parents to show students what automation professionals do; for management to understand the skill sets their employees need and to perform gap analysis in reviews; for program developers to create or alter curricula for effective education and training; and for lawmakers to understand how U.S. manufacturing can be globally competitive and what jobs are needed to reach that goal.

In the lower tiers, the model identifies necessary soft skills, including personal effectiveness, academic, and general workplace competencies. Automation-specific work functions, related competencies, and references (e.g., standards, certifications, and publications) are detailed in tier 5. In short, the model stakes out our professional territory and serves as a benchmark for skill standards for all aspects of process and factory automation. Previously, parts of the academic community and some U.S. lawmakers and agencies had the misconception that industrial automation and information technology (IT) are synonymous. Although there has been some convergence between IT and operational technology (OT), much of that perception has changed. OT-based industrial automation and control systems (IACS) were a focus in the recent cybersecurity framework development organized by NIST in response to the presidential executive order on cybersecurity for critical infrastructure.

The ACM has been a great tool for the AF to use to draw new organizational members and working groups, who visualize the big picture in automation career development. Also, we are telling our story and forming partnerships with science, technology, engineering, and math (STEM) organizations such as FIRST and Project Lead the Way. Since forming in 2006, AF now has 16 members representing more than 500,000 automation-related practitioners globally. After two three-year critical reviews, the ACM is still the most downloaded competency model on the DOL website. As a result of our work in creating the ACM and the IACS focus in cybersecurity framework meetings, the DOL asked AF to review a heavily IT-focused Cybersecurity Competency Model. After adding IACS content and the philosophy of plant operation (versus IT) cybersecurity, the released model was a much stronger tool with wider applicability.

Recently, ISA, as a member of the American Association of Engineering Societies (AAES), presented the development of the ACM to AAES leadership as a way to provide tools for lifelong learning in the engineering profession. AF/ISA was once again invited to work with the DOL and other AAES member societies to lead in developing an Engineering Competency Model. The model framework and our experience in ACM development enabled us to identify the front-end skills, necessary abilities, knowledge to be developed, and academic prerequisites for any of the disciplines, plus industry-wide competencies from the perspective of all engineering-related plant functions: design, manufacturing, construction, operations and maintenance, sustainability and environmental impact, engineering economics, quality control and assurance, and environmental health and safety—with emphasis on cyber- and physical security, and plant safety and safety systems.

Now the societies dedicated to each vertical discipline listed in tier 5 will begin to identify all critical work functions, detail all competencies within each function, and note the reference materials. It is important for the participants to see the big picture, consider the future, and keep an open mind; agreement typically comes easily when SMEs participate with that mindset. Once the model through tier 5 is complete, job titles and job descriptions are created. When the DOL accepts the model, the U.S. government officially recognizes these positions. We hope the emerging Engineering Competency Model will be a great tool to address the overall skilled worker shortage. If the automation model is any indication, the new engineering model will have a large impact on achieving the skilled workforce goal.

Thursday, October 15, 2015

Cybersecurity, ISA, and Automation Federation and How We Got Here

Author, Stephen R. Huffman, Vice President, Marketing and Business Development, at Mead O’Brien, Inc.
Published: InTech Magazine, May-June 2015

Technical leaders had the foresight to create the ISA99 standards committee back in 2002. They recognized the need for cybersecurity standards in areas outside of the traditional information technology (IT), national security, and critical infrastructure areas of concentration at the time. In the following years, a number of ISA99 committee members spent time and effort advocating and even testifying on Capitol Hill about our profession, which was not well defined, and our cybersecurity efforts therein, which were not well discerned from IT perceptions.

When Automation Federation (AF) refocused its efforts in 2007 with both automation profession advocacy and industrial automation and control system (IACS) cybersecurity as two of its strategic imperatives, we ventured forth to Capitol Hill with a message and a plan. We found that in general our lawmakers equated process and industrial automation as “IT” and thought that IT was already addressing cybersecurity in terms of identity theft and forensics, and that the Department of Defense was handling cyberprotection for national security. For the next several years, AF built its story around cyberthreats in the operational technology (OT) area and how ISA99 through its series of standards, technical reports, and work group output was providing guidance for asset owners, system integrators, and control system equipment manufacturers specifically for securing IACS.

The operating philosophy of IT cybersecurity versus OT cybersecurity is quite different. Although the approach of shutting down operations, isolating cybersecurity issues, and adding patches may work well to mitigate IT breaches, the same cannot be said for operating units in a real-time process. In short, it really is not feasible to “reboot the plant.” The message resonated enough for us to help create the Lieberman-Collins cybersecurity Senate bill introduced in 2012, but opposition (more political than reasonable) doomed this first effort.

In 2013, the President issued Executive Order 13636 for enhancing cybersecurity protection for critical infrastructure. It directed the National Institute of Standards and Technology (NIST) to establish a framework that organizations, regulators, and customers can use to create, guide, assess, or improve comprehensive cybersecurity programs. Of the more than 200 proposals submitted by organizations responding to the request for proposal, almost all were IT-based. The AF/ISA submittal took the perspective of operational technology, backed by the strength of the existing ISA99 set of standards. After a set of five framework meetings of invited participants, including the AF “framework team,” over the course of 2013, the OT and IACS teams were much more successful in defining the needs, and the automation message was much better understood. NIST personnel who had legislative experience with AF on the 2012 Senate bill understood that private industry is a key piece of the cybersecurity and physical security puzzle.

AF organized a series of NIST framework rollout meetings in 2014 around the country with attendees from the AF team, NIST, and the White House. The meetings were hosted by state manufacturing extension partnerships, which are state units of NIST. After these meetings and more work with Senate lawmakers, a bipartisan Senate bill, The Cybersecurity Enhancement Act, was signed by the President and put into law in December 2014. In summary, the act authorizes the Secretary of Commerce, through the director of NIST, to facilitate and support the development of a voluntary, consensus-based, industry-led set of standards and procedures to cost-effectively reduce cyberrisks to critical infrastructure. As you can imagine, ISA99, now IEC/ISA 62443, will play a more prominent role in securing the control systems of industry in the future through a public-private information-sharing partnership. Thanks to the foresight and fortitude of the ISA99 standards committee.