Merging sensors with communication systems
Research associate Dawn Fuller discusses the Knowledge Transfer Partnership between Partech Instruments and the University of Exeter, and focuses on the design and development of instrumentation in the wastewater industry
With the growing availability of intelligent sensors in the process and monitoring industries, we must consider the implications and the benefits to be gained, both as device manufacturers and end users. There is an ever-increasing need to agree on methods of communicating and managing the enormous amount of data made available by these intelligent sensors. The future will no longer be people talking to people, or even people accessing data through a machine; it will be machines talking to machines on behalf of people.
While instruments continue to gain intelligence, far too often they remain mute, unable to communicate their data to remote systems. The lack of industry standards has complicated the sensor integration process, highlighting the growing need for standards that enable both hardware and software compatibility. The main driver towards more processing power within our instruments is the need to meet increasingly stringent regulations and the move towards unmanned sites and self-monitoring schemes.
So, just what is the primary role of any sensor or sensor-based system? The answer is to acquire information, be it temperature, flow rate, turbidity, machine health, or any of the other parameters we would like to measure. Advances in miniaturisation, standardisation and low-powered electronics are creating a new breed of intelligent sensors in the monitoring industry. Sensors can be made smaller, cheaper and lighter but, most importantly, they provide more reliable and accurate information along with validation of that data or measurement.
These technologies provide exciting new opportunities for monitoring the environment. Traditional solutions involve data loggers from which data is collected periodically in person or via telemetry. Now, real-time data is allowing scientists and technicians to build a detailed picture of entire processes, be they man-made or natural, and to better manage the environment through increased awareness. So, what makes a sensor intelligent and how do we manage that intelligence and gain optimum results? Intelligent sensors possess greater functionality than simply gathering raw data and blindly transmitting it to a centralised controller. They are an extension of traditional sensors, with advanced learning and adaptation capabilities.
With wireless communications and energy drawn from local sources such as solar cells, devices can be deployed without the constraints of having to configure and calibrate, and data can be conveyed only when useful. There still remain problems to be overcome, however. Deploying many more devices in the natural environment brings a number of challenges.
Devices must be robust, reliable and accurate and, where possible, completely maintenance-free. Who will install and maintain these clever little devices? No longer the man or woman with the screwdriver and voltmeter; more likely, a bus analyser and laptop will be the essential tools. Operators and technicians will therefore require new skills and knowledge.
For the system designers charged with integrating devices from multiple vendors, the lack of agreed standards creates expensive headaches on a regular basis. How do we manage increased quantities of data? Significantly, these technologies make it possible to deploy more sensors in order to obtain more data more often – and this richness of data is set to have a powerful impact on environmental monitoring and decision-making in the process industry.
So the question many engineers and operators must ask is: what does the future hold? Is it going to get any easier? Standards are emerging for both software and hardware, but are they emerging fast enough? What are the benefits gained by the designer and manufacturer? The realisation of intelligent devices poses one of the most challenging tasks for designers.
As the need grows for remote monitoring, sensors must be extremely energy-efficient in order to avoid inconvenience due to frequent battery changes or costly power supplies. These sensors will revolutionise the design of sensor systems. It will become easier, cheaper and faster to design a sensor system, and the resulting systems will be more reliable, more scalable and capable of providing a higher performance than traditional systems.
These benefits are gained by embedding computing resources on the sensor itself. The processing of data is performed within each individual sensor, rather than at a central system controller as in most traditional systems. While a sensor in the traditional sense outputs raw data, an intelligent sensor outputs only useful information in a format that can be reliably transferred and stored. Furthermore, they may be dynamically programmed as user requirements change. This will reduce the need for expensive, application-specific sensors, as cheap, programmable, general-purpose sensors will be adequate for multiple applications. The “smartness” and standardisation of the sensor will allow straightforward integration of sensors into any system.
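The idea of processing on the sensor itself, and reporting only useful, validated information rather than a raw stream, can be illustrated with a short sketch. This is a hypothetical example, not any particular vendor's firmware; the class name, the validation range and the windowed averaging are illustrative assumptions.

```python
from statistics import mean

class IntelligentTurbiditySensor:
    """Hypothetical sketch: a sensor that processes raw samples locally
    and reports one validated summary instead of streaming raw data."""

    def __init__(self, valid_range=(0.0, 4000.0)):
        self.valid_range = valid_range  # plausible NTU range for local validation

    def process(self, raw_samples):
        lo, hi = self.valid_range
        # Validate locally: discard out-of-range samples before averaging
        good = [s for s in raw_samples if lo <= s <= hi]
        if not good:
            return {"status": "fault", "value": None}
        return {
            "status": "ok",
            "value": round(mean(good), 2),  # one useful number, not N raw ones
            "rejected": len(raw_samples) - len(good),
        }

sensor = IntelligentTurbiditySensor()
report = sensor.process([12.1, 12.3, 9999.0, 11.9])
```

Here the controller receives a single validated value plus a quality flag; the raw samples, including the obvious glitch, never leave the sensor.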
It is necessary to consider the benefits gained by the user of these intelligent sensors. In traditional systems each sensor has varying gain, offsets, hysteresis and so on, and this must be compensated for elsewhere in the system. Systems that support multiple sensors have been forced to deal with proprietary software designed for each individual sensor. Where such systems exist, the software developer is often faced with trying to merge incompatible software architectures and development languages, or with rewriting software to meet the requirements of their system.
An intelligent sensor can store the physical attributes of the transducer, in the form of TEDS or GSD files and compensate for non-idealities locally in the processor. This allows the replacement of sensors without the need for recalibration. The system must also be re-configurable and perform the necessary interpretation and fusion of data from multiple sensors, together with the validation of local and remotely collected data.
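The principle of storing the transducer's attributes with the sensor and compensating locally can be sketched as follows. Note that a real TEDS is a binary structure defined by IEEE 1451.4; the dictionary fields here are illustrative only, chosen to show why a replacement sensor needs no recalibration at the controller.

```python
# Illustrative only: a real TEDS is a binary structure (IEEE 1451.4);
# these fields simply demonstrate the principle of local compensation.
TEDS = {
    "serial": "PT-0042",   # hypothetical transducer serial number
    "gain": 1.032,         # sensitivity correction from factory calibration
    "offset": -0.15,       # zero offset in engineering units
}

def compensate(raw_value, teds):
    """Correct a raw reading using the transducer's own stored calibration,
    so swapping the sensor never requires recalibrating the controller."""
    return raw_value * teds["gain"] + teds["offset"]

reading = compensate(10.0, TEDS)  # corrected inside the sensor's processor
```

Because the calibration travels with the transducer, the controller sees corrected engineering units from whichever physical sensor happens to be plugged in.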
Communication of data over long wires (or wirelessly) consumes a significant amount of energy. A common approach to reducing energy consumption is to reduce the required communication bandwidth by on-sensor data processing. This requires an intelligent sensor platform with low-powered processors or microcontrollers which can go to sleep when there is nothing new to report.
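One common form of this is report-by-exception with a deadband: the sensor transmits only when the value has moved meaningfully since the last report, and otherwise stays silent so the radio and processor can sleep. A minimal sketch (the class name and deadband value are illustrative assumptions):

```python
class DeadbandReporter:
    """Sketch of report-by-exception: transmit only when the value moves
    more than a deadband from the last transmitted value; otherwise stay
    quiet, letting the radio and processor sleep between reports."""

    def __init__(self, deadband=0.5):
        self.deadband = deadband
        self.last_sent = None

    def maybe_send(self, value):
        if self.last_sent is None or abs(value - self.last_sent) > self.deadband:
            self.last_sent = value
            return value   # worth transmitting
        return None        # nothing new: no radio traffic

r = DeadbandReporter(deadband=0.5)
sent = [v for v in (20.0, 20.1, 20.2, 21.0, 21.1) if r.maybe_send(v) is not None]
# only the first sample and the jump to 21.0 are transmitted
```

Five samples arrive but only two messages leave the sensor, which is exactly the bandwidth (and therefore energy) saving the paragraph above describes.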
When planning and designing a system, an appropriate method of processing, analysing and managing the data before, during and after taking the measurement must be chosen. Setting up an environmental or process monitoring system is only half the battle – how you store and present the data afterwards is the other half. One side-effect of the greater density of incoming data, and the move towards a more rapid turnaround in the processing of data, is that a significant amount of computation is required at various stages of the communications and storage system.
There are two broad categories of data types to store. The first, system data, is information about the sensors, measurement instrumentation and automated test station: the model number and serial number of the instrument, the name and type of sensor, and calibration data. The second category, measurement data, is information about the device or substance measured and the measurement results (for example, temperature, turbidity, frequency, waveform data, event count, signal maximum/minimum, dynamic range, alarm events).
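The two categories map naturally onto two record types. A sketch of how they might be modelled (the field names are illustrative, drawn from the examples above rather than from any standard schema):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SystemData:
    """Information about the measuring equipment itself."""
    model_number: str
    serial_number: str
    sensor_name: str
    sensor_type: str
    calibration: dict = field(default_factory=dict)

@dataclass
class MeasurementData:
    """Information about the device or substance being measured."""
    parameter: str             # e.g. "turbidity"
    value: float
    units: str
    timestamp: str             # ISO 8601 for portability
    alarm: Optional[str] = None

m = MeasurementData("turbidity", 12.4, "NTU", "2023-05-01T09:00:00")
```

Keeping the two apart matters in practice: system data changes only when hardware is swapped or recalibrated, while measurement data arrives continuously and dominates storage volume.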
For distributed systems, the data needs to be in a format which can be sent over the network. Both the measurement system and the server must be consistent in the data format. Today, the standard data format is Extensible Markup Language (XML). Previously, all data formats were proprietary, so applications could not read each other's formats without some conversion application translating from one to another. A standard format such as XML lets you use a selection of third-party tools to format and analyse the data.
A test data management system using XML will take data acquired by the measurement system, convert it to XML, and send it to the centralised server. The server will then convert the data back to its original measurement values and store them in the database. In fact, many commercially available databases today can receive XML data, thereby simplifying the storage of data in the database.
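The round trip described above – native values to XML for transmission, then XML back to native values on the server – can be sketched in a few lines using Python's standard library. The element names are illustrative, not a published schema:

```python
import xml.etree.ElementTree as ET

def to_xml(parameter, value, units):
    """Wrap one measurement in a small XML document for transmission."""
    root = ET.Element("measurement")
    ET.SubElement(root, "parameter").text = parameter
    ET.SubElement(root, "value").text = str(value)
    ET.SubElement(root, "units").text = units
    return ET.tostring(root, encoding="unicode")

def from_xml(xml_text):
    """Convert the XML back to native values on the server side."""
    root = ET.fromstring(xml_text)
    return {
        "parameter": root.findtext("parameter"),
        "value": float(root.findtext("value")),
        "units": root.findtext("units"),
    }

payload = to_xml("turbidity", 12.4, "NTU")
record = from_xml(payload)
```

Because the format is standard, any third-party tool that reads XML can consume `payload` without a bespoke conversion step – which is precisely the interoperability argument made above.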
Currently, most sensors are hard-wired into the systems they are monitoring and controlling, due in part to the lack of appropriate, reliable and cost-effective wireless solutions. Also, many companies interested in going wireless are still not clear about which technology to use. Wired communications protocols, such as Modbus, Profibus or DeviceNet, do an excellent job of integrating sensors into their environments and generally provide high levels of reliability and security.
These networks are appropriate whenever time-critical data and closed-loop control are required. However, cabling and installation costs can run as high as 80% of total system cost, and once a cable is installed it is costly and time-consuming to relocate, even if it needs to move only a few feet. The development of network technologies has prompted sensor designers to consider alternatives which reduce costs and complexity as well as improving reliability. Early sensor networks used simple shielded twisted-pair cabling for each sensor. Later, the industry adopted multi-drop buses (for example, Ethernet).
Now we are starting to see true web-based networks implemented in the process industry and monitoring schemes. Depending on the properties of sensors, geographic coverage, network access capabilities and, more importantly, domain applications, the physical architecture can be very different. In the drive to provide open data-acquisition networks, the emphasis has been on sensor bus, device bus and fieldbus protocols. However, standardisation at that level is not enough. In any network, software is becoming the key component of the system.
The low-level protocol stack, the network operating system, the higher-level operating environments and the applications of the clients and servers are all software. Software forms a much larger part of the system as a whole – therefore, standards established now will spare future generations of intelligent instruments from multi-vendor madness. While generating sensor data is fairly straightforward and well understood, conveying data from a sensor to a monitoring or control system remains a challenge due to the cost and complexity of installing and maintaining communications networks.
There are emerging, yet still inadequate, standards for interfacing location-based services such as sensors into the world wide web. Data fusion techniques are required in order to combine information from multiple sensors and sensor types and to ensure that only the most relevant information is transmitted between stations. Consequently, the load on network bandwidth is kept at an acceptable level. This data must then be converted and presented, or stored in a format that makes it easily available.
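A very simple form of data fusion is combining redundant readings into one robust value and flagging any sensor that disagrees, so that only the fused value and a short status need travel over the network. A sketch, with an illustrative 10% agreement threshold (real fusion schemes are considerably more sophisticated):

```python
from statistics import median

def fuse(readings, agreement=0.1):
    """Combine redundant readings into one robust value (the median)
    and flag any sensor that disagrees by more than the agreement
    fraction, so only the fused value and a status cross the network."""
    fused = median(readings)
    outliers = [
        i for i, r in enumerate(readings)
        if abs(r - fused) > agreement * abs(fused)
    ]
    return fused, outliers

# e.g. four redundant pH probes, one of which is drifting:
value, suspects = fuse([7.1, 7.2, 7.15, 9.8])
```

Four raw readings collapse into one number and one index list – a far smaller payload than the raw stream, with the validation done before transmission rather than after.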
Computer-based control systems can use remote I/O made up from digital and analogue I/O boards from a wide variety of vendors, sent over any of several data communications schemes. In many cases, a large control system spread over a monitoring site or treatment works may be taking data from entirely different types of data acquisition equipment.
Information coming from a testing lab might originate in analysers and instruments which use the IEEE-488 standard. Real-time data from a batch reactor may consist of dozens of 4-20 mA flow, pressure and temperature transmitter signals coming in over a fieldbus network; and information from the control system could be motion-control data from actuators, valves and level switches gathered by a PLC and made available on a PLC communications network.
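Whatever the transport, those 4-20 mA transmitter signals all obey the same linear scaling: 4 mA maps to the bottom of the calibrated span and 20 mA to the top. A sketch of the conversion, with a fault band as an illustrative assumption (conventions for signalling loop faults vary between vendors and between standards such as NAMUR NE 43):

```python
def ma_to_engineering(current_ma, span_lo, span_hi):
    """Convert a 4-20 mA loop current to engineering units.
    4 mA = bottom of span, 20 mA = top of span. Currents outside
    3.8-20.5 mA are treated as a loop fault (one common convention;
    exact limits vary by vendor and standard)."""
    if not 3.8 <= current_ma <= 20.5:
        raise ValueError(f"loop fault: {current_ma} mA")
    return span_lo + (current_ma - 4.0) * (span_hi - span_lo) / 16.0

# A flow transmitter calibrated 0-100 l/s reading 12 mA is at mid-scale:
flow = ma_to_engineering(12.0, 0.0, 100.0)  # 50.0 l/s
```

Note the live-zero: because 0 l/s is 4 mA rather than 0 mA, a broken wire (0 mA) is distinguishable from a genuine zero reading – one reason the 4-20 mA loop has survived so long.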
The realisation of a practical smart sensor system requires the synthesis of several technologies. One must bring together knowledge in the fields of sensors, data processing, distributed systems and networks. While extensive research has been conducted in all of these areas, an intelligent sensor system imposes some new design parameters which must be met. IEEE 1451 provides a standard for network communication among sensors and an "electronic data sheet" which stores the parameters of the physical transducer in memory. While the IEEE 1451 standard is a first step towards truly smart sensors, much work has yet to be done before the full potential of a smart architecture is realised.
It is important not to forget the primary role of any sensor. The data we base our decisions on and store is at best only as good as the original measured value. Different applications require different data networking technology to overcome environmental constraints and other practical limitations, yet agreements must be reached on standard communication methods and the type of data we collect and how we archive and transfer it.
Designing and implementing an appropriate networking technology for a specific application can be challenging; one size does not fit all. However, once an application’s communications requirements are clearly defined and the various attributes of the networking alternatives understood, the most appropriate communication solution is usually easy to identify.
© Faversham House Ltd 2023 edie news articles may be copied or forwarded for individual use only. No other reproduction or distribution is permitted without prior written consent.