The migration to Ethernet
Telemetry is a complicated business. Mike Knights looks at which solutions an engineer should select for a particular challenge, and asks whether there are any golden rules.
Telemetry used to be a simple concept. Remote data was read by simple instruments. Then telecoms group BT wired up a telephone line, and telemetry companies made equipment that interfaced these components together.
A Scada computer archived all the data, displayed the values and trends and printed out reports of events.
The increasing integration of business systems, along with the expansion of communication technologies, internet services and high-performance, battery-powered remote-communication devices, means that telemetry implementations now incorporate many different technologies and systems, all interacting to achieve the overall goals.
The simple sensor device measuring a key parameter may still be individually wired. But, if cabled, it is much more likely that it will be on a local sensor bus. If wireless, it will be in a smart private network of some kind or directly linked by GSM/GPRS to a central server.
And, if linked by GSM/GPRS, there will be some data storage capability at the sensor, and often analytical computations so that the data sent is already in a partially summarised format.
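The at-sensor summarisation described above can be sketched as follows. This is a minimal illustration, not a description of any particular product: the statistics chosen and the `Summary` record are assumptions.

```python
import statistics
from dataclasses import dataclass


@dataclass
class Summary:
    """Compact record sent over the GSM/GPRS link instead of raw data."""
    count: int
    minimum: float
    maximum: float
    mean: float


def summarise(readings: list[float]) -> Summary:
    """Reduce a buffer of raw sensor readings, stored locally at the
    sensor, to summary statistics so only a small record is transmitted."""
    return Summary(
        count=len(readings),
        minimum=min(readings),
        maximum=max(readings),
        mean=statistics.fmean(readings),
    )


# One hour of buffered level readings (metres) - hypothetical values.
buffered = [2.31, 2.35, 2.40, 2.38, 2.33, 2.29]
partial_summary = summarise(buffered)
```

Sending six floats as one four-field record is a modest saving here, but over a day of one-minute samples the reduction in airtime, and hence battery drain, is substantial.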
The growth in the number of options means that engineers must now decide which methodology to employ in gathering this data.
At each level of challenge, there are now privately owned and operated systems. But, increasingly, there are cost-effective public-service options based on GSM/GPRS or other public wireless systems offering alternative solutions.
So what solution does a telemetry engineer select for particular challenges? Are there any golden rules?
Starting at the measurement location, many monitoring points will not have power available, and the cost of running power to the location would be prohibitive. So battery power is increasingly the solution, with solar or wind-powered backup where appropriate.
To keep installation costs to a minimum, signal cabling can be avoided by using either a local wireless network to collect data at a concentrator location, typically within a few hundred metres, or a GSM/GPRS link directly back to a remote central facility.
This kind of system, expanding rapidly in the general machine-to-machine (M2M) marketplace, bypasses all traditional hierarchical telemetry structures and introduces a parallel data path from the sensor to the Scada headquarters, sometimes directly but more often via a third-party data server service.
For monitoring applications, this continues to be attractive. But for critical real-time control issues where security, response times and cost of ownership are key parameters, a private high-performance radio network interfaced to the data concentrator is a far better solution.
The wide area networks used by utilities and other bodies have traditionally carried secure, reliable and time-critical alarm and control data, generally over UHF scanning telemetry systems. These networks are still the backbone of many UK utilities' control systems, and have traditionally been simple, transparent data carriers.
With the advent of Digital Signal Processing (DSP) and digital radios, these can now be fully managed systems that not only carry user data streams from many hundreds of outstations but also carry control and health data about the network components.
With highly distributed equipment involved, these self-diagnostic and management tools significantly increase maintenance efficiency, reduce costs and reduce system downtime.
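As a hypothetical illustration of the network health data such a managed system might carry, an outstation could bundle self-diagnostic readings alongside its identifier. Every field name and threshold here is an assumption for illustration, not taken from any Pacscom product:

```python
import json
import time


def health_report(station_id: str, battery_v: float,
                  rssi_dbm: int, uptime_s: int) -> str:
    """Bundle self-diagnostic values (battery voltage, received radio
    signal strength, uptime) into a compact JSON record that a managed
    radio network could carry alongside the normal user data stream."""
    return json.dumps({
        "station": station_id,
        "battery_v": battery_v,
        "rssi_dbm": rssi_dbm,
        "uptime_s": uptime_s,
        "ts": int(time.time()),
    })


# A maintenance tool can then flag stations with weak batteries or links
# before they fail, rather than after.
record = json.loads(health_report("RTU-017", 11.8, -97, 86400))
needs_visit = record["battery_v"] < 12.0 or record["rssi_dbm"] < -95
```

It is this routine, automatic flow of diagnostics from hundreds of distributed outstations that turns maintenance from reactive call-outs into planned visits.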
Pacscom recognises that utility organisations need to embrace and appropriately deploy emerging technologies such as ZigBee and radio frequency identification (RFID) sensor networks as part of their networking strategy. Starting at the remote locations for data gathering, pressure is growing for ultra-low-cost, wireless-based sensor devices, self-powered (battery or solar), that can be networked together to retrieve data.
The current favourites are based on IEEE 802.11 (WiFi) and IEEE 802.15.4 mesh networks (ZigBee), although 802.11 is not really suitable for battery operation.
ZigBee is not yet a fully approved international standard, as the application layer is not yet completed. Another option is Wavenis, a mesh radio network that is the European equivalent of ZigBee, which uses proprietary protocols.
These are based on deregulated wireless technology for the first 50-250m, with a data concentrator box then interfacing into a longer-range technology. At this concentrator or hub point, it is also likely that Ethernet will increasingly become the norm for linking into higher-level networks.
In fact WiFi systems effectively carry the Ethernet right out to the remote point already.
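A minimal sketch of the concentrator's role follows, assuming a hypothetical central server address, hub identifier and JSON-over-TCP payload (none of which come from the article):

```python
import json
import socket
from typing import Dict


def forward_batch(readings: Dict[str, float],
                  host: str = "scada.example.net",
                  port: int = 9000) -> bytes:
    """Encode a batch of readings collected over the local wireless
    network and forward it over the hub's Ethernet/TCP uplink to a
    central server. Returns the encoded payload either way."""
    payload = json.dumps({"hub": "hub-01", "readings": readings}).encode()
    try:
        with socket.create_connection((host, port), timeout=5) as s:
            s.sendall(payload)
    except OSError:
        pass  # in a real deployment: buffer locally and retry later
    return payload


# Readings gathered from the short-range mesh, keyed by sensor name.
forward_batch({"pressure_bar": 2.3, "level_m": 1.72})
```

The design point is that the concentrator speaks short-range wireless on one side and plain Ethernet/IP on the other, so the choice of long-haul medium (leased line, GPRS, internet) is invisible to the sensors.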
Pacscom's product and deployment strategies concur. But where clients require high-availability, point-to-multipoint applications of a time- or data-critical nature, radio-frequency products using smart modulation techniques and network management are fast emerging as the leaders in the field.
Pacscom has now developed a group of modules, known as OpenNet Systems, with extensive connectivity options: they can all communicate over Ethernet, serial links and various forms of wireless.
In the past it was necessary to be specific about what type of radio, cable or wireless communications had to be used on a site. But with OpenNet modules there is total flexibility to use any type of connectivity. Pacscom supports the protocols that are required to make them talk to other components, so they are powerful modules with which to build systems. Most significantly, OpenNet Systems can be used to upgrade existing communications systems.
The attraction of Pacscom's OpenNet modules to system builders is that they allow customers to configure systems themselves, rather than relying on the equipment manufacturer. The modules use the standard IEC 61131-3 configuration methodology, used extensively throughout industry by PLC manufacturers.
The split between data acquisition and control has a significant bearing on which telemetry technologies are best employed. What is clear is that simple data acquisition makes up in excess of 80% of telemetry data requirements, so it has been the focus of a large proportion of the investment in product development.
What is also clear is the convergence of technologies at several levels. As the processing power of Scada systems and the need for database interchange with business systems have grown, the data-handling techniques employed in advanced telemetry and in IT have become essentially the same. And it is IT departments that are now in the driving seat.
Scada systems are still real-time sensitive in many critical applications, and the constructs that deal with this in Scada architectures are different, with the elements that operate in the real-time environment distributed down close to the real-world interfaces. Likewise, low-cost communications technologies that leverage existing infrastructure (i.e. GPRS and internet products) are becoming firmly established options for telemetry solutions where high security and high availability are not perceived as mission-critical.
M2M for data gathering has grown alongside telemetry users accessing GSM/GPRS networks to the point where, again, there is no real difference. There seems little question that new telemetry and M2M applications, particularly for non-mission-critical data acquisition, will share a common approach and will continue to integrate further, generally under the IT banner.
The hub or concentrator of a remote network will increasingly become a communications hub with Ethernet links, over a selection of media, back to central servers, while the actual data path for monitoring and data-acquisition systems will increasingly be GPRS/3G or internet-based.
Mike Knights is sales and marketing manager at Pacscom. T: 023 8073 7557.