Keeping in touch

Mike Knights of Pacscom looks at the changes in telemetry that are on the way and which should be of benefit to the water industry

Telemetry used to be a simple concept 20 or so years ago. Remote data was read by simple instruments, BT wired up a telephone line, and telemetry companies made equipment that interfaced these components together, with a top-end Scada computer that archived all the data, displayed values and trends, and printed out reports of events.

The concept is, perhaps, still simple. But the increasing integration of business systems plus the explosion of communication technologies, both private and public service, internet services and the availability of high-performance, battery-powered, remote-communication devices means that the implementation now incorporates many technologies and systems interacting to achieve the overall goals. In fact, telemetry more than ever is just a part of powerful business information systems.

Take, for example, the simple sensor devices measuring key parameters. These may still be individually wired but, if they are cabled, it is much more likely they will be on a local sensor bus. And, if wireless, they will be in a smart private network of some kind or directly linked by GSM/GPRS to a central server. (GSM stands for Global System for Mobile Communications and GPRS is General Packet Radio Service.) If linked by GSM/GPRS, it is highly likely there will be some data storage capability at the sensor, and often analytical computations, so that the data sent are already in a partially summarised format. Furthermore, not only process data is sent, but also health information and set-up parameters of the device itself.
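As an illustrative sketch of that idea, a GSM/GPRS-connected sensor might batch its buffered readings into a summarised record, bundled with device health and set-up parameters, before transmission. The field names and payload layout below are invented for illustration, not any vendor's actual format:

```python
import json
import statistics

def build_payload(device_id, samples, battery_v, config):
    """Summarise buffered readings so only a compact record travels
    over the GPRS link, alongside device health and set-up data."""
    summary = {
        "min": min(samples),
        "max": max(samples),
        "mean": round(statistics.mean(samples), 2),
        "count": len(samples),
    }
    return json.dumps({
        "device": device_id,
        "data": summary,                  # partially summarised process data
        "health": {"battery_v": battery_v},  # health information
        "config": config,                 # set-up parameters of the device
    })

payload = build_payload("level-07", [1.9, 2.1, 2.0, 2.4], 3.6,
                        {"interval_s": 900})
print(payload)
```

Four raw samples collapse into one summary record, which is the point: the communication system carries far less than the sensor measured.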

So the key elements of modern telemetry systems now include:

  • Multiple communications media
  • Highly intelligent data concentrators and local media managers
  • Data manipulation often at the remotest level possible to reduce the amount of data that the communication system has to carry
  • System-wide diagnostic information for all component units
  • Power-efficient products wherever possible
  • Battery or self-powered monitoring devices for wireless data gathering

Technical and commercial driving forces

The driving force behind this expansion of telemetry facilities and techniques has generally been legislation and, of course, cost reduction. Pressures for better energy efficiency, tighter controls on product quality, environmental constraints on how processes are run and, of course, billing customers as quickly and accurately as possible all argue in favour of better intelligence. And this means more and better source data. Using manpower for these monitoring activities has long been uneconomic and error-prone, and telemetry methods have proved both more accurate and cheaper.

The question of the value of any piece of data runs through the whole telemetry industry. Data acquisition and monitoring systems are clearly important but generally not mission critical. Remote automatic control systems on plant or moving machinery clearly are, to a greater or lesser extent, depending on the plant or machinery's function. So the methods employed in any telemetry application must reflect the needs, the value of the data, and its security and resilience.

This data value split has a significant bearing on which of the ever-changing technologies in telemetry will be best employed. What is clear is that simple data acquisition, which is neither time- nor mission-critical, actually makes up in excess of 80% of telemetry data requirements, and so has been the focus of a large proportion of the investment in product developments, although much of this development has been justified by other markets such as the mobile phone or home/building management sectors.

What is also clear is the convergence of technologies at several levels. For many years Scada systems were strictly the responsibility of Scada engineers, in both design and operations. IT departments looked after management information systems (MIS) and the two did not overlap. However, as the processing power of Scada and the need for database interchange with business systems grew, the interface became fuzzy, and now the techniques employed in business data handling in advanced telemetry systems and in IT are essentially the same.

Scada systems are still real-time sensitive in many critical applications, and the constructs which deal with this in Scada architectures are different, with the elements that operate in the real-time environment distributed down close to the real-world interfaces. Likewise, M2M products (machine-to-machine: low-cost communications technologies that leverage existing infrastructure such as GPRS and the internet) are becoming firmly established options for telemetry solutions where high security and high availability are not perceived as mission critical. M2M for data gathering has grown alongside telemetry users accessing GSM/GPRS networks to the point where, again, there is no real difference. There seems little question that new telemetry and M2M applications, particularly for non-mission-critical data acquisition, will be common and will continue to integrate further, generally under the IT banner.

New concepts

Starting at the remote locations for data gathering, enormous pressure is growing for ultra-low-cost, wireless-based sensor devices, self-powered (battery or solar), which are networkable together to retrieve data. The Bluetooth project promised to address this market but, for some users, technically never quite hit the jackpot. The current favourites are based on IEEE 802.11 (WiFi) and IEEE 802.15.4 (the evolving standard underlying ZigBee mesh networks), although 802.11 is not really suitable for battery operation. These use deregulated wireless spectrum for the first 50-250m, where a data concentrator box then interfaces into a longer-range technology.

At this concentrator point, it is also likely that Ethernet technology will increasingly become the norm for linking into higher-level networks. (In fact, WiFi systems effectively carry Ethernet right out to the remote point already.) Cabled systems (including fibre) still exist, of course, but the high cost of installing cabling, particularly when the installation may well be a retrofit to an existing site, makes this a less attractive option than wireless, especially as wireless speeds have increased significantly.


The hub or concentrator of a remote network has several duties. It is first and foremost a communications media interface linking the remote short-range system into a long-range, possibly global, system. Use of ADSL telephone lines or GPRS/3G wireless communications from the hub means internet-style connectivity is feasible. Data speeds and the need for local data manipulation suggest hub devices may well be intelligent, offering simple configuration opportunities to set up the communications as well as basic data storage, archiving and often data filtering. What is certain is that Ethernet-style communications will increasingly become the norm for these concentrators – so they will communicate both with their peers on a site/plant and upwards towards central processing facilities or a Scada centre.
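The storage-and-filtering duty can be sketched as follows. This is a minimal, assumed model of a hub that archives every reading locally but forwards only significant changes upstream (report by exception, the approach event-driven telemetry protocols take); the class name and deadband value are illustrative, not any product's design:

```python
class Concentrator:
    """Toy hub: archive all readings locally, forward upstream only
    when a value moves by more than a configured deadband."""

    def __init__(self, deadband=0.5):
        self.deadband = deadband
        self.archive = []      # local data storage/archiving
        self.last_sent = {}    # last value forwarded, per sensor

    def ingest(self, sensor_id, value):
        self.archive.append((sensor_id, value))
        last = self.last_sent.get(sensor_id)
        # Forward only if the change exceeds the deadband,
        # reducing traffic on the long-range link.
        if last is None or abs(value - last) >= self.deadband:
            self.last_sent[sensor_id] = value
            return {"sensor": sensor_id, "value": value}
        return None            # suppressed: change too small to report

hub = Concentrator(deadband=0.5)
sent = [hub.ingest("p1", v) for v in (10.0, 10.2, 10.7, 10.8)]
print([m for m in sent if m])  # only 10.0 and 10.7 are forwarded
```

Four readings arrive, all four are archived at the hub, but only two cross the long-range link – the data-reduction role the text describes.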

One particular example of a hub-based system is an RFID reader with remote tags. To all intents and purposes, an RFID tag is a sensor supplying data over a network via a hub/concentrator to a central processing facility. Another is an AMR network, with remote meters all reporting data to their local hub for onward transmission over an appropriate medium. A third is intelligent devices (RTUs and PLCs): one particular implementation of a hub/concentrator is the telemetry RTU, or a PLC used in this mode. These devices now all support local peer-to-peer communications on a high-speed network, often proprietary but increasingly based on IP and, in the water industry, on the WITS protocol built on the DNP3 de facto standard.

Centralisation of data

Links from hubs to central data servers follow one of two conceptual routes. Both are suitable for data-gathering activities but have different features and benefits. Control and mission-critical systems are generally managed over the direct-style network:

  • Direct to the user's server, where data processing is done by the user's own server facilities
  • Over a public network into a third-party server, from where the assimilated data is passed to the user's server in an agreed file format

Direct to user server

The wide area networks used by utilities and other bodies for control have traditionally carried secure and reliable time-critical alarm and control data, generally employing UHF scanning telemetry systems. These networks are still the backbone of many UK utilities' control systems, and have traditionally been simple, transparent data carriers. They offer fixed scan rates, guaranteed response times and a high level of security of data in real time, which third-party systems cannot do cost effectively.

With the advent of DSP and digital radios, these can now be fully managed systems, which not only carry user data streams from many hundreds of outstations but also carry control and health data about the network components. With highly distributed equipment involved, these self-diagnostic and management tools significantly increase maintenance efficiency, reduce costs and reduce system downtime. Other technologies such as satellite and meteor burst offer wireless communications and have a particular role to play. These perform in a similar manner to the advanced radio communications although often at considerably different data rates.

Via third-party servers – GPRS technology

GPRS as a connectivity option has been available to the UK market for some four years. In the early years, the option was only really appropriate as a pilot to prove the technology, lacking the coverage and reliability for successful and sustained roll-out to the marketplace.

GPRS is an IP-based technology, working in the same way as the internet – the user establishes a physical connection then, using the point-to-point protocol (PPP), is authenticated and ascribed a dynamic IP address by the network, enabling interaction with other IP connections to the network.

However, GPRS is a private IP network connected to the public internet through many firewalled interconnects. This is problematic for the integrator and implementer of a GPRS solution, as the network's inherent security hides the device's IP address from the outside. Consequently, the remote unit is able to send and receive data using standard network protocols such as FTP, HTTP and Telnet, but the back office is unable to send control data back to the device, causing incompatibility with existing back-office systems.

The market has responded to this limitation in four ways:

  • Mobile virtual network operators (MVNOs) add services over and above the standard GPRS service. Using a virtual access point name (APN), a Radius service identifies the device and assigns the unit the same IP address each time it is authenticated on the network, in effect providing a dynamically assigned static IP solution
  • Virtual private network (VPN), where the hardware device incorporates a VPN chipset and the user addresses each device through its own virtual private network
  • Domain name service (DNS) – some newer services are coming on stream where the service provider tracks the dynamic IP address assigned to the originating device and allows the user to address it through a virtual title (a telephone number or a device name) rather than an IP address, in the same way that an internet user addresses a website by its domain name rather than its numeric IP address
  • Jabber/XMPP (open instant-messaging protocol) implementations that enable transfer of information without the need to resolve IP addresses
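Underlying all four workarounds is the same device-initiated pattern: the device can always reach the server, the server can never reach the device, so back-office commands queue until the device's next connection. A toy model makes the flow concrete – the class, method and command names here are invented for illustration:

```python
class ThirdPartyServer:
    """Stand-in for a GPRS data service sitting between the back
    office and an unreachable remote device."""

    def __init__(self):
        self.data = []       # readings uploaded by the device
        self.pending = []    # commands waiting for the device

    def queue_command(self, cmd):
        # The back office can call this at any time; the command
        # cannot be pushed to the device, only held for it.
        self.pending.append(cmd)

    def device_push(self, reading):
        # Device-initiated exchange: upload data, collect any
        # commands queued since the last connection.
        self.data.append(reading)
        cmds, self.pending = self.pending, []
        return cmds

server = ThirdPartyServer()
server.queue_command("set_interval 600")   # back office acts first...
print(server.device_push(2.4))             # ...device collects it later
```

The command only reaches the device because the device called in – exactly the constraint the firewalled GPRS network imposes.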

It is clear that GPRS and 3G technologies are here to stay within device network strategies. As GPRS is wireless, the cost of implementation is relatively low, and its IP base means that it is well understood by IT departments within both enterprise and public organisations. Of course, there are also bridging technologies between IP and private wireless solutions that may reduce the complexity of the implementation or allow a hybrid-style solution.

Changing concepts

The key factor overriding the changing concepts in telemetry is the choice now presented to designers/engineers/users in their management systems. There is no right solution for all applications or all types of users. Instead, the best solution is a careful mix of a whole variety of different techniques but some common ground is emerging. Commercial issues, some obvious and others less immediately clear, play a large part. The obvious issues are the capital purchase option for a private system versus the public system with revenue costs. But there are also hidden management and maintenance resource costs, which again need to be weighed against guarantee of service.

Security of communication must be a factor on mission-critical systems, particularly if the third-party medium being considered is open to overloading at times of natural or terrorist-initiated disaster.

So what solutions does a telemetry engineer select for particular challenges, and are there any golden rules given the plethora of communication options available? Starting at the measurement location, it is likely many monitoring points will not have power available, and the cost of running power to the location would be prohibitive. So battery power is increasingly the solution, with solar- or wind-powered back-up where appropriate.

To maintain minimum installation costs, signal cabling can be avoided by use of either a local wireless network to collect data to a concentrator location typically within a few hundred metres or even back directly to some remote central facility by GSM/GPRS network. This kind of system, expanding rapidly in the general M2M marketplace, bypasses all traditional hierarchical telemetry structures and introduces a parallel data path from the sensor to the Scada HQ sometimes directly but more often via a third-party data-server service.

For monitoring applications, this looks likely to continue to be attractive except for critical real-time control issues where security, response times and cost of ownership are key parameters. A private, high-performance radio network interfaced to the data concentrator is a far better solution for these time-critical applications.

In addition to the established concept of transparent networks, modern digital DSP-based radio systems can provide an advanced solution offering multiple services on one infrastructure. This IP network, offering Ethernet over radio, allows the co-existence of multiple services and users on a single, shared infrastructure, handling serial and IP traffic, high-speed and low-speed links, licensed and unlicensed radio bands, and private and public infrastructures.

These private systems provide secure owner-managed long-range solutions that enable communications networks to provide real-time monitoring and control for any Scada need, including water and wastewater applications.

So, projecting the paths of current trends forward, the concepts for telemetry into the future seem to be crystallising. Monitoring systems will continue to expand using smaller, cheaper sensors increasingly based on wireless networked technology and low-power operation facilitating battery or solar power.

Concentrators will increasingly become communications hubs with Ethernet links over a selection of media back to central servers, where the actual data path for monitoring and data-acquisition systems will increasingly be GPRS/3G or internet based.

But there will continue to be mission-critical real-time applications where the value of the data is too high to entrust it to shared public systems. And these will continue to function over high-performance privately owned and managed wireless networks, again increasingly using Ethernet as the basis for the communications.
