Following the adoption of the EC Urban Waste Water Treatment Directive of 21 May 1991, the UK UWWTD (England & Wales) Regulations 1994 place new demands on both industry in general and, more specifically, those involved in water and waste treatment. There is, then, some urgency in the drive to establish more confidence in, and comparability between, measurements. As Terry Long of the Environment Agency said at a recent SWIG Workshop: "Self monitoring is very different from what went before, and now the Agency will want to assess the effectiveness of the process generating the discharge. There is a complicated set of rules to be addressed."

One way to build that confidence is to establish common practice and procedures through accepted standards. The Environment Agency has established a Monitoring Certification Scheme, MCERTS, to improve the quality of environmental monitoring data. MCERTS was initially introduced for Continuous Emission Monitoring Systems for chimney stacks, to provide regulators and industry with improved data quality on releases from industrial processes. WRc has a project underway, jointly funded by the Environment Agency and GAMBICA, to expand the coverage of MCERTS to include water monitoring instruments.

The basis of MCERTS is to provide a framework for formal product certification of monitoring instruments. Product certification is based on laboratory and field testing, using International or European Standards where possible. In the area of continuous water monitors (CWMs) there are few standards which adequately define performance standards and conformance tests, with the exception of certain flowmeters. The intention, therefore, is to develop the MCERTS performance standards and conformance tests to include CWMs.

Extensive work has been carried out on instrument standards in the past, and work is ongoing at both European and international levels, notably by PISEG, CENELEC, the Agency, SCA, ETACS and ISO. Some individual water companies have produced instrument specifications, and others are known to have similar work in progress.

Performance standards and conformance tests need to be designed to demonstrate that the instrument is "fit for purpose", and due care should be taken to ensure that the performance standards are based on the purpose of the measuring system, not necessarily on what is achievable.

Lack of confidence

Some of the lack of confidence in the quality of sensor data arises from the differing methods and cultures of laboratory determinations and of measurements made on the process or in the environment. Operational standards must incorporate a thorough understanding of the analytical standards that are defined in regulation, and of the potential differences between the analytical methods used in the laboratory and those used by the CWM. This becomes particularly important when it is necessary to define conformance tests based on comparisons with a reference analytical method, as opposed to the use of a certified reference material (CRM).
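Where conformance is judged against a reference method rather than a CRM, the test in practice reduces to a paired comparison of monitor and laboratory results. The Python sketch below illustrates the idea; the paired values and the acceptance limits are invented for illustration and are not drawn from MCERTS or any published standard.

```python
"""Hypothetical sketch of a conformance check for a continuous water
monitor (CWM) against a reference laboratory method. All numbers are
illustrative assumptions, not MCERTS values."""

from statistics import mean, stdev

# Paired results (mg/l): CWM reading and reference laboratory result
# for the same sample. Values are invented for illustration.
paired_results = [
    (24.1, 23.5),
    (30.2, 29.8),
    (18.7, 19.4),
    (45.0, 44.1),
    (12.3, 12.9),
]

# Difference between monitor and reference for each pair.
diffs = [cwm - ref for cwm, ref in paired_results]

bias = mean(diffs)     # systematic offset of the monitor
spread = stdev(diffs)  # scatter of the differences

# Illustrative acceptance limits; a real scheme would set these from
# the purpose of the measurement, not from what is achievable.
MAX_BIAS = 1.0    # mg/l, assumed
MAX_SPREAD = 1.5  # mg/l, assumed

conforms = abs(bias) <= MAX_BIAS and spread <= MAX_SPREAD
print(f"bias={bias:+.2f} mg/l, spread={spread:.2f} mg/l, conforms={conforms}")
```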

The quality of, and confidence in, the data produced by on-line monitors also depends in part on the cost of ownership of the instrument. So whilst the cost and ease of operating, calibrating and validating instruments is not strictly the domain of a regulatory standard, these factors are important in obtaining good quality data over the long term and so may need to be addressed within the performance standards.

Disparate data

A wide variety of tools is used to generate process and environmental data, and there is considerable business advantage in developing methods of combining the data from different sources, places, operators and times in a way which provides increased understanding and confidence.

At a recent SWIG Workshop, Dr Martin Lloyd of Farside Technology emphasised that the title of his presentation, Combining Disparate Data, was important: the emphasis of his approach was on "combining", not the currently fashionable "fusion". A great deal of work has been undertaken at Oxford University on the SEVA validity codes, which describe the various aspects of a measurement and its robustness, but there is further work to be done in extending these definitions. Using extended versions of SEVA, it has been shown that any measurement data, regardless of source, uncertainty, time and spatial considerations, can be fitted into a single database. Visualisation tools can then be made available to cluster data for a particular site or investigation, and further tools can make it easier to see whether the data set has deficiencies and where extra data might improve the information gained.
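As a rough illustration of that "single database" idea, the Python sketch below shows one way a measurement record might carry SEVA-style validity metadata so that readings from an on-line monitor and a laboratory determination can sit side by side. The field names and status values are assumptions based on published SEVA concepts (validated value, validated uncertainty, measurement status), not the actual Oxford or Farside schema.

```python
"""Illustrative sketch: disparate measurements sharing one record
structure via SEVA-style validity metadata. Field names and status
values are assumptions, not an actual SEVA implementation."""

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Measurement:
    source: str          # instrument or laboratory identifier
    quantity: str        # what was measured, e.g. "BOD"
    value: float         # validated measurement value
    uncertainty: float   # validated uncertainty, same units as value
    status: str          # SEVA-style status, e.g. "CLEAR" or "BLURRED"
    timestamp: datetime  # when the measurement applies
    location: str        # where it was taken

# An on-line monitor reading and a laboratory result can share the
# same store because each record carries its own validity metadata.
records = [
    Measurement("cwm-07", "BOD", 22.4, 1.8, "CLEAR",
                datetime(1999, 3, 1, 9, 0, tzinfo=timezone.utc), "outfall A"),
    Measurement("lab-42", "BOD", 21.9, 0.6, "CLEAR",
                datetime(1999, 3, 1, 9, 5, tzinfo=timezone.utc), "outfall A"),
]

# Clustering the data for one site or investigation is then a filter.
site_a = [m for m in records if m.location == "outfall A"]
for m in site_a:
    print(f"{m.source}: {m.value} ± {m.uncertainty} ({m.status})")
```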

To gain confidence that an industrial or environmental process is properly understood, monitored and controlled, we need benchmarks or standards, and we need to combine data from different sources so as to increase our knowledge.
