Software and tear
The proliferation of software applications and associated developments which have taken place in recent years have not left the field of environmental management unscathed. Lynda Fryer of AEA Technology Environment's Strategic Consulting Group, and Malcolm West, business manager, Software Solutions at AEA, consider the development and subsequent impact of environmental information technology.
Two developments in particular have driven the growth of environmental information technology:
- the general increase in environmental awareness throughout society, which has been accompanied by a strengthening of environmental legislation;
- the improving cost effectiveness of computer-based solutions.
The Environmental Protection Act 1990 made widespread provisions for improving the control and management of potentially polluting activities and was arguably the most important piece of environmental legislation introduced in the last decade. A major consequence of this Act was the establishment of a system for integrated pollution control, which brought with it the need for operators to demonstrate a greater understanding of their emissions and impacts than ever before. This included the need to justify why further reductions in environmental impact were impracticable.
The increased focus on understanding and justifying emissions has brought with it a greater need to predict, monitor and report releases to the environment. These changes have significantly altered the way in which both regulators and the regulated operate. For example, in the late 1980s and early 1990s most industry sectors had only minimal requirements to monitor and report their impact on the environment; what monitoring was required was often infrequent and against simple threshold targets. The relatively small amounts of data collected in support of these requirements could be managed with a paper-based system, supplemented in some instances by simple spreadsheets. At that time there was also relatively little either demand or opportunity for public scrutiny of the data, although some companies did report headline figures as part of their environmental reports.
However, as the need to demonstrate an understanding of the impact of industrial activities on the wider environment has increased, there has been a corresponding increase in the number and frequency of parameters to be measured. Continuous monitoring and the setting of time-averaged release limits have, in consequence, become more common. In parallel with this general increase in the complexity of monitoring, the effort needed to manage the control and reporting of environmental emissions has grown markedly.
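The shift from simple threshold checks to time-averaged release limits can be illustrated with a short sketch. The pollutant, units, averaging window and limit below are hypothetical, not drawn from any actual permit:

```python
from collections import deque

def rolling_average_exceedances(readings, window, limit):
    """Return the indices at which the rolling mean of the last
    `window` readings exceeds `limit` (all values illustrative)."""
    buf = deque(maxlen=window)  # keeps only the most recent `window` readings
    exceedances = []
    for i, value in enumerate(readings):
        buf.append(value)
        if len(buf) == window and sum(buf) / window > limit:
            exceedances.append(i)
    return exceedances

# Hourly NOx readings (mg/m3, invented) against a 4-hour mean limit of 100:
readings = [90, 95, 110, 120, 130, 80, 70, 60]
print(rolling_average_exceedances(readings, window=4, limit=100.0))  # [3, 4, 5]
```

Note that a single high reading (130) does not by itself breach the limit; it is the averaged window that matters, which is precisely why such limits demand continuous rather than spot monitoring.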
To date, reporting needs have largely been met by increasingly complex spreadsheet and database systems set up to maintain records of both manual readings and automated output from SCADA-type systems. However, in most companies with anything more than the simplest of monitoring requirements, the multiple-spreadsheet approach is feeling the strain. The lack of robust data control and auditability associated with spreadsheets has led to an increasing number of notices from the Environment Agency (EA) and the Scottish Environment Protection Agency (SEPA) for incorrect reporting of emissions.
More recently, improvements in computing power and the standardisation of operating systems have begun to provide environmental management tools more completely integrated into control and management systems. The availability of more robust and professional tools is increasingly leading to the in-house spreadsheet expert being re-equipped, and in some instances replaced, as the cost-effectiveness and quality assurance of modern monitoring and reporting tools becomes clear.
Driven largely by regulatory requirements to improve the design of industrial processes and systems, generic models have increased in accuracy and simplicity of use, and are now available for most types of environmental assessment. The major trend in model development has been the improvement in graphical interfaces, which enable easier model construction and intuitive interpretation of results.
Regional models (global and national) are increasingly used to assist in the development of environmental policy, and site-specific models tailored to particular local conditions are used in some of the more complex situations. In both instances understanding of effects and calibration of the models is assisted by data gleaned from monitoring. Such techniques are widely used throughout the design, commissioning and operation of industrial plant to facilitate determination of the best practicable environmental option and to demonstrate to the regulatory authorities that appropriate prevention and mitigation measures have been adopted. As well as their use in this context, model predictions can also be used directly to influence emissions and it is increasingly common to find control systems using a mix of model and monitoring data.
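One simple way a control system might mix model and monitoring data, offered here purely as an illustration and not as any particular vendor's method, is to correct the model's next prediction by the average bias observed between recent model output and measured values:

```python
def bias_corrected_prediction(model_values, measured_values, model_next):
    """Adjust a model's next prediction using the mean bias between
    recent model output and monitored values (illustrative scheme)."""
    assert model_values and len(model_values) == len(measured_values)
    # Positive bias means the model has been over-predicting recently.
    bias = sum(m - o for m, o in zip(model_values, measured_values)) / len(model_values)
    return model_next - bias

# The model has over-predicted by roughly 5 units on recent records,
# so its next raw prediction of 112 is pulled down accordingly:
print(bias_corrected_prediction([105, 110, 108], [100, 104, 104], model_next=112))  # 107.0
```

The appeal of such feedback schemes is that monitoring data continuously recalibrates the model, so the combined output is better than either source alone.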
With the cost of computing power continuing to fall and regulations tightening, the trends of the last five years look likely to continue, with less human involvement in the monitoring and reporting of environmental impacts and increasingly integrated use of environmental models.
The increasing standardisation of electronic formats, such as Windows and the internet standard of HTML, is moving equipment suppliers away from in-house standards - previously seen as a competitive advantage - towards formats which enable users to combine output from many suppliers. This process of standardisation will gain pace and, in conjunction with the ability to include models in control systems and update these with feedback from process monitoring, will increase the tendency to operate automatic and even remote control and monitoring equipment.
As the monitoring and reporting of environmental performance becomes more routine and automatic, it will become a commodity need and is increasingly likely to be outsourced by organisations.
Full integration of the system for managing environmental impacts with the overall business management system is becoming more widespread. Allied to this the realisation of benefits from aspects such as emissions trading will see the need for environmental software able to record emissions with the same degree of rigour as is found in financial accountancy systems. Over the next few years we will see process software able to operate industrial processes to optimise the economic and environmental performance within current and planned requirements.
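A minimal sketch of what accountancy-style rigour could mean for emissions records, assuming an append-only journal in which corrections are recorded as new entries rather than edits to old ones (the substance names and figures are invented):

```python
import datetime

class EmissionsLedger:
    """Append-only record of emissions, loosely modelled on a financial
    journal: entries are never edited, so the audit trail is preserved."""
    def __init__(self):
        self._entries = []

    def record(self, substance, quantity_kg, note=""):
        self._entries.append({
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "substance": substance,
            "quantity_kg": quantity_kg,
            "note": note,
        })

    def total(self, substance):
        # The reported figure is always derived from the full entry history.
        return sum(e["quantity_kg"] for e in self._entries
                   if e["substance"] == substance)

ledger = EmissionsLedger()
ledger.record("SO2", 12.5)
ledger.record("SO2", -0.5, note="correction to earlier reading")
print(ledger.total("SO2"))  # 12.0
```

Because the erroneous entry survives alongside its correction, an auditor, or a trading counterparty, can reconstruct exactly how any reported total was arrived at.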
With ever-improving dispersion models, increasing density of air quality monitoring, and increasing spatial definition and accuracy of environmental models, it should soon be possible to provide on-line environmental information to sensitive groups such as asthma sufferers via mobile communications equipment. Meanwhile, the ever-reducing cost of GPS technology and associated software products could mark the end of fixed monitoring stations and see train- or bus-mounted monitoring equipment providing real-time pollution maps.
The rise of the internet and remote operations could well see monitoring information pass directly through a company's control and monitoring system to the regulator. It seems a logical extension that certification and approved environmental accounting and reporting schemes will re-define the way in which regulators monitor industry.
The potential for including higher levels of sophistication and building bigger, more complex models has increased dramatically over the last decade, aided by the huge leaps forward in computing power and the development of 3D visualisation tools.
An example of the increased sophistication of models is the explicit representation of heterogeneity. The increased complexity of model generation can be seen in the use of bigger domains with more rock types, giving more realistic models, and in the coupled modelling of groundwater flow with the transport of heat and salinity.
Modelling the complexity of flow in fracture networks has become more tractable as computing power has increased, giving modellers the freedom to make more realistic representations of groundwater systems in three dimensions.
Air quality
Over the last 10 years, many countries have been measuring increasing numbers of pollutants, with greater accuracy and time resolution, and at many more locations, creating large volumes of data.
To be of real use, though, the data needs to be transformed, through analysis and interpretation, into useful information on pollution hotspots, trends, and human and ecosystem exposure. This process of turning raw data into useful, targeted air quality information can now realistically be achieved only by using high-performance information technologies.
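The kind of transformation meant here can be sketched very simply: reducing a stream of raw hourly readings, gaps and all, to the headline figures an end-user actually wants. The pollutant, units and threshold are hypothetical:

```python
def summarise_hourly(readings, threshold):
    """Reduce raw hourly readings to headline air quality information:
    mean, peak, and hours above a (hypothetical) threshold."""
    valid = [r for r in readings if r is not None]  # drop missing data points
    return {
        "mean": round(sum(valid) / len(valid), 1),
        "peak": max(valid),
        "hours_above": sum(1 for r in valid if r > threshold),
    }

# Hourly PM10 concentrations (ug/m3, invented) with one missing value:
hourly = [28, 35, None, 52, 61, 48, 33]
print(summarise_hourly(hourly, threshold=50))
# {'mean': 42.8, 'peak': 61, 'hours_above': 2}
```

A real system would of course do far more, trend analysis, exposure mapping, quality flags, but the principle is the same: the value lies in the derived information, not the raw numbers.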
This information then needs to be disseminated as rapidly as possible, in a form appropriate to the needs of a wide community of public, Government, technical and planning end-users. As recently as 10 years ago, only a handful of government scientists could access information on major air pollution episodes that posed a real threat to human health. Today in the UK, however, millions are empowered through rapid and comprehensive access to air quality information. All that you now need is a phone, radio, TV or computer.
Incident response information systems
The changes in the software used for emergency response happened primarily for two reasons: rapid development of computers and software; and growth in legislation.
Programmers were increasingly able to write software for different applications, e.g. environmental assessment or information retrieval, using the same database structure - allowing them to develop programs relating to new regulations without significant re-writing of code. This, in turn, enabled programs to be developed that take the risk out of cross-referencing different regulations for the same chemical. It also became possible to produce 'expert systems' that used tools such as decision trees to clarify complicated regulatory processes. In addition, the 'standardisation' of PCs with the use of Windows meant users did not have to completely relearn how to navigate these new applications.
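A decision tree of the sort such 'expert systems' use can be sketched in a few lines. The branching rules and chemical properties below are invented for illustration and are not taken from any real regulation:

```python
def spill_response(chemical):
    """Tiny decision-tree sketch for an incident-response 'expert system'.
    Each branch narrows down the advice from a chemical's properties
    (all rules here are hypothetical)."""
    if chemical["flammable"]:
        if chemical["water_reactive"]:
            return "isolate area; do NOT apply water"
        return "isolate area; suppress ignition sources"
    if chemical["water_soluble"]:
        return "contain run-off to prevent entry to watercourses"
    return "contain with inert absorbent and recover"

print(spill_response({"flammable": True, "water_reactive": False,
                      "water_soluble": False}))
# isolate area; suppress ignition sources
```

The value to a responder is that the tree encodes the cross-referencing once, correctly, instead of leaving it to be repeated under pressure at every incident.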
Computer processing speed and improved storage capacity allowed more information on chemicals to be accessed faster. Incident responders were able to find substance details sooner and so decrease the time taken to handle problems, thus minimising environmental impact.
Emissions trading
Before a robust emissions trading market can develop worldwide and real money can change hands for emission permits, substantial changes will be required in the way organisations monitor and report their environmental performance. The party selling its permits will need to predict accurately its environmental impact for the period and then demonstrate that this has been achieved. This will require an accurate integrated environmental and process model that can be adjusted to reflect operational changes during the period.
The party buying the permits will need to look even further ahead than one year when comparing the cost of remediation activities against the ongoing purchase of permits; this too will require accurate and integrated environmental management tools. Forward-looking process and environmental modellers are already working on the complex products that will be required to enable the emissions trading mechanism to have a real impact on improving our environment.
Compliance & EMS
Until relatively recently the reporting, let alone controlling, of emissions from most industrial processes was very much a low-key affair. However, this situation changed markedly during the 1990s in the wake of toughened regulations, increased regulatory powers, and the greater interest in environmental matters displayed by the public and pressure groups. Happily, these changes occurred during a revolution in the power and flexibility of computer systems. Whereas early reporting was largely restricted to paper-based records and simple spreadsheets, the current generation of environmental software is able to handle the collection, collation and reporting of very many different emissions.
Systems are now able automatically to capture data from instruments and monitoring points scattered around industrial sites and to transcribe these data into standard reporting formats. This trend started with software that could only interface with instruments provided by the same manufacturer. Newer systems use open architecture enabling data extraction from any proprietary data logger or auto extraction from e-mail accounts. The next step of allowing reports to be transmitted to both regulators and publicly available information systems is no longer a pipe dream.
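The open-architecture idea amounts to mapping each supplier's proprietary output onto one standard reporting schema. A minimal sketch, in which the supplier's column headings and the standard field names are both invented:

```python
import csv
import io

def normalise_logger_output(raw_csv, field_map):
    """Map one supplier's CSV column names onto a standard reporting
    schema. `field_map` gives {standard_name: supplier_column};
    all names here are hypothetical."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [{std: row[src] for std, src in field_map.items()}
            for row in reader]

# A supplier-specific export with its own column headings:
raw = "TS,CHAN_1\n2003-01-01T00:00,41.2\n2003-01-01T01:00,39.8\n"
records = normalise_logger_output(raw, {"timestamp": "TS", "no2_ugm3": "CHAN_1"})
print(records[0])
# {'timestamp': '2003-01-01T00:00', 'no2_ugm3': '41.2'}
```

Once every logger's output has been normalised like this, producing a standard report, or transmitting it onward to a regulator, no longer depends on which manufacturer supplied the instrument.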