Modelling on a record scale
MWH's Richard Norreys and Steve Kenney, and UU's Graham Squibbs and Luke Pearson look at applying urban pollution management methodology to a large CSO study programme

The large number of outputs and tight deadlines required by the United Utilities (UU) AMP3 unsatisfactory intermittent discharge (UID) programme have been both a technical and a managerial challenge.
Now, with the completion of the majority of the studies, this article describes the general approach, methodologies and the tools used to identify problems, develop solutions and maintain product quality for different types of water quality studies. A number of case studies will be presented to illustrate how these methods and tools have been applied.
The UU AMP3 UID programme is the largest programme of work in the UK to identify and address pollution of surface waters by urban wastewater discharges in wet weather. The 914 UIDs were initially identified for assessment and grouped into 77 study areas. UU undertook water quality analysis-based urban pollution management (UPM) studies in 38 of the study areas in the programme.
The UPM studies demanded integrated modelling of all elements of the wastewater networks and the watercourses the networks discharge to.
The studies required the building and verification of sewer models for flow and quality, the constructing and calibration of river quality models, the development of design rainfall series and the production of WwTW models.
To feed this modelling effort, vast amounts of data had to be collected. Table 1 illustrates the scale of the modelling and data collection exercise. A single UPM study offers a big technical challenge, but 38 of them were a daunting prospect.

Data collection
The prospect became even more daunting when the constraints of time, equipment availability and staff resources were factored in.
The only way to complete the studies was to develop and apply sound technical approaches, exercise tight management, practice good quality control and exploit software and IT system capabilities.
Data collection was the most expensive element of the work, and the collection of wet weather data was the most critical element of the data collection programme. To reduce the risk of missing events or sampling non-events, wet weather data collection was controlled by a team member monitoring the weather. This individual was responsible for mobilising data collection contractors, notifying laboratories of sample arrival times, and triggering events or demobilising the data collection teams. The weather did not play into our hands - on average, a planned eight-week survey extended to 16 weeks, and a planned 16-week water quality survey extended to 36 weeks. Time-series flow and quality data were collected in sewers, rivers and WwTWs.
An important lesson from this is that the quality of the data has to be closely checked as soon as it is collected, because environmental changes can affect sites, data can be mixed up in the laboratories and devices can measure the wrong parameter. On a big programme of work it is all too easy to collect and archive the data until it is needed for modelling - by which time it is too late to correct some mistakes, because the survey equipment will have been removed.

The studies were classified into five types, each subject to a different modelling approach based on the size of the catchments, their characteristics and the significance of the study.
The simpler the approach the more conservative the modelling, so there was a trade-off between the cost of undertaking the study and the conservatism of the impact assessment. Table 2 details the different levels of study.
Acceptable approach
The second edition of the UPM manual provides a planning framework for these studies; however, many of the details of methodology and reporting had to be agreed between the study participants - UU, the Environment Agency (EA) and MWH. The strict time constraints dictated that, once an acceptable approach had been agreed between the stakeholders, it was adhered to for all subsequent studies. Full network detail was included in the sewer models, and models were run to produce results at a five-minute resolution. This enabled the methods for testing compliance to be refined:
- all events from a ten-year rainfall series (typically 1,100-1,500 events) were run to assess compliance,
- sewer models were run with a five or seven-day antecedent dry period to build up sediment loads in the network,
- large surface water systems were included in the modelling,
- high percentile standards were assessed based on an hour-by-hour breakdown of CSO spills from the ten-year runs,
- high percentile standards were adjusted to account for dry weather failures,
- fundamental intermittent standards (FIS) were assessed based on the worst hour/six hours of spill from an event,
- default network modelling used CIRIA recommended default water quality profiles and was verified against observed crude sewage quality at the receiving works,
- a look-up table of default river quality parameters was produced to aid the development of simple river models,
- once storage solutions were developed, they were built into the sewer models, the ten-year rainfall series was run through the models and compliance was tested once more.
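As an illustration only (this is not the project's actual code, and the function names and exceedance-counting rule are assumptions), the high-percentile and worst-window checks described above can be sketched as follows:

```python
def passes_percentile_standard(hourly_conc, limit, percentile=99.0,
                               dry_weather_failures=0):
    # Allowed exceedance hours over the record: (100 - percentile)% of
    # all hours, less any hours already 'used up' by dry weather
    # failures (the studies adjusted the standards to account for these).
    total_hours = len(hourly_conc)
    allowed = int(total_hours * (100.0 - percentile) / 100.0)
    allowed = max(allowed - dry_weather_failures, 0)
    exceedances = sum(1 for c in hourly_conc if c > limit)
    return exceedances <= allowed

def worst_window_mean(hourly_conc, window=6):
    # Highest rolling mean over `window` hours - the 'worst six hours'
    # figure compared against an FIS concentration (for ammonia; for
    # DO the relevant figure would be the lowest window).
    return max(sum(hourly_conc[i:i + window]) / window
               for i in range(len(hourly_conc) - window + 1))
```

In the studies these checks would be applied to the hour-by-hour river concentration predictions drawn from the ten-year runs.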
For most of the studies, ten-year design rainfall series were produced in Stormpac 2 and verified against observed data, although hourly observed data was used for two rainfall zones. The north-west was divided into 34 zones of similar rainfall, and a series was developed for each zone.
Software was used to add API30 values and tidy up the series. Sewer network flow and quality modelling was carried out using dynamic network models in either HydroWorks or InfoWorks; simplified models were not used. WwTW hydraulics and stormtank spill quality were included in the sewer network model.
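An antecedent precipitation index such as API30 is conventionally an exponentially decayed sum of the preceding days' rainfall. The sketch below shows the usual form; the decay constant k=0.9 and the function name are illustrative assumptions, not the values used by the project's software.

```python
def api_series(daily_rain_mm, k=0.9, days=30):
    # For each day i, sum the previous `days` days of rainfall, each
    # decayed by k per day of separation (the current day is excluded,
    # as the index describes antecedent conditions).
    out = []
    for i in range(len(daily_rain_mm)):
        api = sum(daily_rain_mm[j] * k ** (i - j)
                  for j in range(max(0, i - days), i))
        out.append(api)
    return out
```

A wet spell followed by dry days therefore produces a smoothly decaying index, which is what the sediment build-up assumptions in the sewer quality modelling respond to.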
The sewer model provided final effluent flow predictions, and final effluent quality was simulated using a probability distribution. River quality impact models (RQIMs) were constructed and calibrated in Mike 11 and then used to calibrate simplified river models (SRMs) that incorporated the algorithms in SimpolV2.

Analysis tools
The SRMs were built in an in-house MWH tool, Simon. Figure 1 shows the impact analysis tools. The MWH program Stanks was used to process CSO spill flow and load time-series predictions from the HydroWorks/InfoWorks models.
The program produces tables of data summarising the spills from CSOs and formats the data for input to Simon.
Stanks' further role was to simulate the effect of off-line storage or altered pass-forward flows as part of solution development. Stanks also allows the user to impose BOD or ammonia qualities on a set of hydraulic predictions, allowing a surface water system or WwTW final effluent quality to be simulated.
The MWH programs Simon and Bounder were initially used for the impact analysis. As the project developed, Stanks, Simon and Bounder were combined into one piece of software - the River Impact Optimisation Tool (RIOT). RIOT was developed to enable multiple off-line storage scenarios to be assessed, and an optimiser routine was included to identify optimal storage configurations based on least cost or overall storage volume. The approach of using full network models resulted in very complicated analyses.
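The article does not describe RIOT's optimiser routine in detail; as a hedged sketch of the idea, the search over storage configurations can be framed as enumerating discrete volume options per CSO and keeping the cheapest combination the impact model accepts. The interface below (`cost_fn`, `compliant_fn`) is hypothetical - in RIOT the compliance check would be a full river-impact simulation, not a simple callback.

```python
from itertools import product

def optimise_storage(cso_options, cost_fn, compliant_fn):
    # cso_options: one list of candidate storage volumes per CSO.
    # Exhaustively enumerate combinations, discard non-compliant ones,
    # and return the cheapest compliant configuration.
    best, best_cost = None, float("inf")
    for combo in product(*cso_options):
        if not compliant_fn(combo):
            continue
        cost = cost_fn(combo)
        if cost < best_cost:
            best, best_cost = combo, cost
    return best, best_cost
```

With 180 CSOs an exhaustive search would be infeasible, which is presumably why a dedicated optimiser routine was worth building; the sketch only conveys the least-cost/least-volume objective.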
In one study, RIOT co-ordinated inputs from about 180 CSOs, drawn from a total of 3Gb of results files (1,172 events, five results files per event = 5,860 files per network) from ten separate network models. Once this complicated scenario was set up, new results could be inserted and the same basic set-up used for needs assessment, solutions development and final impact assessment using the final solutions models.
RIOT turned out to be an essential tool in the programme - the automated procedures allowed many simulations to be carried out with minimal modeller input.

At the beginning of the studies, a library of MapInfo GIS data was compiled at a central server location. This contained all the information pertaining to the studies from UU asset databases, SIRS, DG5 and outfall locations, additional background mapping, rivers and contours. A front-end termed SAMS, with a number of control buttons and routines, was written to call up individual tables and produce plans. This permitted the information to be easily accessed by all members of the team.

The underlying philosophy of the studies was to undertake the impact analysis efficiently and pass data and models seamlessly from the needs phase to the solution and detailed solution phases. MWH Data Manager (DM) software was used to build the network models and to hold and process modelling information.
DM holds the asset data, standardises model build or alteration and undertakes quality checks on the data. These permit changes to models to be tracked over time, providing an audit trail from the development of the models of the existing systems through to the modelling of solutions.
DM was available to all staff concerned with the modelling, without precious HydroWorks/InfoWorks dongles being tied up in model builds when they were needed for simulations. MWH GraphViewer was used to present results from sewer network calibrations. The program presents observed and modelled data for all calibration sites, producing comparative plots of rainfall, flow and quality as well as summary statistics and scattergraphs.
Once configured, it will repeat the production of plots for different model runs, allowing the modeller to focus on verification rather than data processing. Once verification is achieved, the observed and modelled data can be written to CD, along with a copy of GraphViewer, for model sign-off or audit. Having software developers on hand was particularly beneficial because, as the programme progressed, new problems arose that required fixes or further development of the existing tools.

With such a complicated project, strong technical management was required to roll out software and methodologies and to ensure they were adhered to.
At certain stages in the job, work was going on in seven MWH offices across three continents, which made providing a uniform product a considerable challenge. Although there is no substitute for good old-fashioned nagging, technology was available to help produce a consistent project. Centralised Lotus Notes databases were provided to hold all of the project communications - very useful for keeping track of changes to each particular project. In addition, the MWH company intranet, KNET, held all of the methodologies and processing tools, giving easy access to all staff.
At the start of the studies it was undecided what standard should be applied to each watercourse. Two standards are described in the UPM manual, but there was no policy as to which should be applied.
The studies progressed with the aim of meeting both standards; however, as time went on it became increasingly clear that attempting to model compliance fully against both would significantly impact the programme. The standards had their pros and cons, but the high percentile standards had the advantage of being easier to apply and more definitive.
Having reviewed a lot of the data collected, and in the light of initial modelling results, UU and the EA decided to apply high percentile standards only for the majority of studies.
A few watercourses with a high pH were also assessed against an additional FIS unionised ammonia standard, while others with marginal DO levels were assessed against additional FIS DO and unionised ammonia standards.
UU and the EA then decided the standard that should be attained in the watercourse, setting the target for solutions development.
The adoption of this simpler methodology significantly reduced the workload and shortened the work programme.
One UPM study determined a requirement for nearly 50,000m³ of storage at a WwTW to reduce the spills from the stormtanks and the inlet CSO. This presented a problem. The WwTW had taken more flow over the years as other, smaller works in the area had been decommissioned and their effluents pumped to it. Consequently, there was little excess capacity to treat the returned flows from the required increased storage on the site. In addition, there was insufficient space at the WwTW to construct the required storage facility.
A new approach was needed. The final effluent quality at the WwTW was due to be improved in AMP4.
The combination of reduction of the intermittent discharges and the improvement of the final effluent quality offered a potential answer to the problem. The UPM study had already developed tools and methods to examine upgrading scenarios.
In addition, the RIOT software allowed this to be done efficiently by combining the inputs from a number of different models, maintaining good QA and providing optimal storage solutions for a given scenario. This was particularly useful because an answer was required quickly.
Several scenarios were run to look at increasing the flow treated at the WwTW and/or improving the final effluent quality and the consequent effect on storage requirements.
Two final effluent quality scenarios were modelled - option A, and a higher effluent quality, option B.
The earlier work had established a model for the final effluent quality, so for a given 95 percentile value a distribution mean and standard deviation could be calculated and a distribution for final effluent quality established.
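The article does not say which probability distribution was fitted; purely as an illustration, a lognormal is a common assumption for final effluent quality, and given a target arithmetic mean and 95 percentile its parameters can be solved in closed form. The function names and example figures below are hypothetical.

```python
import math
import random

Z95 = 1.6449  # standard normal 95th percentile

def lognormal_params(mean, q95):
    # Solve exp(mu + Z95*sigma) = q95 and exp(mu + sigma^2/2) = mean.
    # Subtracting the log forms gives a quadratic in sigma:
    #   sigma^2/2 - Z95*sigma + ln(q95/mean) = 0
    r = math.log(q95 / mean)
    disc = Z95 * Z95 - 2.0 * r
    if disc < 0:
        raise ValueError("q95 too far above the mean for a lognormal fit")
    sigma = Z95 - math.sqrt(disc)  # smaller root gives the usual fit
    mu = math.log(mean) - 0.5 * sigma * sigma
    return mu, sigma

def sample_quality(mu, sigma, n, seed=1):
    # Draw an hourly final effluent quality series from the fitted
    # distribution, for pairing with modelled final effluent flows.
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]

mu, sigma = lognormal_params(mean=10.0, q95=25.0)  # e.g. BOD, mg/l
series = sample_quality(mu, sigma, 1000)
```

Pairing such a sampled quality series with the hydraulic flow predictions gives the load time series the impact analysis needs.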
The quality model was applied in RIOT to the final effluent flows generated by HydroWorks. The results of the analyses are presented in Figure 2. The work showed the existing storage at the works would be adequate to meet water quality standards if the pass-forward flow was increased. All of the needs assessments were completed by the end of August 2002.
The UPM procedure provided a precise method of quantifying the water quality impact of the intermittent discharges and enabled the identification of actual problems. For example, some sites previously considered problematic proved not to be, and some new sites were identified as first-time issues.
If this integrated approach had not been taken, a lot of money could have been misdirected to addressing CSOs that were not causing a problem, while missing a number that were. The methods applied on the studies were technically advanced two or three years ago but, in the meantime, modelling software and hardware have developed further.
Ultimately, this could lead to the sewer quality models being run as a continuous simulation. This would reduce the arguments about antecedent dry periods and sediment build-up, and should remove the need to make conservative assessments in the modelling.
A continuous simulation approach would be beneficial for the river impact analysis - this would allow a more automated and easier assessment of the compliance of the system against ten percentile, 90 percentile and 99 percentile standards and, potentially, FIS.
However, the development and use of continuous simulation and its impact on future AMP projects needs careful consideration with detailed definition to allow funding to be accurately assessed.
Technical issues will also have to be resolved, such as an acceptable methodology for assessing compliance against FIS in continuous simulations. A number of watercourses were seen to regularly fail FIS in dry, summer weather as a result of the diurnal variation due to plant growth.
There is no satisfactory methodology for developing solutions in such cases; the issue may be one of nutrient levels in WwTW final effluent discharges rather than a CSO problem.
Good quality, continuously-observed data is essential for understanding how the system behaves and developing an acceptable solution.
A number of studies were complicated by the need to assess the impacts of distributed CSOs in a large catchment where the watercourse has a long time of travel.
The approaches applied in these studies were often rather conservative and complicated. A simple river model incorporating distributed CSO inputs would be a useful development.