Watching the waters
Andrew Wetherill of Yorkshire Water assesses the development of technology to monitor and optimise the process of coagulation
The lead-up to the Cryptosporidium legislation of 1999 focused attention on turbidity removal in the water treatment process. Yorkshire Water adopted an approach of setting tight but realistic turbidity targets and alarm values for treated water based on data from on-line instruments (Wetherill, 2001)1. This work led to a higher degree of awareness of the dependency of filtered water turbidity on coagulation dose and coagulation pH. Jar tests were traditionally conducted to determine the most suitable coagulant dose and coagulation pH. However, the turbidity data monitoring work led to the practice of interpreting on-line data in a similar way to jar test data in order to optimise coagulation. The work led to the development of an automatic coagulation control system (ACC) which is being installed at 23 WTWs throughout Yorkshire. To date 15 systems are running.
Over the course of the next few pages we shall see examples of how changes in raw water natural organic matter (NOM) content, coagulant dose, coagulation pH and water temperature affect filter outlet water turbidity trend baselines.
Yorkshire Water operates 30 WTWs, at which the company utilises inorganic coagulants in the potable water treatment process. Prior to the introduction of automatic systems at most of these works, the coagulant dose was adjusted manually, often in response to or in anticipation of changes in raw water NOM concentration. Coagulant dose adjustments were necessary in order to maintain high treated water quality. At some sites, streaming current detectors (SCD) were being used with some success to automate this process, but their introduction has been difficult because of, among other things, sensitivity to coagulation pH.
Alternative methods of automatic coagulation control have been researched by Yorkshire Water over the course of the past decade. A useful development has been the recent appearance of the ABB high-range UV 7320 monitor. This, used in conjunction with readily available data from on-line monitors measuring turbidity, temperature, pH, coagulant flow and works flow, has facilitated the development of a successful ACC system suitable for installation at many Yorkshire Water sites.
Interpreting turbidity trends
Coagulation involves the in-situ production of hydrolysis products of hydrolysing metal salts to destabilise particulate suspensions and NOM, allowing the particle mixture to flocculate into larger particles, or floc.
A characteristic of floc is that it can be captured on sand filters either by means of direct contact with the filter media or through contact with similar particles already attached to filter media. For given filtration conditions (filtration flow rate, filter media size, media depth and particle loading range), the concentration of particles in the filtrate is an indication of the extent of the particle-to-media and particle-to-particle interactions within the filter.
These interactions are largely dependent on the surface characteristics of the particles, which in turn are dependent on the degree of coagulation optimisation.
As a typical filter run progresses, the number of particle capture sites within the filter bed diminishes; eventually either the filter blocks or the particle concentration in the filtrate increases. Under these conditions the increasing particle concentration is not associated with coagulation optimisation.
A sudden increase in flow within a filter can dislodge particles from the media, pushing them further into the filter bed (Thurston, 2001)2. Towards the end of the filter run this can result in an increase in the particle concentration of the filtrate. This increase is not taken to be an indicator of the extent of coagulation optimisation.
A significant non-typical increase of particle concentration of the pre-filtered water can result in a significant increase of particles penetrating the filter; this is because, though the proportion of influent particles being removed across the filter bed does not necessarily change, the number of particles penetrating the filter increases due to increased particle load. Such an increase is not considered when using filtrate particle concentration data to determine the extent of coagulation optimisation.
After backwashing, at the start of the filter run there is a period of higher-than-normal filtrate particle concentration, a period normally referred to as ripening (Chipps, 1998)3. This part of the filtrate particle concentration trend is ignored when interpreting for coagulation optimisation, although the ripening peak has been observed to be higher and wider when coagulation optimisation is compromised.
For practical reasons, the method used for the routine measurement of on-line filtrate particle concentration is turbidity. It is suggested that suitable interpretation of individual filter outlet turbidity trend data, ignoring the interferences described above, can be used to determine coagulation optimisation, thereby reducing or even eliminating the requirement for the traditional jar test to aid routine optimisation of the coagulant dose.
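The trend-interpretation approach described above can be outlined in code. The following is an illustrative sketch only, not Yorkshire Water's implementation: the ripening cut-off value and the use of a median to discount short breakthrough spikes are assumptions.

```python
from statistics import median

# Assumed value, not from the article: length of the ripening
# period to ignore at the start of each filter run.
RIPENING_MINUTES = 30

def baseline_turbidity(run):
    """run: list of (minutes_into_run, turbidity_ftu) for one filter run.

    Returns the filtrate turbidity baseline after the ripening period.
    The median is used so that short breakthrough spikes (e.g. from
    flow changes) do not drag the baseline estimate upwards.
    """
    usable = [ftu for minutes, ftu in run if minutes > RIPENING_MINUTES]
    return median(usable) if usable else None

# Example: high turbidity during ripening and one flow-change spike
# are discounted, leaving a baseline of 0.03 FTU.
run = [(5, 0.5), (10, 0.3), (40, 0.03), (50, 0.03), (60, 0.2), (70, 0.03)]
print(baseline_turbidity(run))
```

A shift in this baseline between filter runs, rather than any single spike, would then be the cue to review the coagulant dose.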
Coagulation pH and water temperature
As water temperature increases it has been observed that, at constant coagulation pH, NOM removal efficiency of the coagulant decreases. This is particularly noticeable as the temperature increases above 12°C.
Comparing Figures 1 and 2 shows that a lower coagulant dose for a similar raw water UV absorbance resulted in a lower filtered turbidity at 4°C than the higher coagulant dose at 16°C. Coagulation pH was similar in both instances. Evidence from other sites that also use aluminium sulphate coagulant showed that reducing coagulation pH towards pH 6 in summer lowers filtered water turbidity for a given raw water UV and coagulant dose. That is, to get equivalent NOM removal efficiency from the coagulant, a lower coagulation pH is required as temperature increases.
Solubility curves of aluminium in aqueous solutions show that the pH of minimum solubility, albeit in deionised water, decreases as temperature increases. Furthermore, minimum solubility occurs at constant pOH over the temperature range 0-25°C. The trend data from works D (Figures 1 and 2) tends to tally with the findings of various researchers summarised by Hanson and Cleasby (1990)4. It lends further empirical support to their conclusion that operating within a constant pOH range is more appropriate than operating at constant pH in order to achieve coagulation at the most economical coagulant dose as temperature changes.
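The constant-pOH idea can be made concrete with a short calculation. Since pOH = pKw - pH, and pKw falls as water warms, holding pOH constant means lowering the pH set-point with temperature. The sketch below is illustrative only; the pKw figures are approximate literature values for pure water and the interpolation scheme is an assumption, not part of the published control system.

```python
# Approximate literature values of pKw for pure water at selected temperatures.
PKW = {0: 14.94, 5: 14.73, 10: 14.53, 15: 14.35, 20: 14.17, 25: 14.00}

def ph_for_constant_poh(temp_c, poh_target):
    """Linearly interpolate pKw at temp_c, then return the pH set-point
    that keeps pOH at poh_target (pH = pKw - pOH)."""
    temps = sorted(PKW)
    temp_c = max(temps[0], min(temps[-1], temp_c))  # clamp to table range
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            frac = (temp_c - lo) / (hi - lo)
            pkw = PKW[lo] + frac * (PKW[hi] - PKW[lo])
            return pkw - poh_target

# If pH 6.5 were optimal at 4 C (pOH of about 8.27), the equivalent
# set-point at 16 C comes out near pH 6.0 -- i.e. a lower coagulation
# pH in warmer water, consistent with the site observations above.
print(ph_for_constant_poh(16, 8.272))
```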
However, operation at the most economical coagulant dose may not lead to minimum treatment chemical costs at high temperature, due to the cost of acid and alkali when lowering and then raising pH of high-alkalinity river waters before and after coagulation. Where possible, and where economically viable, coagulation pH is lowered as temperature increases at both iron and aluminium coagulation treatment works in Yorkshire.
Coagulant dose/raw water UV ratio
Reducing the coagulant dose to raw water UV absorbance ratio leads to an increasing filtered water turbidity trend baseline and widening ripening peaks, as demonstrated in Figures 3 and 4. During the early part of 2005, the coagulant dose to raw water UV absorbance ratio at many Yorkshire Water works treating moorland water was significantly less than it was during the summer of 2004. This was despite operating at low coagulation pH during the summer.
It could be that seasonal changes in the composition of the NOM are making a contribution to this effect. If this is the case, then Yorkshire Water at present has no way of measuring such variations on-line and it could present a problem for the feed-forward automatic coagulation control system.
Feed-forward automatic coagulation control
A UV monitor, with an appropriate algorithm, was installed at two river-supplied WTWs after the following had been established by ongoing internal research:
Figure 5 is a schematic of the system. An algorithm was developed by examining the relevant process trend data from the works. The algorithm takes the form:
D = f(UV) + (f(T) × f(UV)) + K

where:
D = predicted coagulant dose (mg/l of coagulant metal)
UV = raw water light absorbance at 254nm
T = raw water temperature
K = operator-adjustable value (offset)
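As a rough illustration, the algorithm might be coded as follows. The article does not publish the functions f(UV) and f(T), so the linear forms and every coefficient below are assumed placeholders of the kind a works would tune against its own process trend data.

```python
# Assumed, illustrative coefficients -- not Yorkshire Water's tuned values.
A_UV = 25.0   # assumed slope: mg/l coagulant metal per unit UV254 absorbance
B_T = 0.02    # assumed fractional dose uplift per degC above the reference
T_REF = 12.0  # reference temperature; efficiency loss is noted above ~12 C

def predicted_dose(uv254, temp_c, offset=0.0):
    """Feed-forward dose prediction D = f(UV) + f(T)*f(UV) + K,
    with assumed linear f(UV) and a temperature uplift f(T)."""
    f_uv = A_UV * uv254
    f_t = max(0.0, B_T * (temp_c - T_REF))  # extra dose needed in warm water
    return f_uv + f_t * f_uv + offset

# Example: the same raw water UV absorbance attracts a higher
# predicted dose at 16 C than at 4 C.
print(predicted_dose(0.2, 16), predicted_dose(0.2, 4))
```

In the plant itself this calculation runs in the PLC on scaled on-line signals; the offset K gives the operator a manual trim when the temperature function proves inadequate.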
Observed seasonal changes in coagulant efficiency are catered for by the temperature function and, should that be inadequate, by the operator-adjustable value. The constants in the algorithm are tuned over time with long-term periodic examination of the process trend data.
The control algorithm was implemented using existing on-site programmable logic controllers (PLCs) and software installation was undertaken by Yorkshire Water staff. The PLC system enabled accurate data manipulation, control and monitoring. The required input signals were read into the PLC and scaled appropriately. A variety of error-checking routines were then initiated in order to provide security of the predicted dose control algorithm. These included:
It would have been difficult to implement ACC without easy access to process trend data afforded by Yorkshire Water’s information technology infrastructure. The data, in spreadsheet chart form, are periodically examined for shift in filtered water turbidity baseline. Using this system, a process chemist can review the coagulation/filtration performance of several WTWs from a single location and can use the data to recommend changes to the algorithm constants in order to fine-tune the coagulant dose.
Comparisons between the actual coagulant dose and the predicted coagulant dose calculated using the algorithm on the spreadsheet are also undertaken. If the spreadsheet-calculated predicted dose and the actual plant coagulant dose deviate, then either the process engineer has changed the offset or the system is not running in auto-control.
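That cross-check might look like the following in outline; the 10% tolerance, the function name and the status strings are assumptions introduced for illustration, not details from the article.

```python
TOLERANCE = 0.10  # assumed: flag relative deviations greater than 10%

def check_dose(actual_dose, spreadsheet_predicted_dose):
    """Compare the plant's actual dose with the off-line spreadsheet
    prediction and return a short status for the process chemist."""
    if spreadsheet_predicted_dose == 0:
        return "check inputs"
    deviation = abs(actual_dose - spreadsheet_predicted_dose) / spreadsheet_predicted_dose
    if deviation <= TOLERANCE:
        return "ok"
    # A genuine deviation means either the operator offset has been
    # changed, or the plant is not running in auto-control.
    return "investigate: offset changed or not in auto-control"

print(check_dose(5.4, 5.4))
print(check_dose(7.0, 5.4))
```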
Figure 6 shows a typical example of the performance of the ACC system at a lowland river source. The peaks in filtered water turbidity are a combination of ripening peaks and breakthrough caused by filtered water flow changes. The filtered water turbidity baseline is about 0.03 FTU.
Process scientists and plant operators are being trained to interpret filtered water turbidity trend data and make the appropriate adjustments to the ACC system offset, should they be required. Jar testing of raw water, at various doses and various dosed raw pH values, is becoming less frequent.
Implementation of the ACC system, based on use of on-line data as an indicator of coagulation optimisation, has the following benefits:
This article is based upon a paper presented at Developments in Water Treatment and Supply, organised by Cranfield University and held in York on July 5-6, 2005.
1. Wetherill, A. and O'Neill, G., Routine monitoring, reporting and interpretation of filter performance. Advances in Rapid Granular Filtration in Water Treatment, London, 4-6 April 2001.
2. Thurston, A., Fitzpatrick, C.S.B. and Tattersall, J., Particle breakthrough caused by flow rate changes during rapid gravity filtration. Advances in Rapid Granular Filtration in Water Treatment, London, 4-6 April 2001.
3. Chipps, M.J., An experimental investigation into filter ripening: contact filtration of lowland reservoir water. PhD thesis, University of London, 1998.
4. Hanson, A.T. and Cleasby, J.L., The effects of temperature on turbulent flocculation: fluid dynamics and chemistry. Journal of the American Water Works Association, 1990, 82, No.11, 56-73.
© Faversham House Ltd 2023 edie news articles may be copied or forwarded for individual use only. No other reproduction or distribution is permitted without prior written consent.