Special feature: laboratory analysis
As companies become increasingly aware of the impact of contaminated land and groundwater on their corporate liability, the demand for quality assured laboratory services, effective remediation contracting and accurate consultancy advice becomes more acute.
Lab customers face a barrage of jargon about quality, equipment and test methods, making the task of commissioning a lab that much more difficult.
So in this month’s contaminated land feature, IEM attempts to demystify the complex world of laboratory analysis by exploring the crucial role of the specialist laboratory in contaminated land and groundwater analysis. Expert contributors provide readers with an insight into proficiency testing and laboratory accreditation schemes, the rationale and scope of existing chemical testing methods and a layman’s guide to the key analytical methods and equipment used in contaminated land investigations. Finally, our table summarises the key credentials of a selection of leading specialist laboratories.
Contaminated land assessment: clarifying the complex
While proficiency testing and accreditation schemes have been developed to boost confidence in the quality of work coming out of labs, they do not absolve lab users from their responsibility for understanding the relevance and limitations of chemical test data. Fergus Healy, formerly senior environmental chemist at Johnson Poole & Bloomer and now business manager, Yorkshire Environmental Solutions and Mark Rawlings, Joynes Pikes Associates, provide a user’s guide to the leading schemes and test methods.
Part I: Be confident in your choice of laboratory
Over the last few years the stakeholders involved in the investigation of contaminated land have gradually embraced the concept of risk assessment (RA). Fundamental to successful and defensible RA is the provision of high quality analytical data that can be plugged into the hazard assessment component. The accuracy and reliability of chemical test results recorded by laboratories is a major concern to many involved in the contaminated land field. This is particularly relevant when many less informed practitioners are still rigidly applying trigger and threshold levels to assess whether a site requires ‘clean up’.
The consistency and reproducibility of chemical analysis, both within and between laboratories, strike at the heart of real concerns about the confidence that can be placed in the interpretation of analytical data.
Although a small number of laboratories, which have splintered away from proficiency scheme sub-committees, are endeavouring to standardise methodologies, in general a variety of methods is in use by laboratories for any one determinand. This may have been born of the former Department of the Environment's (DoE) "fit for purpose" ethos and a wish to retain some flexibility in the system. The DoE has historically eschewed an overly prescriptive attitude that would naturally lead to the development of so-called 'brown book' methods.
Unfortunately, the side effect of this approach is potentially significant differences between laboratories. Clearly these differences have technical and scientific ramifications, but they may also carry major cost implications if sites are condemned unnecessarily.
Inter-laboratory proficiency testing schemes
The relatively recent development of proficiency testing (PT) schemes for contaminated land is an attempt to identify and quantify differences between laboratories and methodologies. Although the initiative is to be encouraged, it does highlight the large discrepancies there can be between laboratories analysing the same sample using different methodologies. It is not unusual to see results from participants separated by orders of magnitude.
The CONTEST PT scheme organised by the Laboratory of the Government Chemist (LGC) is the pre-eminent scheme, with almost one hundred subscribing laboratories. PT schemes like CONTEST and accreditation schemes like UKAS (formerly NAMAS) are tools that may be used in an attempt to validate the performance of an analytical laboratory.
CONTEST operates via the distribution of a number of samples, in a number of matrices, for a variety of determinands. In theory the participant laboratories treat these PT samples in the same way as routine samples. The results of analysis along with the methodologies used are submitted to the LGC who collate all the information. A ‘z-score’ is assigned to each result which reflects deviation from a consensus mean.
A result with an absolute z-score of less than two is considered satisfactory. Achieving such a score essentially means that the submitted result is similar to those of a significant number of other participants; it is not necessarily an assessment of accuracy.
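The z-scoring arithmetic itself is simple, and can be sketched as follows. This is an illustration only: the function name, the lead-in-soil figures and the target standard deviation are invented for the example, not taken from CONTEST documentation.

```python
def z_score(result, assigned_value, sigma):
    """Return the z-score for a single laboratory result:
    z = (x - X) / sigma, where X is the consensus (assigned) value
    and sigma is the standard deviation used for proficiency assessment."""
    return (result - assigned_value) / sigma

# Illustrative figures only: consensus mean of 250 mg/kg lead in soil,
# target standard deviation of 25 mg/kg.
assigned, sigma = 250.0, 25.0
for lab_result in (240.0, 310.0, 180.0):
    z = z_score(lab_result, assigned, sigma)
    verdict = "satisfactory" if abs(z) < 2 else "unsatisfactory/questionable"
    print(f"{lab_result:6.1f} mg/kg -> z = {z:+.1f} ({verdict})")
```

Note that a "satisfactory" z-score measures agreement with the consensus of participants, not closeness to the true value, which is the point made above.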
Simply participating in an external PT scheme is not an indication of accuracy or precision, but rather of a willingness to uncover one's flaws and endeavour to correct them (ideally collectively). The more cynical may suggest participation is no more than a case of being seen to be doing the right thing. The crucial point with PT schemes is the laboratory's response to unsatisfactory results, whether real or perceived. This needs to be systematic, coherent and diligent, investigating the cause of poor quality and attempting to put the situation right. Clearly, in today's commercial world, laboratories may cite time constraints as reasons not to be rigorous and thorough in this regard. However, if standards are to be developed and driven upwards, the importance of this area cannot be overestimated. Indeed, a number of laboratories do have various levels of personnel with quality responsibilities.
Uncertainty and reliability in chemical measurement
Encompassing inter-laboratory variations and more besides, the concept of uncertainty is fundamental to the provision of all analytical data. The term itself is contentious, implying doubt and lack of technical ability. This negative connotation, however ill deserved, clearly has ramifications, particularly in relation to legal defensibility. Uncertainty reflects the range within which the true result actually falls. Inherent variability in chemical measurement takes account of all potential error sources in the final result. This includes both inter-laboratory and intra-laboratory differences. The latter typically include biases associated with individual analytical runs, methodologies and laboratory-specific factors such as the environment in which the test was performed and the grade or brand of reagent used.
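A common way of quantifying overall uncertainty, along the lines of the standard metrology guidance (the "GUM" approach), is to combine independent standard uncertainty components in quadrature and then apply a coverage factor. The sketch below is illustrative only; the component values and their attribution to sub-sampling, extraction and calibration are invented for the example.

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Illustrative only: standard uncertainty contributions (mg/kg) from
# sub-sampling, extraction efficiency and instrument calibration.
u_c = combined_uncertainty([3.0, 4.0, 1.5])

# Expanded uncertainty at roughly 95% confidence uses a coverage factor k = 2.
U = 2 * u_c
print(f"combined u = {u_c:.2f} mg/kg, expanded U (k=2) = {U:.2f} mg/kg")
```

The expanded figure is the kind of value a laboratory would quote alongside a result, so that a reported concentration can be read as a range rather than a single number.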
Clearly it is essential that analytical data is never taken at face value, but instead its limitations are understood and taken into account when interpreting against pre-defined standards.
The United Kingdom Accreditation Service (UKAS) was formed as a merger between NAMAS (National Accreditation of Measurement and Sampling) and the NACCB (National Accreditation Council for Certification Bodies). A laboratory seeking UKAS accreditation has to satisfy UKAS Technical Officers and independent assessors that it is technically competent and operates a quality system to rigorous international standards.
Central to the requirements is the laboratory's Quality Manual, which must cover the policies and procedures for every accredited calibration and test performed, and for all activities that may have a bearing on the quality of the work done. UKAS will issue a schedule of accreditation to the successful laboratory and re-visit the laboratory annually.
Although UKAS does police the system, in terms of re-visiting accredited tests it is questionable whether it has the necessary resources to do this as comprehensively as it would ideally wish. Laboratory methodologies and systems of operation are by their very nature inherently complex – it is a daunting task, even for a team of assessors, to evaluate all aspects of a laboratory’s operations and all tests performed at every visit.
The seal of accreditation for a method is an endorsement of the quality system that influences all work associated with that method, and an estimation of the reproducibility between analysts. It deems a method repeatable by any adequately trained analyst, in the manner of a recipe book. It is not a comment on the application of a given method in a given scenario.
UKAS requires uncertainty to be quoted to the recipients of the data, where relevant to the validity of the application. It is important for users of the data to be aware of this and request the information so that a more realistic and pragmatic interpretation can be made.
This brings us full circle to the situation where a number of UKAS accredited laboratories generate differing results using similar or dissimilar methods on the same PT samples. This begs the question: can degrees of correctness be assessed, and if so, what form could this assessment take?
One solution may be individual laboratory audits, undertaken formally by independent assessors who can devote the necessary time to immerse themselves in the fine detail. Typically this could encompass the meaning, scientific validity and applicability of a range of analytical methodologies and an examination of the laboratory’s systems of management and control.
Although UKAS accreditation and participation in external PT schemes may be regarded as imperatives, they should not be seen as some kind of panacea that removes all responsibility from laboratory users. They are not an invitation for those who commission laboratories to indulge in complacency, looking no further than these credentials.
Another key issue is the relevance of the test methods to the scenarios that the risk assessment attempts to simulate.
Part II: The applicability of testing
Understanding the rationale and scope of chemical testing methods is clearly important for all the major reasons explained above. Before often far-reaching conclusions are drawn from chemical test data, those who interpret analytical results should be fully aware of the relevance and limitations of such data.
This sentiment can be most easily illustrated with a number of specific examples.
Total metal analyses
Total analyses, which are essentially undertaken by digesting the soils in a mixture of nitric and hydrochloric acid at temperatures of 160°C to 180°C, may not produce data relevant to the situations encountered on a given site. In particular, ingestion, inhalation and dermal contact with contaminated materials by humans are not replicated by total analyses. It may be more representative to determine the extractable concentration of contaminants by simulating the pH and temperature of the human metabolism and, in particular, the digestive system, i.e. low pH levels in combination with temperatures of approximately 35°C, over a period of 3-4 hours, using organic acids and enzymes similar to those in the gut. The use of total analyses for assessing human health effects is at the very least overly precautionary, and may indeed have resulted in unnecessarily costly remediation works being undertaken.
With regard to the availability of contaminants to plants, the Inter-departmental Committee on the Redevelopment of Contaminated Land (ICRCL) quoted threshold trigger levels for various phytotoxins based on ethylene diamine tetraacetic acid (EDTA) extracts in the first edition of Guidance Note 59/83. In the second edition, trigger levels based on EDTA extracts were dropped in favour of total analyses. It is understood that this was because better correlation between phytotoxic effect and concentration was possible with total analyses. Clearly the relationship between phytotoxic metals in the soil and the availability and distribution of these agents through the plants is highly complex. Taken to its natural conclusion, if judgements are to be made on hazard/risk to plants then they should ideally be based on site specific criteria such as plant genotype, pH, redox potential and organic matter present.
The use of the thresholds quoted in ICRCL may be generally appropriate but will not be appropriate under all circumstances. Furthermore the implication is that copper, nickel and zinc are the only phytotoxic agents. In some instances cadmium, for example, is more phytotoxic. In a complex interaction between calcium and phosphorus, and more so at high pHs, zinc becomes progressively less phytotoxic, conflicting with the spirit of ICRCL.
Overall, then, the historically narrow and arbitrary judgements made about the use of material for landscaping need to be refined, and a higher degree of sophistication applied.
Leachability tests are a widely used tool in RAs for assessing the availability of contaminants to the aquatic environment and their bioavailability to other targets such as higher order plants. However, it must be realised that such tests only reflect the short term leachability of contaminants from a soil mass and may not, therefore, represent long term conditions. Furthermore, the rigidly controlled laboratory environment is not directly analogous to the intrinsic complexities seen in-situ, such as oxygen penetration, acidity and temperature effects.
The Environment Agency (EA) R&D Technical Report P13 refers to its recent review of the EA leach test methodology in the Project Record which came to the conclusion that for most inorganic compounds of low volatility, the test provides a ‘reasonable worst case’ estimate of the amount of contaminant available to the aquatic environment within a relatively short timescale (of the order of a few years). Leachability can also be determined by using partitioning relationships, however the EA indicates that these theoretical methods are more conservative.
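The partitioning approach mentioned above can be sketched in a few lines, assuming simple linear equilibrium partitioning between soil and porewater. The function name, the Kd value and the soil concentration below are purely illustrative, not drawn from EA guidance.

```python
def porewater_concentration(c_soil_mg_per_kg, kd_l_per_kg):
    """Estimate the equilibrium porewater (leachate) concentration in mg/l
    from a soil concentration, assuming linear partitioning:
    Kd = C_soil / C_water."""
    return c_soil_mg_per_kg / kd_l_per_kg

# Illustrative only: 500 mg/kg zinc in soil, assumed Kd of 100 l/kg.
c_water = porewater_concentration(500.0, 100.0)
print(f"estimated porewater concentration: {c_water:.1f} mg/l")
```

Because a single Kd value ignores pH, redox and kinetic effects, an estimate of this kind tends to be conservative, which is consistent with the EA's observation that the theoretical methods are more conservative than the leach test itself.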
Several leachate test methods are currently available which have different test conditions. However the EA’s own adopted methodology is almost invariably used, allowing ‘direct’ comparison to be made with the trigger values set for the Upper Tame Catchment. This guidance was devised to assess the suitability of contaminated materials for deposition within that specific area. It recommends a methodology for determining the solubility of contaminants and compares the results with the acceptable quality for discharges into the watercourses. The overall concept of leachability testing is useful but the practical outcome should only be viewed as an indicator to be utilised in conjunction with other relevant data.
The relevance of chemical testing can, to some extent, be judged by the relevance of the regulatory guidance flowing from the UK Government. In terms of contaminated land this guidance is represented by one rather isolated document in the shape of ICRCL 59/83. It is probably fair comment that ICRCL is generally felt to be rather outmoded and the environmental sector is crying out for more relevant guidance from government. It is acknowledged that this is incipient in the shape of a number of the former DoE’s Contaminated Land Research Reports (CLR Series). The publication of the CLEA (Contaminated Land Exposure Assessment) model is also awaited by many in the industry. This has unfortunately been delayed by last year’s change in government, and seems to have become less of a priority. But until its pronouncement some practitioners will continue to use other quantitative risk based computer models whilst others consult the ICRCL and its surrogate sister, the Dutch intervention values.
The relatively recent moves towards risk assessment (qualitative and quantitative) and its now widespread use in environmental consultancy, and to a lesser extent under the 1996 Special Waste Regulations, lead us to a situation where the cart has been put before the horse. A large proportion of ICRCL 59/83 is devoted to inorganic species (in contrast to the Dutch guidance). In making decisions about degrees of hazard and concomitant risk, the identity of the compounds present, rather than individual cations or anions, is crucial. Without this speciation a number of assumptions have to be made, i.e.:
The most likely combinations from knowledge of past industrial usage.
The combination of cations and anions giving rise to the most hazardous compounds.
It is no longer always good enough to know how much arsenic is present, rather how much of a particular compound is present. This may appear superficially fastidious or semantic but when one recognises that arsenic trioxide is approximately twenty times more toxic than calcium arsenate and fifty times more toxic than arsenic acid, it can be seen how vital these distinctions are. In crude terms reclaiming a site that is contaminated with arsenic trioxide is likely to be considerably more costly than a similar degree of contamination with calcium arsenate.
The source-pathway-target approach espoused by the 1995 Environment Act, and universally accepted, does not always dovetail well with the principles of ICRCL. The concept of defining hazards and their impact on a variety of targets is obviously far more sophisticated than the approach that drove the formulation of ICRCL, which was essentially derived for other, more limited, purposes.
The two key factors in progressing the current situation are:
improving current chemical testing methods;
undertaking more specific testing when elevated concentrations of contaminants are identified.
More specific chemical testing could include that discussed below:
Speciation of Inorganic Compounds: The logical way forward in obtaining more meaningful data for the accurate assessment of human toxicological effects may be to undertake speciation of compounds (particularly metals) if high levels of total compounds are detected. Metals that exist in well-defined speciated forms include: arsenic, chromium, lead, mercury, antimony, phosphorus, selenium, silicon and tin. Whilst speciation analysis is significantly more costly than total analysis, if these tests indicate the presence of lower toxicity species, there is the potential for considerable remedial cost savings.
The impetus for this more sophisticated analysis is not only from quantitative risk assessment but also the provisions contained in the recent Special Waste Regulations. Within this document clear reference is made to the need for something more relevant than assessment of individual cations and anions.
Direct Toxicity Assessments: The Environment Agency is currently considering direct toxicity assessments (DTAs) for effluent using organisms including algae, macroinvertebrates and fish. Such tests allow synergistic and antagonistic effects of "cocktails" of pollutants to be determined. As variations in river water chemistry can also be taken into account in the tests, DTAs may be regarded as more site specific than traditional chemical analysis. DTAs could not only become an important new tool for assessing risks to the aquatic environment from effluent, but have the potential for assessing the effects of groundwaters and leachates from "contaminated land".
Phytotoxicity Tests: Seed germination and root elongation tests have been undertaken to determine the phytotoxicity of soils following bioremediation. Such tests could be used as a tool for further assessing the phytotoxic effects of soils and groundwaters on proposed site specific species of flora. This would overcome the current drawbacks of phytotoxic effects being species dependent.
For all the described reasons relating to intrinsic variability (uncertainty) and application/relevance, those who interpret analytical data should be both questioning and vigilant. Working with progressive laboratories, guided by Government and regulators, should be the aim of all stakeholders interested in providing defensible, high quality, bespoke solutions to complex contamination issues.
It is clear that the current rather inconsistent and confused approach to contaminated land assessment is wholly unsatisfactory.
The still widely applied ICRCL guidelines may no longer be appropriate within the current assessment framework, but a crumb of comfort may be that they are at least apparently over precautionary. Indeed, it is the desire to shed this conservative "worst-case" ethos and embrace a more realistic, pragmatic and site specific approach that drives the development of RA.
Whilst the industry holds its breath waiting for the publication of CLEA and supporting documentation, not to mention the guidance notes for the Environment Act, it ought to be the goal of all forward thinking laboratories to gear up to meet the challenges posed by a quantitative risk assessment regime.
There is a danger that advances in toxicology, a key element in RA, will outstrip advances in analytical chemistry and a situation will result where chemical testing data cannot adequately support all the disparate provisions of RA.
© Faversham House Ltd 2023 edie news articles may be copied or forwarded for individual use only. No other reproduction or distribution is permitted without prior written consent.