To the Nth degree
Environmental reporting is about to take on new meaning for growing numbers of organisations. Nick Cottam looks at the drivers behind the data
When Dr Keith Tovey returned from a fact-finding mission to China he did so with his academic’s command of statistics intact. The Shanghai waste mountain, for example, amounted to 11,000 tonnes of rubbish generated every day – all currently heading for a 12-metre-deep, 4km² landfill site. Rubbish generated by an overheated economy and a local population of 17 million is part of the carbon conundrum and Tovey, a reader in environmental sciences at the University of East Anglia, was in town to investigate carbon reduction initiatives.
When it comes to understanding what is happening to our environment, Tovey is not alone in wanting to know the facts – where emissions are coming from and who is committed to reductions. It seems a straightforward task, but making sense of those facts, he admits, can be far from it.
“It depends on how you account for things. Britain, for example, has changed the way we analyse carbon dioxide emissions and other countries are doing the same. There are some real scandals out there.” Discrepancies can stem from the quality of the data, from the baseline chosen as a starting point, and from when measurement begins. And companies that achieve cuts in emissions too early could find themselves struggling once emissions trading is fully under way.
The accuracy – and value – of different types of data is concerning more and more organisations following the launch of the EU emissions trading scheme, which began operating on 1 January. “The scheme requires organisations to determine what their emissions actually are,” says Malcolm Hutton, global head of business development at consultancy ERM. “They may ultimately become more efficient in achieving reductions, but first they will have to measure emissions and get a trading mechanism in place.”
UK-based facilities are likely to face some of the toughest challenges, as the government is committed to a 60% reduction in greenhouse gas emissions by 2050. With the London-based Climate Change Agency in place, this goal includes the much shorter-term target of achieving a 20% reduction by 2010 – one reason why the government may be taking a hard look at its own measurement parameters.
The measurement of CO2 emissions from electricity generation is an example, says Tovey. “In the 1990s there was a steady fall in tracked carbon dioxide emissions for each kWh of electricity. But in 2000 this was frozen at 0.43kg of CO2/kWh for the foreseeable future, which makes a mockery of any changes in the fuel mix. I am uncomfortable with the 0.43 level because it may not take account of emissions during transmission. It depends where you do your metering, but 0.45 may be more realistic.”
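The gap between the two factors Tovey cites is easy to put in tonnage terms: reported emissions scale linearly with whichever factor is applied. A minimal sketch (the consumption figure is a made-up illustration, not from the article):

```python
# Comparing the two grid emission factors Tovey cites.
# The annual consumption figure is hypothetical.
FACTOR_FROZEN = 0.43   # kg CO2 per kWh, the level frozen in 2000
FACTOR_METERED = 0.45  # kg CO2 per kWh, allowing for transmission losses

annual_kwh = 1_000_000  # illustrative facility consumption

emissions_frozen = annual_kwh * FACTOR_FROZEN / 1000    # tonnes CO2
emissions_metered = annual_kwh * FACTOR_METERED / 1000  # tonnes CO2
understatement = (emissions_metered - emissions_frozen) / emissions_metered

print(f"Frozen factor:  {emissions_frozen:.0f} t CO2")   # 430 t
print(f"Metered factor: {emissions_metered:.0f} t CO2")  # 450 t
print(f"Understatement: {understatement:.1%}")           # 4.4%
```

On these figures, using the frozen 0.43 factor understates emissions by roughly 4% relative to the 0.45 level Tovey suggests – comparable in size to the ±5% accuracy band discussed later in the article.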
Skewing the figures
If governments can skew the figures to meet political objectives, where does that leave industry? Worried about the quality of its emissions data, for one thing, if companies want to emerge as winners in the early rounds of carbon trading. It also leaves them looking anxiously at the new corporate reporting rules, under which non-financial information will need to be accurate and verifiable.
“One of the biggest drivers on European companies listed in the US is the Sarbanes-Oxley Act,” says Hutton. “This means environmental directors are being interviewed by lawyers to explain exactly how they know they are in compliance.” Add to this the requirement for UK companies to report on key risks as part of their operating and financial reviews (OFRs) and it is small wonder they are taking a closer look at source data and measurement processes.
Despite the small matter of miscalculating its oil reserves, Shell considers itself up there with the title chasers when it comes to the measurement and management of greenhouse gas emissions. With a 1990 starting point of 123m tonnes of CO2 equivalent emissions, “I know I have one of the most rigorous processes for measuring emissions in the world,” Shell’s group environmental adviser Richard Sykes maintains.
“Our emissions data is accurate to +/-5% and at the moment I’m happy with that accuracy level. We have a consistent basis for measuring the environmental data we report each year and every facility we operate must adhere to our monitoring and reporting protocol.”
A financial and logistical non-starter
Sykes admits that getting below the 5% mark – something the EU is currently asking for – is a financial and logistical non-starter. “To get below 5% you need to meter individual emissions streams, which would result in incremental costs for little benefit in terms of greater data accuracy.”
Another company engaged in an as yet unresolved discussion over the finer points of emissions data is speciality chemicals company Johnson Matthey. “We are likely to opt out of the first round of ETS because we are still in discussions about our overall emission limit,” says the company’s EHS director Bob Binney. “We believe the limit has been set too low and we are likely to continue with the carbon levy until 2008.”
Binney, who comes from an operations background and only recently took up his role, is also unclear as to exactly what can be achieved on the data accuracy front. “With around 50,000 products, we are an incredibly disparate operation,” he says. “That’s why it is so difficult to confirm the accuracy of different types of emission – we don’t publish a margin of error because it would be too difficult to do so.”
He stresses that JM emits relatively small quantities of greenhouse gases compared to Shell, which in 2003 dished out 112m tonnes of CO2 globally, roughly equivalent to that produced by Belgium. While this amounts to a significant reduction on 1990 levels – largely through reduced gas flaring in Nigeria – “it gets increasingly difficult”, says Sykes, “especially as our customers are demanding lower sulphur fuels which are more energy-intensive to produce.”
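The scale of Shell’s “significant reduction” is simple arithmetic on the two figures the article quotes, treating the 1990 baseline and the 2003 figure as directly comparable, as the article does:

```python
# Shell's reported emissions, as quoted in the article.
baseline_1990 = 123e6  # tonnes CO2 equivalent (1990 starting point)
reported_2003 = 112e6  # tonnes CO2 (2003 global figure)

reduction = (baseline_1990 - reported_2003) / baseline_1990
print(f"Reduction on 1990 levels: {reduction:.1%}")  # 8.9%
```

A roughly 9% cut over 13 years puts Sykes’s warning in context: the easy wins, such as reduced gas flaring, have largely been taken, while demand for energy-intensive low-sulphur fuels pushes the other way.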
Supporting confident decision-making
The benchmark, according to the Global Reporting Initiative, is to measure emissions data accurately enough to support confident decision-making. This is qualified – rather vaguely – by the rider that quantitative information may depend on “specific sampling methods used to gather hundreds of data points from multiple operating units”. In other words, allow for a margin of error which may be +/-5% but could be significantly more.
Hutton offers a more specific explanation. “There are a number of reasons why emissions data could be inaccurate,” he says. “It could be because of the wrong equipment or sampling techniques, the wrong sampling period or an incorrectly calibrated machine.” Poorly serviced equipment, suggests Tovey, could be a key reason why the actual margin of error can be way above the +/-5% figure.
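One way to see why the aggregate error can land well above ±5%, as Tovey suggests: independent stream-level errors partially cancel when emissions are summed, but a systematic error from a poorly serviced or mis-calibrated machine shifts every reading the same way and adds directly. A hedged sketch with made-up stream figures (the article gives no per-stream numbers):

```python
import math

# Hypothetical illustration of how stream-level errors aggregate.
streams = [100.0, 250.0, 400.0, 250.0]  # tonnes CO2 per emissions stream
random_rel_err = 0.05                   # independent ±5% per stream
bias_rel_err = 0.05                     # shared calibration bias

total = sum(streams)
# Independent random errors add in quadrature, so their combined
# relative error is smaller than any single stream's:
random_abs = math.sqrt(sum((random_rel_err * s) ** 2 for s in streams))
# A systematic bias affects every stream identically, so it adds in full:
bias_abs = bias_rel_err * total

print(f"Total {total:.0f} t: random error ±{random_abs / total:.1%}, "
      f"worst case with shared bias ±{(random_abs + bias_abs) / total:.1%}")
```

On these numbers the random component alone comes to about ±2.7%, but a shared 5% calibration bias pushes the worst case to nearly ±8% – which is why calibration and servicing, not just meter precision, decide whether a ±5% claim holds.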
Hutton contrasts the European approach to sampling in the US, where companies have long been used to the exacting demands of the EPA and the Toxic Release Inventory. “In the US, the EPA issues approved stack sampling techniques and consultants have to go on a course before they can carry out sampling. It’s a very complex set of skills at this level.”
Whatever other companies have to do to meet the demand for more accurate, accountable environmental data, Sykes is confident that Shell is there already. “The way we go about measuring emissions for the ETS is exactly the same as we do for internal reporting.”
But Binney says there is still work to be done. If Johnson Matthey is to find its way in the ETS, it will need an emissions measurement and monitoring plan, he says. The company will not be alone in having to adopt a more localised approach to data management.
© Faversham House Ltd 2023. edie news articles may be copied or forwarded for individual use only. No other reproduction or distribution is permitted without prior written consent.