Ensuring the right information
Ian Miller, director of spatial data management specialist Optimal Solutions, on delivering data capable of supporting both business processes and effective performance management
Since privatisation, the water industry in England and Wales has achieved significant improvements in performance. But, as with all businesses, there is room for further progress, as the current periodic review is likely to indicate. To build on its successes, the water sector needs to identify new ways of improving performance.
Many companies are engaged in re-engineering business processes, developing new IT systems and adopting best practice across a number of fronts as a means of achieving improved performance. A vital component in performance improvement is performance measurement, and herein lies a challenge. How can utilities successfully measure and manage business performance, and is existing data up to the job? The answer depends on the quality of the information received by decision-makers, how quickly it is received and what control they can exert over complex business processes.
To measure the health of a business, levels of performance need to be measured across all activities. This process links strategic planning with the organisation’s ability to deliver, measure and act upon results. The regulatory framework alone demands that performance measures be robust, repeatable and auditable.
Success requires connected processes and a high level of control across all business practices, which in turn enables greater consistency, improved efficiency and easier monitoring. This is reflected in the need to optimise customers’ service levels across several different measures, traditionally under the control of different business functions.
The ability to establish effective control depends on establishing a better understanding of the business – or business function – through unified and consistent data. This requires water undertakers to move towards a unified data model and single repositories for key data sets.
The ability to see the entire organisation, regardless of geographic boundaries, through a single corporate view of information promotes effective performance
measurement and improved decision-making. Current performance measurement tools are a powerful resource for managing business across the traditional functional boundaries but their degree of success will reflect the underlying constraints of the data.
There is no single answer to deriving reliable and accurate data capable of supporting business processes and effective performance measurement. It is possible, however, to identify a number of characteristics of a successful strategy based on current projects at Optimal Solutions:
- adopt unified data structures for key data sets,
- assess quality of existing key data sets,
- address data quality issues according to cost and benefit,
- involve users throughout the process.
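The second of these steps, assessing the quality of existing key data sets, can be made concrete with a simple scoring exercise. The sketch below is illustrative only: the field names ("asset_id", "material", "install_year") and the plausibility rule for installation years are assumptions, not details from any Optimal Solutions project.

```python
# Hypothetical sketch: score a key data set for completeness (is the
# field populated?) and validity (does the value pass a sanity check?).

def quality_score(records, required_fields, validators):
    """Return per-field completeness and validity rates in the range 0.0-1.0."""
    report = {}
    n = len(records)
    for field in required_fields:
        present = [r for r in records if r.get(field) not in (None, "")]
        valid = [r for r in present
                 if validators.get(field, lambda v: True)(r[field])]
        report[field] = {
            "completeness": len(present) / n if n else 0.0,
            "validity": len(valid) / n if n else 0.0,
        }
    return report

# Invented sample records for a fixed-asset data set.
assets = [
    {"asset_id": "A1", "material": "PVC", "install_year": 1998},
    {"asset_id": "A2", "material": "", "install_year": 2035},   # gap + implausible year
    {"asset_id": "A3", "material": "iron", "install_year": 1972},
]

report = quality_score(
    assets,
    required_fields=["material", "install_year"],
    validators={"install_year": lambda y: 1850 <= y <= 2025},
)
```

A report of this shape makes the cost/benefit step in the list tractable: fields with low scores but high business value are the obvious candidates for cleansing effort.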
Optimal Solutions is currently working with a water utility to establish a unified data structure for all fixed assets. The importance of a common data structure and the need for that structure to take precedence over all others was recognised at the outset, as was the need to evaluate and reconcile other sources of asset data. The main business drivers are consistent regulatory reporting, justification for capital investment and reduced costs of preparing returns.
The single data structure provides a framework against which all asset data is held and, more importantly, allows relationships between system components to be captured in a rules base. Writing the rules directly into a database safeguards data quality and simplifies its maintenance. From a business perspective this approach ties together the
different components of the system needed to deliver a service to customers and provides the basis for effective performance management.
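The idea of writing the rules directly into the database can be sketched with any relational store. In the fragment below the tables, column names and the specific rules (valid asset types, positive diameters, every asset belonging to a known zone) are invented for illustration; the point is that the database itself, not application code, rejects data that breaks the rules base.

```python
# Minimal sketch of a rules base held in the database, using SQLite:
# relationships become foreign keys, validity rules become CHECK constraints.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential rules

conn.execute("CREATE TABLE zone (zone_id TEXT PRIMARY KEY)")
conn.execute("""
    CREATE TABLE asset (
        asset_id    TEXT PRIMARY KEY,
        zone_id     TEXT NOT NULL REFERENCES zone(zone_id),
        asset_type  TEXT NOT NULL CHECK (asset_type IN ('main', 'valve', 'pump')),
        diameter_mm INTEGER CHECK (diameter_mm > 0)
    )""")

conn.execute("INSERT INTO zone VALUES ('Z1')")
conn.execute("INSERT INTO asset VALUES ('A1', 'Z1', 'main', 150)")

# A record referencing an unknown zone is rejected by the database itself.
rejected = False
try:
    conn.execute("INSERT INTO asset VALUES ('A2', 'Z9', 'main', 100)")
except sqlite3.IntegrityError:
    rejected = True
```

Because the rules live with the data, every application and report sees the same constraints, which is what simplifies maintenance as the article suggests.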
The dispersed nature of assets and customers in a utility dictates key data sets be spatially referenced. This allows the geographic component of the business to be reflected in day-to-day work management activities. To improve customer service and control costs, utilities need routine access to structured asset data, in conjunction with operational data, to enable shortfalls in performance to be related back to an event, asset or a group of assets.
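Relating a shortfall back to an asset is, at its simplest, a spatial lookup. The sketch below assumes planar grid coordinates (easting/northing) and invented asset positions; a real utility system would use proper GIS tooling, but the principle of joining an operational event to the nearest spatially referenced asset is the same.

```python
# Illustrative spatial lookup: match an operational event (e.g. a burst
# report) to the nearest asset using planar grid coordinates.
import math

# Invented asset locations as (easting, northing) pairs.
assets = {
    "A1": (532100.0, 181250.0),
    "A2": (532400.0, 181900.0),
    "A3": (533050.0, 180700.0),
}

def nearest_asset(event_xy, assets):
    """Return the ID of the asset closest to the event location."""
    ex, ey = event_xy
    return min(assets,
               key=lambda a: math.hypot(assets[a][0] - ex, assets[a][1] - ey))

burst = (532150.0, 181300.0)  # reported event location
closest = nearest_asset(burst, assets)
```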
One of the main factors undermining performance measurement in any organisation is inconsistency in source data, typified by the ever-present problem of data duplication. Database administrators (DBAs) know nothing damages confidence in a database more than inconsistency. Two copies of the same data are worse than a single incorrect record, because once the copies drift apart there is no way to tell which is authoritative. The remedy is to designate one authoritative copy and reconcile or remove the rest. The same is true of multiple databases containing the ‘same’ data.
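A common way to resolve the duplication problem is to rank sources by authority and keep a single record per asset. The source names and precedence order below are assumptions made for illustration, not a description of any particular utility's systems.

```python
# Hedged sketch of deduplication by source precedence: where the 'same'
# asset appears in several sources, keep the record from the most
# authoritative one. Lower precedence number = more authoritative.
PRECEDENCE = {"asset_register": 0, "gis_export": 1, "spreadsheet": 2}

def deduplicate(records):
    """Keep one record per asset_id, preferring the most authoritative source."""
    best = {}
    for r in records:
        key = r["asset_id"]
        if key not in best or PRECEDENCE[r["source"]] < PRECEDENCE[best[key]["source"]]:
            best[key] = r
    return list(best.values())

records = [
    {"asset_id": "A1", "source": "spreadsheet", "material": "PVC"},
    {"asset_id": "A1", "source": "asset_register", "material": "MDPE"},
    {"asset_id": "A2", "source": "gis_export", "material": "iron"},
]

unique = deduplicate(records)
```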
In the aforementioned utility project all asset-related data sets were evaluated and either re-engineered to fit the unified structure, migrated into the new structure or phased out. The process of safeguarding and improving data quality needs to be viewed as a long-term activity, and any effort expended needs to ensure fitness for purpose. In the case of utilities, some performance measures require long-term data streams combined with sophisticated modelling techniques (capital maintenance is a case in point), whereas others can be readily abstracted from established systems. Any investment in data needs to reflect its value to the business.
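The evaluate-then-migrate step can be sketched as a field mapping with an explicit reject path, so that records that cannot be reconciled are set aside for review rather than silently dropped. The legacy field names and the mapping below are invented for illustration.

```python
# Hypothetical migration sketch: map legacy records onto a unified
# structure; anything with an unmappable field goes to a review queue.
FIELD_MAP = {"id": "asset_id", "mat": "material", "dia": "diameter_mm"}

def migrate(legacy_records):
    """Split legacy records into (migrated, rejected-for-review) lists."""
    migrated, rejected = [], []
    for rec in legacy_records:
        try:
            migrated.append({FIELD_MAP[k]: v for k, v in rec.items()})
        except KeyError:
            rejected.append(rec)  # field with no mapping: needs manual review
    return migrated, rejected

legacy = [
    {"id": "A1", "mat": "PVC", "dia": 150},
    {"id": "A2", "mat": "iron", "condition": "poor"},  # 'condition' has no mapping
]

migrated, rejected = migrate(legacy)
```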
Many performance measures will evolve over time to become more than a simple report and will become tightly integrated with core business processes. Improvements to data quality cannot be seen as a one-off investment; they are best viewed as a stand-alone business process requiring routine investment to ensure they continue to reflect the performance of the core processes.
The human aspects of performance measurement should not be overlooked since they have a major impact on the degree of success or failure of such systems. Based on recent experience at Optimal Solutions, the following were found to be critical to success:
- identify and segment the target audience according to key characteristics,
- communicate frequently and encourage participants to contribute,
- ensure all senior people are publicly enthusiastic,
- introduce new functionality/features in small increments,
- maintain users’ support through frequent, routine engagement.
A critical step in measuring performance is to define the target audiences, since each will have different characteristics and require information relevant to their sphere of operation. It is vital to acknowledge the different needs of users and segment them according to this need, which allows appropriate reports and software functionality (if appropriate) to be developed.
User involvement is best done as a routine activity (say monthly) to report progress, introduce new ideas, resolve data issues and allow hands-on interaction with software solutions. Keeping each change small means mistakes are smaller and easier to handle, so there is less risk of a high-profile failure.
It is also important to emphasise the value of improved data quality and to manage expectations. As is always the case, it is better to under-promise and over-deliver. For most water utilities the next steps are quite clear and will be driven by the results of the Periodic Review. In the longer term the key data sets, such as assets, customers and finances, will be subject to greater scrutiny as utilities adopt more sophisticated methods for measuring business performance. This will require more structure and consistency in key data sets and a review of existing data-related processes.
The benefits of taking a corporate approach to managing data are significant. They include the possibility of a continuous rolling asset management plan, capable of providing accurate and reliable information for all decision-makers at any stage in the regulatory cycle. This, however, requires a long-term commitment to addressing data-related issues.
Experience shows investment in the ‘back-end’ – data structure, database design, data assessment, data cleansing and data migration – yields the greatest returns in the long term. Only once these underlying data issues have been tackled and a corporate view of key data sets is available can performance improvements be measured with confidence.
© Faversham House Ltd 2022 edie news articles may be copied or forwarded for individual use only. No other reproduction or distribution is permitted without prior written consent.