The saying “what gets measured gets done” illustrates the importance of developing a process for measuring performance. Mike Sopp describes how to go about benchmarking health and safety.

As the UK Audit Commission emphasises, “if an organisation does not measure what it values, it will end up valuing what can be measured”.

Measuring health and safety performance is a key element of the widely known Plan-Do-Check-Act management cycle. The Health and Safety Executive (HSE) states that measuring performance “will give you the confidence that you are doing enough to keep on top of health and safety and maybe show you how you could do things better in the future”.

But what do you measure against? One technique is to benchmark against good health and safety practices in other organisations. In 2015, the European Agency for Safety and Health at Work (EU-OSHA) published a report on research that aimed to assess the benefits and limitations of benchmarking, Review of Successful Occupational Safety and Health Benchmarking Initiatives.

Defining benchmarking

The maxim “you can’t manage what you can’t measure” is as applicable to health and safety as it is to any other business management function.

The usual measures of current health and safety performance are internal leading (input) and lagging (output) indicators, but external measures can also be used, eg by benchmarking against similar organisations.
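As a minimal sketch of the distinction, the short Python example below calculates one lagging indicator (a lost-time injury frequency rate per million hours worked) and one leading indicator (planned inspections completed), and compares the former against a sector benchmark. The figures, names and choice of formula are illustrative assumptions, not taken from HSE or EU-OSHA material.

# Illustrative only: invented figures and a commonly used lost-time injury
# frequency rate (incidents per million hours worked).

def ltifr(lost_time_incidents: int, hours_worked: float) -> float:
    """Lagging indicator: lost-time injuries per million hours worked."""
    return lost_time_incidents / hours_worked * 1_000_000

def inspection_completion(completed: int, planned: int) -> float:
    """Leading indicator: percentage of planned inspections completed."""
    return completed / planned * 100

# Internal measures (hypothetical figures)
our_rate = ltifr(lost_time_incidents=4, hours_worked=1_250_000)
our_inspections = inspection_completion(completed=44, planned=48)

# External measure: a hypothetical sector figure obtained through a benchmarking scheme
sector_rate = 2.1

print(f"LTIFR {our_rate:.2f} vs sector benchmark {sector_rate:.2f}")
print(f"Planned inspections completed: {our_inspections:.0f}%")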

A benchmark is defined as “a standard or point of reference against which things may be compared”, while benchmarking can be defined as the “evaluation (of something) by comparison with a standard”.

From a health and safety perspective, the EU-OSHA defines benchmarking as “a planned process by which an organisation compares its health and safety processes and performance with others to learn how to reduce accidents and ill health, improve compliance with health and safety law and/or cut compliance costs”.

In other words, benchmarking can be used as a tool to improve health and safety performance: by looking at comparable organisations (or systems) and, on the basis of the insight gained, redesigning and/or changing current practice.

It is generally accepted that there are three types of benchmarking, which can be summarised as follows.

  1. Strategic: a comparison of the strategic choices, decisions and dispositions made by other organisations.

  2. Performance: the comparison of key indicators of performance with the similar performance indicators of others.

  3. Process: the comparison of methods and practices of performing business processes.

Report findings

The EU-OSHA report reviewed some 24 benchmarking schemes across Europe.

It concluded that benchmarking can bring benefits to an organisation, with calibration of its own performance against the market being “a strong motivating factor” to join a scheme.

Previous HSE guidance stated that benchmarking may:

  • improve organisational reputation with stakeholders

  • be cost effective, in terms of avoiding reinventing the wheel

  • improve cost effectiveness in managing risks

  • improve overall health and safety management.

Interestingly, the review found that there are a wide variety of factors that influence the success of schemes but that “there is no one factor that all respondents agree on as critical to the success of their scheme”. It concluded that “the ability to use information outputs to facilitate change was central to the success of a scheme”.

Certainly, data requirements emerged as influential, particularly with regard to participation and membership. The report concluded that benchmarking schemes with requirements to collect performance data “are less attractive to members than those involving the sharing of good OSH practice” (that is, process data).

Schemes that were most successful often involved the sharing of process documents and policies, particularly in electronic format for ease of access.

Conversely, the sharing of sensitive, quantitative data such as accident or incident rates, particularly with competitors, was seen as off-putting. Schemes that involve the collection of large amounts of data were perceived to be more of a research project than a meaningful way of sharing knowledge and good practice.

The issue of whether benchmarking should use the phrase “good practice” or “best practice” was also explored. The report highlighted that a message of good practice rather than best practice was often more beneficial “as it was seen as offering guidance rather than imposing prescriptive procedures” and that “managers then feel they have greater ownership of the resultant OSH processes”.

The report identified a series of barriers that prevented the schemes examined from being successful, the most notable of which were industry support and the usefulness and/or quality of the data provided.

It is worth noting, in respect of the latter point, that HSE’s own guidance on performance measurement states: “because performance measures should be derived principally to meet an internal need, there will be a limit to the number which can be used meaningfully from organisation to organisation (ie for external benchmarking purposes)”.

Benchmarking best practice

The EU-OSHA report brings together the emergent themes and issues that organisations should consider when developing a benchmarking scheme.

Annex A to the report contains a “practical guide to setting up a successful OSH benchmarking network”.

As a first step, it suggests that interest should be gauged through existing networks, at low cost and relatively informally, to determine whether a scheme is viable. There should be a central point of contact for the scheme, preferably a trusted “neutral broker” with no self-interest, who can engender confidence in the scheme.

A clear vision of the scheme’s purpose and tangible outcomes is seen as essential to getting buy-in to the scheme. The report suggests a “radical slogan needs to be carefully thought through; the wrong one can alienate potential members, particularly if it seems unrealistic in the context of their sector or market or the wider economic context”. Other essential criteria include getting:

  • an initial membership of organisations committed to health and safety improvement

  • a mixture of performers at different levels so as to maximise the opportunities for peer learning and support

  • financial support to develop free-to-end-user initiatives that are more appealing to potential members.

Deciding on the information required is essential to any scheme’s success. As already highlighted, schemes using process data rather than outcome data are deemed to be more successful, but if outcome data is required, it is suggested that “anonymous reporting” could provide a means of addressing concerns over data sharing.

Whatever data is used, clarity is needed regarding its use, whether it will be anonymised, and the extent to which it will be shared with other members or made public.
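As a sketch of how “anonymous reporting” of outcome data might work in practice, the Python example below replaces member names with opaque identifiers before incident rates are pooled, so each member can locate its own figure without identifying others. The member names, figures and hashing approach are assumptions made for illustration and are not part of the EU-OSHA report.

# Illustrative anonymised pooling of outcome data (all details hypothetical)
import hashlib

def anonymise(member_name: str, salt: str = "scheme-secret") -> str:
    """Replace a member name with a short, non-reversible identifier."""
    return hashlib.sha256((salt + member_name).encode()).hexdigest()[:8]

# Hypothetical member submissions: (name, lost-time incidents, hours worked)
submissions = [
    ("Acme Logistics", 6, 900_000),
    ("Bravo Construction", 3, 1_100_000),
    ("Carter Facilities", 9, 1_400_000),
]

# Pool anonymised rates so members can compare without identifying each other
shared = sorted(
    (anonymise(name), incidents / hours * 1_000_000)
    for name, incidents, hours in submissions
)

for member_id, rate in shared:
    print(f"Member {member_id}: {rate:.2f} lost-time incidents per million hours")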

In terms of practicalities of information-sharing activities, the report concluded that “there was a widely held view among research participants that involving members in personal contact within real-world environments is a crucial success factor” but that “online forms of communication play an important role”.

It suggests that newsletters and other online resources can provide an efficient means of updating members and sharing information, while seminars and meetings also provide valuable face-to-face contact and allow large numbers of individuals to attend.

Sustainability of a scheme and keeping momentum are essential if long-term benefits for members are to be realised. Scheduling activities at regular intervals will set a “purposeful rhythm, which helps maintain momentum”.

Last reviewed 6 May 2016