Last reviewed 21 May 2015

Thoreya Swage looks at the types of errors that lead to patient safety incidents and how to try to avoid them by creating a “risk aware” culture.

Over a million patients are treated successfully in the NHS every day. However, there are occasions where, despite the best efforts of healthcare professionals, treatment can go wrong and patients are harmed by the intervention. These are called “patient safety incidents”.

“A patient safety incident is defined as any unintended or unexpected incident which could have or did lead to harm for one or more patients receiving NHS-funded care and can impact on an organisation's ability to deliver ongoing healthcare.” (NHS England)

Clinical governance is a statutory duty that has been in place since 1999, whereby all organisations providing NHS care are required to ensure that high quality care is delivered to their patients. It is a set of processes which are designed to reduce clinical risk and to improve the quality of care through evidence-based practice and guidelines, and through training and education.

In recent years, the main focus of clinical governance has been on patient safety.

Improving patient safety

It is important to be aware that improving patient safety involves more than the tangible policies and procedures and learning about the latest drugs and interventions. It is also about effective team working and learning from mistakes.

Much of this learning arises from understanding the human factors involved.

Human factors are just beginning to be recognised as a major influence in the delivery of safer care within the NHS. They have, however, long been a feature of aviation flight safety and many lessons have been learnt from this industry which could be applied to healthcare.

Causes of accidents

There is never just one cause of any accident. There are causal factors which may already be recognised, and for which staff may have an unofficial "work-around", and there are unrecognised dangers. There are failures of the system itself and, of course, the ever-present, unavoidable "human error".

Essentially, there are two types of error, as follows.

  1. Latent failures: failures of policies, decisions and culture that are an everyday part of organisational life (resource allocation, staff training, documentation provision, management, etc).

  2. Active failures: errors of front-line operators (eg doctors, nurses, allied health professionals and administrative staff).

Since it is not possible to stop humans making errors, it is necessary to analyse the circumstances and environmental factors that lead them to make mistakes, to prevent those mistakes that can be prevented, and to tailor systems so that they tolerate mistakes without endangering patient safety.

Creating a risk-conscious or risk-aware culture

It is essential to create a risk-conscious culture, in which all staff are actively encouraged to speak up about safety concerns and in which their input is valued. If this “speaking up” does not occur, then the single most valuable source of information will be cut off at source. Only visible threats are manageable and therefore potentially avoidable.

It is therefore vital to develop a culture in the surgery in which people are aware of risk; this is the basis on which high-quality care can flourish, and it depends on a supportive and creative environment.

Error chains

An error chain is a set of circumstances (human error, system fault and catalyst event) which left unchecked or unrecognised can lead to an adverse event.

The two types of error (latent and active) illustrated below coincide with a catalyst event to produce an unsafe situation. Such a situation can be managed, provided those involved are aware that it exists. However, if they suffer a loss of awareness (see below), one slip, the "Final Error" shown in the diagram, will produce an adverse event or patient safety incident.

[Figure: Error chain]

Situational awareness and mental models

“Situational Awareness” (a phrase borrowed from aviation) is defined as an accurate mental model of one’s environment, including surroundings, other people, the task being performed and how it is changing with time.

Building situational awareness is not difficult; it happens automatically in everyday situations. What people may not necessarily know is whether their mental model is accurate. The human brain tends to accept inputs that reinforce its mental model and to discard those that contradict it, to the extent that, when irrefutable proof arrives that the model is incorrect, it is no longer possible to function effectively until a new model has been constructed and situational awareness has been re-established.

For example, a clinician may assume that a diagnosis made for a patient in the past is correct, when it may in fact have been based on a dubious set of symptoms that were never properly confirmed or investigated. This can easily lead to the prescribing of drugs that are not suitable for the patient in question. Furthermore, this mental model will be reinforced as healthcare professionals attempt to force new data to fit it, since the natural human tendency is to structure information and make inferences from it.

Once a doctor has come to a diagnosis, it is very difficult to shake them out of it until information to the contrary becomes overwhelming.

To ensure an accurate mental model of one's environment, and so maintain situational awareness, it is important to:

  • gather as much data as possible from all possible sources before making an inference

  • not rush into a decision (rapid decisions are rarely necessary)

  • consider all possible interpretations of the data, including unlikely ones

  • once embarked on a course of action, keep taking stock of the situation

  • ask: does the hypothesis still fit the data as the event progresses?

  • ask: how can one’s actions be tested to see if the hypothesis is accurate?

  • if incoming data do not fit the hypothesis, do not disregard them but reconsider the situation

  • ensure that you do not interpret the situation as you would like it to be, but as it is.

Warning signs (“Red Flags”)

For all staff to be aware of risks and to develop their situational awareness, it helps to recognise a few warning signs, or "red flags", that appear when a situation is rapidly getting out of control. These include confusion, broken communication, fixation, and missing steps or information.

Speaking up — PACE

In a clinical situation, when a member of staff notices one or more of the above, it is time to stop and raise a hand. The person concerned has stopped acting effectively as a member of the team and needs to be reminded of this and to have things clarified. Doing so is possibly one of the more difficult parts of the job for any team member, since he or she risks being singled out as lazy, incompetent, easily distracted or simply unintelligent. Raising concerns diplomatically is not easy, so another procedure borrowed from aviation, where it has been shown time and again to be an effective tool, is recommended: the "PACE" mnemonic.

Experience within the aviation industry has shown that, by raising one's concerns starting at the "Probe" level and escalating each time communication at the previous level does not achieve the desired result, it is possible to be heard, even at the most senior level, in a non-confrontational way. It is rare that the procedure needs to be taken to the extreme: usually, a senior team member who has also been trained in this technique will recognise when he or she is being "PACEd", stop, and re-examine the situation.

Examples of the PACE procedure
  • Probe: “I think I’ve lost the plot here — can you help me?”

  • Alert: “Are you sure — what do the notes say?”

  • Concern: “I’m very worried about this procedure — can we please have a recap?”

  • Emergency: “Sorry, but I think you’re WRONG. If you aren’t, I’ll happily apologise later, but you need to STOP NOW and discuss this.”

It is important for the individual who is being PACEd not to feel threatened or under challenge, and to understand that the process is a safety mechanism to ensure the wellbeing of the patient concerned. There is nothing wrong with being PACEd; it is just effective and supportive team working at its best.

Team work and effectiveness

It goes without saying that effective team working and mutual support are essential in establishing and maintaining patient safety. In primary care, where team members tend to talk more readily to each other than in other healthcare settings, communication generally works well. The disadvantage, of course, is that such teams can appear to a newcomer to be a small clique or coterie of friends, into which it is difficult to break.

Regular short briefing sessions (10–15 minutes in length), probably once a week, at which all members of the team have the opportunity to provide input, help to enable effective team working and communication and to ensure that essential points are covered.

Sign up to Safety campaign

In June 2014, the Secretary of State for Health launched the “Sign up to Safety” campaign with the aim of improving patient safety in the NHS and making it the safest healthcare system in the world.

This three-year campaign asks organisations to commit to the following five safety pledges.

  1. Safety first: demonstrate the commitment to reduce avoidable harm by half by publicising locally developed goals and plans.

  2. Learning continuously: listen to feedback from patients and the public, and monitor regularly how safe services are in order to make the organisation more resilient to risks.

  3. Honesty: be open with people about progress on dealing with safety issues, and encourage staff to be honest with patients and their families if things go wrong.

  4. Working together: take a lead in encouraging learning collaboratively so that improvements can be implemented across all local services.

  5. Providing support: facilitate the understanding of why things go wrong and how to rectify them.

Three central commitments are required from organisations when signing up to the campaign.

  1. Explain the actions that will be undertaken in response to the five Sign up to Safety pledges, and publish this on the practice's website.

  2. Develop a Safety Improvement Plan which sets out how the organisation intends to reduce patient harm over the next three years.

  3. Monitor the progress made against the plan and publicise it.

By joining the Sign up to Safety campaign, practices will have access to virtual learning networks which will enable the sharing of knowledge and improvements in primary care.

National incident reporting

As with all quality systems, the processes cannot improve without effective feedback and learning from previous experience. This is even more important when dealing with human error and patient safety matters.

NHS England supports a national incident reporting system, the National Reporting and Learning System (NRLS), the purpose of which is to: gather information on adverse events and near misses; encourage a reporting culture that is supportive and constructive; disseminate findings and solutions to the NHS; and promote research on patient safety. A general practice e-form has been developed by NHS England to make it easy to report patient safety incidents to the NRLS.

As a practice's reporting culture matures, incident reporting rises. This should not necessarily be taken as a sign of worsening patient safety, but rather as an increasing level of awareness of the issues around safety.

Understanding and following the principles of human factors, and ensuring that all staff are trained appropriately, will help the practice develop into an organisation that delivers safe care for its patients.

Further information