

Managing risk – how safe are your systems?

27 March 2009


FIONA DALZIEL
MA(Hons) CIHM FIHM

Independent Consultant in
Practice Management

Fiona is an experienced primary care trainer and facilitator. She is the national RCGP QPA Adviser and has advised on both the original and the review of the Quality and Outcomes Framework of the 2004 GP contract.

Twenty percent of fatal accident inquiries (FAIs) concern sudden deaths or possible medical negligence.(1) In recent years, an FAI into the death of a female patient heard that she might still have been alive had she been diagnosed as suffering from a spinal infection. According to an expert in acute general medicine and emergency admissions, had the practice not lost a result through misfiling for about a month, the outcome could have been different.(2)

As practice managers, we are all aware of the importance of paying care and attention to our practice’s systems. We ensure that staff are trained in these systems when they join the practice, and we write down each procedure so that it can be referred to at any time (many practices now make such resources available through their practice intranet). We undertake significant event reviews when something goes wrong with a system, and we may decide to update the written procedure to record changes made and to retrain staff.

However, many of our systems carry such inherently high levels of risk to patient safety that we should look at finding ways of designing and analysing them that are proactive in recognising possible problems and that help us to act before something goes astray.

System defences and faults
Patients are better protected if we pay close attention to potential hazards in our systems. High-risk systems in the practice – eg, results, messages and repeat prescribing – deserve particular attention. The systems we design and run in the practice have “defences” built into them so that, if one part of the system goes wrong, then the information and ultimately the patient are protected from harm.

However, systems can also have faults built into them, which lie in wait for a chain of circumstances that give them the opportunity to show themselves. Often, these faults have been unwittingly designed into the system by a software supplier or the practice itself.

Practices may, of course, design systems that are virtually watertight, where harm would only occur to the patient if something extremely unlikely were to happen or if a team member deliberately ignored warnings such as onscreen alerts. But building safe systems is quite an art; a safe system may be so cumbersome and time-consuming that it takes a lot of resources to administer. Cumbersome, unworkable systems can encourage staff to miss out steps for the sake of expediency.

Medical defence organisations encourage practices, not unreasonably, to make sure that their systems are as watertight, in terms of risk of harm to patients, as possible. This makes perfect sense from their point of view. To advise practices to have systems that allow potentially harmful events to drive a coach and horses through a system’s defences would be in nobody’s interests. Patients would be put at risk and complaints and claims would rise, resulting in rising defence fees for GPs.

In the past, medical defence organisations have been known to advise practices that they should contact every patient on whom they have performed an investigation to inform them of the result, even if it is normal. While this would certainly go a long way towards making a results system watertight, it is simply not a cost that many practices would be happy to bear, arguing that it is more important to concentrate on abnormal results.

An accident waiting to happen
Let’s look at a significant event involving a results system.

A patient attended the GP. She was concerned that she might have a sexually transmitted infection (STI), as her boyfriend had been unfaithful. She was advised to attend the genitourinary medicine clinic but was not keen to do so. She made an appointment to come back for swabs.

Five days later, the patient had three swabs taken, including one for chlamydia. She was instructed to phone in for her results, which she did. Two results were back and were normal and she was informed of this. However, the chlamydia swab was not available. This came in a day or so later and was positive. The GP left instructions for reception that the patient was to collect a prescription, make a follow-up appointment with the GP and use condoms.

The patient attended four months later with an upper respiratory tract infection and was seen by the registrar. The registrar asked how she had got on with the antibiotics. It became clear that the patient had been unaware that a prescription had been left for her or that she had chlamydia.

The patient’s records showed that the script had never been collected and had been destroyed three months after it was written up.

Analysis of the event showed the following failings, among others, of the practice’s system defences in its results system:

  • There was no practice policy on how many swabs to send if screening for STIs.
  • The patient was not informed of how many results to expect and the number taken was not recorded on the computer.
  • The important result, when it arrived, was dated as the date the test was taken, not the date received – so staff thought it had already been communicated.
  • “Patient informed” was not clearly recorded.
  • It was not clear whether the patient was ever informed that there was a prescription to be collected.
  • There was no protocol for checking prescriptions before destruction.

This is a very typical example of a results system’s defences having been breached or, indeed, not having been designed well enough in the first place. The accident was inevitable; it was only a matter of time before it happened.

Other typical defence breaches relate to the following stages of a results system:

  • The practice knows a sample was sent, but not that a result came back.
  • Patients are asked to come for a test. They do not come, but nobody is aware.
  • Action to be taken is not clearly recorded.
  • Action recorded to be taken is not taken.
  • The test result is filed in the wrong patient’s records.
  • Confidentiality is breached when the result is given out.

Evidently, this list is not exhaustive. What action can practices take when designing and analysing their results systems to try to reduce patient risk? The steps outlined in Box 1 are worth considering.

[[FD_Box1]]
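As a concrete illustration of this kind of proactive checking, consider the first two breaches listed above: a sample sent with no result returned, and a patient asked to attend who never does. In principle, both can be picked up by a routine reconciliation of what the practice expects against what has actually happened. The sketch below is purely hypothetical; the data layout, field names and seven-day threshold are assumptions made for illustration, not features of any clinical system or recommendations from a defence organisation.

```python
# Illustrative sketch only: flag samples sent to the lab with no result
# received within a chosen interval. Field names and the 7-day threshold
# are assumptions for this example, not part of any clinical system.
from datetime import date, timedelta

OVERDUE_AFTER = timedelta(days=7)

def overdue_samples(samples, today):
    """Return samples with no result received within the overdue interval."""
    return [
        s for s in samples
        if s.get("result_received") is None
        and today - s["sent_to_lab"] > OVERDUE_AFTER
    ]

# Example: the first sample would be flagged for chasing, the second not.
samples = [
    {"patient_id": "A123", "test": "chlamydia swab",
     "sent_to_lab": date(2009, 3, 2), "result_received": None},
    {"patient_id": "B456", "test": "full blood count",
     "sent_to_lab": date(2009, 3, 20), "result_received": date(2009, 3, 23)},
]
print(overdue_samples(samples, today=date(2009, 3, 27)))
```

The same idea applies to patients who were asked to attend for a test but never did: a periodic search for requests with no corresponding sample would surface them before they are forgotten.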

Investing to keep the system safe
When planning or reviewing a system, make sure you involve as many team members as possible in discussing the risk analysis described above. Bear in mind that defences need to be proportionate to the risk to patients.

The incorporation of defences into a system has a cost in time and effort. The cost of the defences needs to be balanced against the cost of something going wrong. As we have identified, a results system is high risk and some practices prioritise risk reduction in the system by putting in specific defences that, although costly in resource, do keep the system safe.

For many practices, this takes the form of logging every step in the system. It is possible to log the following:

  • The patient has been asked to make an appointment for a sample to be taken.
  • The patient has attended to give the sample.
  • The sample has been sent to the lab.
  • The result has been received.
  • The result has been read.
  • The responsible person has been identified.
  • Action has been taken.

Remember to train everyone in the new system and raise everyone’s awareness of their contribution to patient safety. You won’t stop accidents from happening. But at least you will know that your results system is as safe as you have made it!

References
1. Anderson S, Leitch S, Warner S. Public Interest and Private Grief: A Study of Fatal Accident Inquiries in Scotland. Edinburgh: The Scottish Office; 1998. Available from: https://www.scotland.gov.uk/Publications/1998/12/b596454e-4698-4b30-ad50…
2. Dr David Rubinstein, Director of Clinical Studies at Trinity Hall, Cambridge, quoted in the Aberdeen Press and Journal.