James Reason Lecture

Professor James Reason, University of Manchester

Problems and Perils of Prescription Medicines
Thursday 22 May 2003
Royal College of Physicians

Stumbling at every step

In the last five or six years something remarkable has happened in medicine and healthcare: a kind of glasnost, and error has come out of the closet. But there hasn't been a big bang or catastrophe. There has been the Bristol Royal Infirmary inquiry, but mostly this has been brought about by a flurry of high-level reports, the Chief Medical Officer's report An Organisation with a Memory and various other documents, which raised error very high on the agenda. The great pioneers say we should look to aviation as a model for safety in healthcare, and indeed there are many good lessons to be learned.

Aviation is predicated on the assumption that people screw up. You (healthcare professionals), on the other hand, are extensively educated to get it right, and so you don't have a culture in which you readily share the notion of error. So this is something of a sea change: at almost every meeting that crops up we now hear some discussion of error.

Plane

My original title was "Stumbling at every step". I wanted to start with the notion of error management.

A lot of the past ways of managing error have been fashion-driven and atheoretical. I want to give you a principled view of how one might go about it. Errors can't be eliminated, but they can be managed. That isn't something that springs to the lips of most managers; they see error as some sort of divine curse. Error is extremely adaptive and comes from cognitive roots that are the very basis of human behaviour. Our ability to delegate our actions to non-intentional or automatic control is essential for normal life, but it also means actions can take on a life of their own, without any intention. One needs to understand the nature and variety of error before one can manage it.

Error is not random; it does not come out of the blue, and it is largely predictable. You can predict it using the principle of under-specification. In order to be performed correctly, any cognitive action needs to be specified in a number of ways. Where there is under-specification, which takes a variety of forms, a failure to attend at some particular point in an action sequence, degraded sensory data, forgetting something, or incomplete knowledge, the cognitive system does something very clever: it adopts the response that has been most useful in that situation. It is very contextual, and it hinges upon frequency and familiarity.

What it does, then, is fall back on these well-used, frequent responses, and they are useful. So if I ask who said "the lamps are going out all over Europe and we shall not see them lit again in our lifetime", 90% of you will say Winston Churchill. You will say it because you link the quotation subconsciously with some catastrophic event in history and, within that limited context, with the statesman who was the gabbiest politician on the eve of a world war. Clearly it is Winston Churchill. It wasn't. It was Edward Grey, on the eve of the First World War. But the error you make is an entirely adaptive and sensible one. It has nothing to do with logic; it is to do with frequency and matching frequency to context. So I want you just to keep that in mind.

The wrong guy

I think one of the ways people have started to classify errors, and a necessary one, is to think about the levels at which humans perform. In trying to explain human behaviour we need two things: what is going on between people's ears, and the situation, or context, in which the action is being performed. If you think about how we control our actions, it is a continuum, from the very automatic and habitual, where we don't need to give any conscious attention at all, to the laborious, limited, sequential conscious process.

Think about any situation: most situations, in your bathrooms or your offices, are routine; they follow a regular pattern, and sometimes that pattern is broken. It is usually broken in a problematic way; you can define a problem as something that breaks your routine expectation. There are two types of problem. One is the "trained-for" problem, for which you have spent decades building expertise, "if x, do y" kinds of rules in your head. Your expertise becomes more and more refined, so that you can discriminate the appropriate reactions more and more finely. Whereas students can tell you the rules, you have reached the paradox of expertise: you cannot articulate the rules any more, but you know what the pattern looks like.

The other type is where your rules run out and you come across an entirely new situation.

You have to think on your feet, and that is something you hate doing; it is not easy to do. Ironically, this is why we put humans in nuclear power stations, to deal with the things the machines cannot deal with, even though we are not good at it. But we are still the best, in the sense that there is no machine that can do it better.

In that context there are three levels of behaviour.

The skill-based level is the highest level of achievement.

We can classify errors as arising at each of these levels of performance. So we start with the skill-based level, where the action is planned correctly but your actions do not go as planned. Your plan is satisfactory, but you get into the bath with your socks on, or you say "thank you" to a coffee machine. These are to do with habit, "strong but wrong" responses: the error form is strong, habitual, but wrong, inappropriate to the circumstances. This often happens when you change your intention with regard to a routine action.

For example, you say "I'm getting fat, I'm not going to put sugar on my cereal", and then you automatically defy your intention by sprinkling sugar on it anyway. Rule-based errors are more interesting, and are common in healthcare. There is also a wide range of knowledge-based mistakes, because of the extraordinary uncertainty of your business. Uncertainty is something that we, the patients, don't like to think about. We don't like to think about how little you know, and you don't like to tell us how little you know. But compared with most other hazardous activities, medicine is an area where enormous uncertainty, I won't say ignorance, exists. There are frontiers that are still woolly.

But you can also discuss errors in this context: medication errors result in the misuse, the overuse, or the underuse of medication.

M6

Another way of expressing this picture of the three levels is to represent them like this. When you know what you are doing, you get slips and memory lapses, trips and fumbles. These three levels (skill, rule and knowledge) can co-exist. Think of driving on the M6. Your speed and directional control are under skill-based control; you just watch out of the corner of your eye, directing these actions without realising it. The rule-based level applies to the other road users: you are dodging them all the time, applying your rules.

Then you hear on the radio that somewhere north of Birmingham there is a major tailback, and you think, "Oh God", and you resort to the limited mental model that you have of the map of Staffordshire. Should I go through Stone? You wouldn't think to stop and look at the map. (laughter from audience)

But this is what we do: we work with limited, incomplete, erroneous models, and it is an extraordinarily error-provoking activity. So I wanted to point out that these three levels can co-exist. In the case of rule-based errors, three things can happen. You can have a perfectly good rule that is contraindicated, and you fail to notice the contraindication, because in every clinical situation (or life situation) indications and contraindications co-exist in the same problem space.

Or you can have a bad rule that has got you by on many occasions but is inappropriate on this occasion; or you can choose quite deliberately not to apply a good rule. And then of course there is the knowledge-based error, where you are struggling with the mental map of Staffordshire, or whatever. An actual example of a slip might be a physician who means to write 0.5mg but writes 5mg; a lapse might be a nurse delivering a medication late. Rule-based errors are interesting because the same error can occur by different routes.

A physician can apply the wrong rule when adjusting the dose of an aminoglycoside antibiotic for a patient with renal insufficiency, or a physician may simply not know that such a rule is required. The crucial point I want to make is that the same situations keep producing the same errors; the same types of error keep cropping up time and time again, even though quite different people are involved. That surely indicates we are dealing with error-prone circumstances rather than error-prone people. We are dealing with error traps.

And my goodness, if ever there was a conspicuous error trap it is the intrathecal administration of vincristine, now counting up to 13 or 14 instances. The same tragic consequences occur each time.

The first thing to do is to find the error traps, and to do that you need a database, some kind of error-collecting system. You need a reporting system; you need a near-miss reporting system. You need a memory. Without that you have no way of identifying where these errors occur.
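To make that idea concrete, here is a minimal sketch, not from the lecture itself, of how such an error-collecting system might surface error traps, recurring circumstances rather than recurring people. The field names, the three-level taxonomy and the threshold are illustrative assumptions only:

```python
# A minimal sketch of an error-collecting system that surfaces "error traps":
# situations that keep producing the same errors regardless of who is involved.
# Field names, taxonomy and threshold are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    SKILL = "skill"          # slips and lapses in routine action
    RULE = "rule"            # wrong rule applied, or good rule misapplied
    KNOWLEDGE = "knowledge"  # thinking on your feet with an incomplete model


@dataclass
class Report:
    context: str      # the situation, e.g. "intrathecal chemotherapy"
    level: Level      # the performance level at which the error arose
    near_miss: bool   # near misses matter as much as harm events


def find_error_traps(reports: list[Report], threshold: int = 3) -> list[str]:
    """Return contexts that recur often enough to suggest an error trap."""
    counts = Counter(r.context for r in reports)
    return [context for context, n in counts.items() if n >= threshold]


reports = [
    Report("intrathecal chemotherapy", Level.RULE, near_miss=False),
    Report("intrathecal chemotherapy", Level.RULE, near_miss=True),
    Report("intrathecal chemotherapy", Level.KNOWLEDGE, near_miss=True),
    Report("dose written as 5mg for 0.5mg", Level.SKILL, near_miss=True),
]
print(find_error_traps(reports))  # ['intrathecal chemotherapy']
```

The point of the sketch is simply that the unit of analysis is the situation, not the person: the tally is over contexts, so the same trap shows up however many different individuals fall into it.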

Photocopying

To give an example of a familiar error: what is the most likely omission you are going to make at a photocopier? From research we know that forgetting to remove the last page of the original from the photocopier is the most common error. That is no coincidence; there are all kinds of error-provoking factors. On every previous page, placing the next sheet cued you to remove the last one; for the final page that cue is gone. And you have achieved the goal of the task, you have the copies, although you have not completed the task itself. You are also at the end of the task, starting to think about the next activity.

You could not think consciously about every step of photocopying and remain sane. And the page is concealed: out of sight, out of mind. All these things pile up to make it an almost inevitable error. What can we do about it? Simply use reminders, and it is best to use your own. Good reminders have certain characteristics, but the point is that error can be managed.

Other errors are much more subtle. If this represents a series of defences and safeguards, fault lines slowly develop, and it is only when they develop in one particular area of weakness that you suddenly get the breakdown of the system. And I think this is exactly what happened; I am talking now about the Queen's Medical Centre case of January 2001, which is now in the public domain. If ever there was a situation in which the people involved added the final ingredient to a disaster-in-waiting that had been building for months beforehand, that was it. All the cracks in the system had often appeared for very good and very helpful reasons.

The cheese model shows how adverse events occur in complex, well-defended systems, not that healthcare is such a well-defended system. I mean, compared with aviation or nuclear power, which are like castles with defences in depth, in medicine it is often just you, your judgement and the patient. There is not actually very much between the surgeon's scalpel and the unwanted or untargeted nerve or blood vessel except that person's skill.

So it is much more of a person-oriented skill. However, we all know that vincristine is a well-known hazard. So we build defensive protocols, we create invisible barriers: we say, for example, that we will never deliver the cytotoxic syringes on the same day, and we will train our SHOs and junior doctors to understand the consequences, and we will never leave them in a situation where they are entirely on their own, unsupervised and ignorant, as they were on that day in Nottingham.

Cheese

But because the defences are like Swiss cheese, flawed, they have always got holes in them, and sometimes the holes align. Some holes exist because there are people at the sharp end who make errors or commit violations, and those holes are quite short-lived; in life this is much more dynamic than cheese. Other flaws are put in unwittingly by the designers, the managers or the maintainers, because they don't see all the barriers.

The other, much more manageable, area is latent conditions. I used to call them latent errors, but they are just latent conditions. They are ever-present, like resident pathogens in the body; they are always there. What people try to do is manage the last error. They focus on the last error; there is a huge song and dance about the last error. But errors are like mosquitoes: you swat one and another appears. The only way of dealing with the error problem is to go to the latent conditions, to the systemic swamps in which they breed. These can be various: lousy interfaces, poor ergonomics, bad labelling, hours of work that wouldn't be allowed for Victorian miners, poor defences, conflicting goals (there is always a productivity goal, which affects care), and training deficiencies, which were revealed so manifestly in the tragedy in Nottingham.

So, to conclude: how do we learn to live with error? We have to recognise, and this is something we all nod at and say is obvious, but it is not, that errors are crucially opportunities for learning. Organisations that are resilient to the slings and arrows of operational fortune have two characteristics. One, they have chronic unease: they always think today is going to be a bad day; they are ever vigilant, wary all the time. Two, they always take big lessons, not just local lessons, from past failures, asking how can we learn things that will make the system more resilient?

Then there is the obvious point about naming, blaming and shaming, which we know about. And finally there are designers, and designers the world over, although they are themselves human, do not actually seem to be aware of the limitations of human beings. So the kinds of terrible systems that we are still forced to operate within are all a consequence of inadequate awareness of the way people think.

Questions from the floor:

Do you think the health service is too big to manage errors effectively?

I've been to a number of talks where I have seen the organograms go up, and I have never seen anything quite so mind-blowingly interactive, complex and incomprehensible. Yes, the system is hugely complex, but again, most of the necessary management needs to happen at, or close to, the local level.

Healthcare is performed by teams, and this kind of management can work on a team level. You have to try and change it at the team level, the local level.

In a large database of errors, what proportion are knowledge-, skill- and rule-based? Do you think that is useful data for managing errors?

A good question, and one that troubles me no end, because I do recognise the enormous difficulty people have in doing that. It does have some utility in the well-established reporting systems that exist. What is most interesting is what people were doing at the time the event occurred. There is no one good way of classifying errors; try to find one that can show you how to take remedial action.
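As a rough illustration of the breakdown the questioner asks about, here is a sketch that reuses the hypothetical `Report` and `Level` types from the earlier example; nothing here is from the lecture:

```python
# Proportion of reports at each performance level, reusing the illustrative
# Report/Level types and the `reports` list from the earlier sketch.
from collections import Counter


def level_proportions(reports: list[Report]) -> dict[str, float]:
    """Fraction of reports classified at each level (skill/rule/knowledge)."""
    counts = Counter(r.level for r in reports)
    total = sum(counts.values())
    return {level.value: n / total for level, n in counts.items()}


print(level_proportions(reports))
# e.g. {'rule': 0.5, 'knowledge': 0.25, 'skill': 0.25}
```

Such proportions are only as good as the classification behind them, which is exactly the difficulty acknowledged in the answer above.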

Healthcare does not occur in a vacuum; we also have a legal system. Given that there seems to be an increasing number of prosecutions of healthcare professionals for manslaughter, I wonder what your opinion is on whether the Crown Prosecution Service (CPS) needs to think about the no-blame culture, and what effect these prosecutions are having on safety?

The behaviour of the CPS is bizarre, completely bizarre. In a variety of cases I think the appeal courts have indicated that it has made the wrong judgement. As to a no-blame culture, I think that pure no-blame is simply not workable: there are egregious errors, we know there are egregious errors, so we have to acknowledge them. We have to negotiate what percentage of unsafe acts, errors, whatever, are truly unacceptable, definitely culpable. They have done this in aircraft maintenance, and it accounts for about 10% of the range of errors, so 90% can be reported without blame; but it requires very intensive negotiation between all levels of the system. You can't just lay a line down and say that is where it goes; you have to negotiate. Unless you have such a thing at that level of the system too, i.e. the CPS, there simply won't be any reporting. So trust is the first step on that road.