Meeting Professor James Reason CBE
was the realisation of something personally significant. Shell engaged
Professor Reason globally in 1988, and as a result the Tripod Delta Proactive
Approach to Safety (“Tripod”) was developed to understand and explain incident
causality. It revolutionised approaches to incident investigation and offered
explanations for the elephant in the room: human error. I worked for Shell
during 2002-2003 and was trained extensively in Tripod. Tripod is the basis
of the modern Incident Cause Analysis Method (ICAM).
Professor Reason authored Managing the Risks of Organizational
Accidents (1997), which includes the original Just Culture Model. The book had
a profound impact on my early career in safety (I started as a safety
professional in 1997), and that impact continues today.
The following are noteworthy
quotes (verbatim, or as close to verbatim as possible):
Foresight is a very tricky trap.
There aren’t enough trees in the rainforest to cover the
procedures needed.
You can’t measure safety by the absence of something.
A falling or zero LTIFR is the road to hell.
Personal injury rates do not measure process safety.
The real data of safety are stories.
You don’t make absent-minded slips crossing a busy road.
Business is the delicate balance between production and
protection.
There are always more errors putting things back together than
taking things apart, because there is more than one way to put something back together.
Extroverts are not more error-prone.
Young men violate, old women don’t.
People last less time than the task, hence the need for
procedures.
The opinions of close others are what shape potential violators
most.
Young lads violate when they are endorsed by other young lads.
Human beings are such good pattern-matchers.
The brain abhors a vacuum. We would rather align with the
wrong theory than choose no theory.
A virtuoso is not someone who never makes an error, but someone
who detects and recovers from the error.
100,000 Americans die annually because of medical and health
care mistakes. The moral to the story: don’t go to hospital.
Roads are wonderful laboratories for violations.
90% of errors are honest errors.
I (Professor Reason) never called the Swiss Cheese Model the
Swiss Cheese Model. Rob Lee from Canberra
first coined the phrase. I am indebted to him, because I love the name.
I (Professor Reason) regret inventing the term “Latent
Failures/Errors.” It should have been “Latent Conditions.” (e.g. the presence
of oxygen in a fire is not a failure, but a condition)
Lawyers are typically into billiard ball causality. It is better
to be probabilistic about causality.
The first question to be asked post-incident is, “Which defence
failed and why did it fail?”
Many political structures in the 1980s (the lean and mean age)
caused many accidents.
The same situations cause different errors in different people.
It’s not always easy to develop good lead indicators.
Air Canada
pilots had three words about their culture: “very boring, very
proceduralised, and very safe.”
After a textbook landing, an Air Canada pilot remarked, “That was a
very boring landing...” Professor Reason interjected, “No, it was a very
safe landing.”
Every time a human being touches something it’s likely to go
wrong.
Tripod was named after a three-legged dog.
Of the General Failure Types (GFT) in Tripod, Professor Reason
would no longer include “defences” or “error-enforcing conditions,” reducing
the GFTs to 9 in total.
When British Airways stopped using Tripod, they drifted back into
Jurassic Park.
There’s a big difference between safety management and error
management.
We need to think not in terms of human as hazard, but in terms
of human as hero.
What is needed along with ALARP (as low as reasonably practicable)
is ASSIB (and still stay in business).
Production pays for safety.
Air-traffic control is not about safety, but revenue... to push
more planes through the sky.
The safety war cannot be won like Waterloo was.
Entropy [the lack
of order or predictability; a gradual decline into disorder] gets you all.
Zero harm and target zero misunderstand the nature of the safety
war.
“History is a race between education and catastrophe.” (H.G.
Wells)
Culture should be CEO-proof: the culture should remain long after the
CEO has left. The trouble with most CEOs is that they leave after a year or
two.
The Nazi military in 1940 and 1941 was the very best High Reliability
Organisation (HRO); they knew what needed to be done without being told.
Human error is a system failure.
In reality, a good safety record is no such thing.
Bad outcomes, just like good outcomes, are a team effort.
Having no accidents is a cause for concern. (Commenting on the idea of
“chronic unease”)
Chernobyl operators
had never learned to be afraid.
There is a natural tendency for things to go wrong.
The pursuit of excellence is wrong. Excellence is only
manifested by the pursuit of the right kind of excellence.
Human Error Reduction Operation (HERO) is about harnessing what
we have, doing what we need to do, and doing what we should.
Safety is a dynamic non-event; we have to work very hard so
nothing will happen.
Either we will manage human error or it will manage us.
© 2012 S. J. Wickham.