One of these things is not like the other

Image credit: charmcitygavin via Flickr

Errors of similarity

I was recently carrying out a Human Error Analysis on a chemical plant and came across a practice that reminded me that for all our sophisticated view of error and cognition, sometimes we have to go back to basics. So, let’s go back to basics and talk about errors (‘slips’) provoked by similarity.

We’ve known for a long time that errors of execution are related to equipment deficiencies. One of the equipment deficiencies that can catch us out is that of similarity. If we store dissimilar chemicals in similar containers then we are more likely to pick the wrong one. Every day, on a process plant somewhere in the world, someone is charging the wrong additive or topping up with the wrong lubricant simply because they erred when they selected the container from stores. We still store dissimilar chemicals in identical ‘blue plastic drums’ and marvel when an experienced operator or technician picks the wrong one.

Here’s an example of two identical containers, stored in the same location, with strikingly similar labels.

Image credit: Tony Atkinson

Even if the chance of picking the wrong one is as low as one tenth of one percent, over the life of the plant we're likely to get it wrong and charge the wrong chemical more than once. As long as the consequence is 'only' a product quality issue, that's perhaps not so bad, but if it has a safety or environmental impact then we need to take more positive action to prevent mis-selection.
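The arithmetic behind this is worth making explicit. A minimal sketch, assuming (purely for illustration, these are not figures from any actual plant) one drum selection per day over a 30-year plant life:

```python
def p_at_least_one_error(p_single: float, n_selections: int) -> float:
    """Probability of at least one mis-selection in n independent selections."""
    return 1.0 - (1.0 - p_single) ** n_selections

# One tenth of one percent per selection, as in the text.
p = 0.001

# Illustrative assumption: one selection per day for 30 years.
n = 365 * 30

# Over that many selections, at least one error is near certain...
print(f"P(at least one error over {n} selections) = {p_at_least_one_error(p, n):.5f}")

# ...and the expected number of errors is comfortably 'more than once'.
print(f"Expected number of mis-selections: {p * n:.1f}")
```

The point is that a per-task error rate that sounds reassuringly small compounds relentlessly with repetition, which is why the severity of the consequence, not the per-task probability, should drive the decision to engineer the similarity out.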

One of our clients recently recognised exactly this issue and put some practical barriers in place to help with precisely this type of error. You can't always get round the 'blue drum' issue, so he chose to physically segregate two chemicals that must not be confused. They are stored in two barriered storage areas, and to gain access you obtain a key from a 'smart key safe'. Is this still prone to error? Yes, of course: a planning error or 'mistake' could still cause an issue, but the similarity of the container is taken out of the equation. (For those considering this type of solution, you still need to ensure that the correct drum is stored in the correct location, of course.)

Sometimes you don’t need to go to these lengths, even if you can afford to do so. Here’s a picture of two highly similar components that must not be mixed up. See if you can recognise them and post your answer in the comments section (if you’ve been on my human factors course you’ll already be familiar with them). The organisation that fits these has taken the simple expedient of changing the colour of one of the components to minimise the potential for error.


Image credit: Graham Scarborough


So far, so good. But we not only have to live with similarity and defend against the associated errors, sometimes we inadvertently (by not thinking things through) introduce similarity where it doesn’t need to exist. I’d like to go back to the condition that prompted me to write this blog.

As I said at the top of the blog, I was recently conducting a Human Error Analysis on a chemical plant; nothing unusual in that. The process required the regular sampling of batch reactors, which was carried out manually. The sample, which is somewhat toxic, is placed in a cup and carried to the laboratory for analysis. All pretty standard stuff so far. The thing that got me worried was the container that the process operators were using to hold and transport the sample. Here's a picture (actually two pictures). My question is: "can you tell which is the toxic sample and which is my cup of tea?"


Image credit: Tony Atkinson
Image credit: Tony Atkinson

The two photographs were taken less than 10 minutes apart. Is it likely that any specific two cups will be mixed up? Not really. Is it possible? Absolutely. As with our blue drum example, over the life of the plant and given the number of samples taken, the possibility becomes significant, especially when you consider the consequences.



About the author

Tony Atkinson

I lead the ABB Consulting Operational Human Factors team. I've spent over 30 years in the process industries, working in control rooms around the world, in the fields of ergonomics, control and alarm systems, control room design, and operational and cultural issues such as communications, competency and fatigue. I've been blogging on diverse topics that interest me in the widest sense of 'human factors', all of which share the same common element: the 'Mk.1 Human Being' and their unique limitations, abilities and behaviours. I'll discuss the technical and organisational issues that affect the safety and performance of process safety operators and technicians, and how this impacts control rooms and the wider plant. However, learning comes from many places, so you can expect entries about aviation, automotive, marine, healthcare, military and many other fields. Outside of work, I indulge in travel, food, wine and flying kites to keep myself moderately sane. Please feel free to post your comments on each post.