Ship of Fools

Errors in information, engineering and procedures
Comp.risks pointed me to this interesting story about the (somewhat embarrassing) grounding of a US Navy minesweeper.
"The U.S. Navy also revealed on Jan. 18 that the digital navigational chart in use by the Guardian misplaced the correct location of the reef by about eight nautical miles." [http://blogs.defensenews.com/intercepts/2013/01/trapped-u-s-navy-minesweeper-to-be-broken-up/#more-3853]
At first I was interested because it looked like another ‘sat nav’ issue, with people blindly following and trusting technology. However, that doesn’t seem to have been the case: the underlying data itself was at fault, and there was no obvious contradictory information available. The grounding would have happened with paper-based charts if they had been similarly incorrect.
The article had me thinking about how our decision-making processes are inextricably linked with the accuracy and appropriateness of the information available to us. This is highly relevant both to the process industries and to the automation and information industry that supports them. As we make our control rooms increasingly isolated from the process and from direct sources of information, everything the process operator sees and hears (he no longer feels, touches or smells the process) comes through the filter, or lens, of the engineer and the chosen technology.

Not only do we decide what the operator or technician sees or doesn’t see, but the relative importance, or salience, of the information is determined by how we choose to present it. Where information sits within the display hierarchy affects whether the operator notices or acts on it, as does the way we present even simple variable data. Think of the difference between a spot value and a trend, for instance, or between text and pictorial display of status.

Even where we get the presentation of a particular variable right, we can still influence its relative salience through the context in which it is displayed. An information-rich screen gives an individual variable a different prominence than an information-sparse one. Even the non-variable lines we draw to represent pipes, plant or control relationships imply a context and relationship between individual data items or groups of items. In effect, by the way we as engineers choose to present the data, we impose our mental model on the operator’s.
In the ‘good old days’ there were multiple ‘recovery paths’ available to the operator through diverse sources of information. If an instrument told him something misleading, his eyes and ears might see or hear something else, and he had a chance to take the right decision. Now we remove many of those alternatives by increasingly isolating the operator from his real environment: the process. This isolation is increasingly driven by safety regulation, and even legislation, following events like Piper Alpha and Texas City, where (quite rightly) ensuring that people are located in a safe place is considered paramount.
Thirty or forty years ago, the automation industry and the process industries collectively ‘lost the plot’ with process alarms. We allowed the low cost and clever functionality of computer systems to hypnotise us, and we created an explosion of alarms without thinking about the needs of the process operator. It’s only in the last decade that we have recognised this and started to reverse the trend towards ever more inoperable alarm systems.
I wonder whether we are in danger of making a similar mistake with our drive to increasingly isolate people from the process and plant equipment. We should be putting the needs of the operator at the heart of our information flow and interface design. We need to recognise the nature of, and risks associated with, this increasing isolation, and ensure that sufficient, robust and appropriate information is available for those critical decisions.