As it’s Friday, I was planning to write a light-hearted piece on safety critical communications, but that can wait for another time. The Rail Accident Report from the derailment of a Docklands Light Railway train in 2009 has recently arrived in my in-tray and I thought I’d write something on that.
As usual with incident or accident reports, there is some significant human factors learning.
If you skip down to the recommendations section, you might miss some important elements in the report regarding the display of information to the controllers. Check out paragraph 68 onwards for a look at some of the issues with the displays and the alarm systems.
Both the displays and the alarms allowed the controller to filter information regarding the current operational state. In the case of the graphics, the controller had the option to turn off levels of detail, including some important detail as to which block of track the train occupied (paragraph 77). Importantly, there was no guidance or procedural support for the controller regarding what to display and in which mode.
At the same time as looking at the overview graphic, the controller had the alarm screen in view. He had selected the view mode to display only high priority alarms. You can hardly blame him; in the hour leading up to the incident a total of 1500 alarms had been recorded. By selecting high priority only, he limited the number of alarms he had to deal with to a mere 520! The point position indicator alarms were not classified as high priority and therefore were not displayed. Again, he was blind to significant data.
There was no support for the controller in how to use the alarm filters, and no alarm response manual or other support on responding to alarms.
As with the graphics, if we allow multiple modes of data presentation, we need to ensure that the process remains operable and safe when information is withheld. In this case, filtering on priority was just too crude a technique.
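To make the point concrete, here is a minimal sketch (in Python, with entirely hypothetical alarm names and fields) of why a filter that cuts on priority alone is too crude: a safety-related alarm that happens to carry a lower priority simply vanishes from view. A filter that also respects a safety-related flag would keep it visible even in a "high priority only" mode.

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str            # hypothetical alarm identifier
    priority: str       # "high", "medium" or "low"
    safety_related: bool

def priority_filter(alarms, show):
    """Crude filter: display only the selected priority levels."""
    return [a for a in alarms if a.priority in show]

def safety_aware_filter(alarms, show):
    """Filters on priority, but never suppresses alarms
    flagged as safety-related."""
    return [a for a in alarms if a.priority in show or a.safety_related]

# Illustrative alarm list -- names and priorities are invented
alarms = [
    Alarm("TRACTION_POWER_DIP", "high", False),
    Alarm("POINT_POSITION_LOST", "medium", True),  # the kind of alarm hidden in this incident
    Alarm("DOOR_SENSOR_NOISE", "low", False),
]

# With the crude filter, the safety-related alarm disappears
visible = priority_filter(alarms, show={"high"})
assert all(not a.safety_related for a in visible)

# With the safety-aware filter, it stays on screen
visible = safety_aware_filter(alarms, show={"high"})
assert any(a.tag == "POINT_POSITION_LOST" for a in visible)
```

This is only an illustration of the principle, not how the DLR system worked; the design lesson is that any declutter mode needs a rule guaranteeing that safety-significant information cannot be filtered out.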
If we give our personnel options regarding the operation of the control system it is important that we train people properly on the consequences of any filters or display modes that they may use. ‘Declutter’ or filter modes of display are increasingly common as a way of managing large volumes of data which may confuse or swamp the operator. In many ways these types of solutions are an easy way out for the designer of the interface. It gives a superficial feeling of control, but unless the consequences of operating in the degraded mode are properly thought through, we risk compromising real situational awareness. In this case, real and valuable information was withheld from the controller by the system as a consequence of the application of data filters.
We shouldn’t assume that operators understand the operation and implications of the interfaces and tools we give them. We need to train them and ensure that they remain competent in this important area. That’s why we run an ‘alarms for operators’ course.
It might be worth looking at how your operators are using their interface and revisiting your alarm summary screens.
Those readers who have been on my Human Factors course will know how important the handover is between operators (in this case, controllers). Take a look at paragraph 95 for some commentary on the handovers and the workload pressure at the time. I’m surprised that this didn’t make the summary and recommendations sections of the report.
There’s a lot more in the report that merits attention. Positioning and design of indicators, use of procedures, communications, competence, and (as usual) management and culture all play their part. The report makes interesting reading.