Beyond the limitations of safety systems

Saturday, 01 April 2000


    To err is human – but the big challenge in accident prevention is to foster an environment in which people are prepared to admit their blunders, says James Reason*


    Most major industrial accidents provide a legacy of salient lessons. The question is whether they point companies and industries in the right direction, and whether companies see the relevance of the lessons to their own situations. For instance, last year's royal commission into the Longford accident focused a great deal on the role of safety systems. While valid and timely, the commission's observations stopped short of addressing the inherent limitations of the pure systems approach. What is more, it failed to take a long, hard look at the broader organisational factors that contributed to the accident. Consequently it tended to feed the widespread misconception – not just in Australia but in many other comparable countries – that systems somehow sit apart from culture. It is this belief that drives managers' over-reliance on systems on the one hand, and an insufficient understanding of, and emphasis on, workplace culture on the other.

    Managers believe, mistakenly, that compliance with rules and procedures can be achieved simply by the imposition of systems, while ignoring the crucial cultural dimension. Yet it is the latter that ultimately determines the success or failure of such systems. There is no disputing the importance of standard operating procedures and safety systems. They are, and will continue to be, a critical part of every workplace safety program. For example, a survey carried out in the US nuclear industry identified poor procedures as a factor in some 60 per cent of all human performance problems. The problem lies in the disproportionate reliance placed on such prescriptive procedures. Procedures are essentially feed-forward control devices: they prescribe actions in advance, which leaves them vulnerable to changing circumstances. Like all such control systems, they also struggle to deal with local variations. Many companies use systems and procedures as a means of achieving greater consistency of human action. Less variability, they argue, means less human unreliability – and it is the latter that is regarded as causing system failures and accidents.

    What they often fail to appreciate, however, is that human variability – in the form of moment-to-moment adjustments to changing events – is vitally important. In fact it is this quality, more than anything else, that preserves system safety in an uncertain and dynamic world. In other words, by striving to constrain human variability, organisations are in danger of undermining one of the system's most important safeguards. Good managers already know this. Most modern organisations strive to give their front-line people increasing discretion to deal with local issues, rather than using the discredited bureaucratic route of passing every decision up the tree. Safety is no different. Empowerment of front-line staff within clear boundaries or guidelines is the key. One of the major issues organisations face is the question of human variability. Conventional wisdom holds that an organisation's reliability depends significantly on the consistency and invariance of its activities. Now evidence is mounting against that orthodoxy. Increasingly, unvarying performance is being equated with an inability to cope with the unexpected.

    Traditional "efficient" organisations are seen as lacking the collective mindset necessary to detect and understand unpleasant surprises before they bring about serious consequences. By contrast, collective watchfulness allows organisations to cope better with unforeseen system failures. In such a culture, people are acutely aware that failures can occur at any time and in forms never before encountered. They are forever on the lookout for novel ways in which latent conditions can combine to beat or bypass the system. In the final analysis, what characterises such flexible cultures is the assumption that failures are not isolated events but are more likely to arise from a combination of causes. Consequently, the orientation is towards system reform rather than mere repair. There is, of course, a vast difference between the culture described above and one that treats control systems as immovable objects. A flexible safety culture is not only one in which there is a desire to learn and to use new local information and insights constructively; it is also one that is able to work with and within systems without having its creative capacities constrained by prescriptive controls.

    So what does it take to create such a flexible, informed culture? An informed culture here is one in which the whole organisation participates in the process of reporting errors and incidents. The key to achieving such a culture is "free lessons": near-misses from which management can learn a great deal, provided it has the necessary analytical resources. But resources are not the whole answer. The over-riding challenge is to foster an environment in which people are prepared to admit their blunders. It goes without saying that such behaviour is unlikely – even out of the question – in a culture where errors or violations are seen as moral issues warranting sanctions. Yet like the excessively punitive culture, the more recent "no blame" culture is neither desirable nor workable, since some actions genuinely do warrant sanction. The question that remains is where to draw the line between acceptable and unacceptable behaviour. I believe too much is made of this issue. The most important point is not where the line is drawn but how acceptable its position is to all members of the organisation.

    Without broad-based acceptance, there cannot be an environment in which reporting is an integral part of the safety culture. And it is only in such a culture that any system, even a world-class one, can make an optimal contribution to an organisation's pursuit of safety.

    * Professor James Reason is recognised as one of the world's leading experts on the nature of human error, at both the individual and organisational level. He is being brought to Australia for seminars this month by Melbourne-based OH&S advisers Zeal Consulting.

