Tuesday, March 29, 2005

Making the world a safer place

We'd all like the systems that protect our lives (both the computerized ones and the harder-to-see organizations of people, processes, and tools that make up the systems we use) to work well and never fail, whether they are the safety features in nuclear power plants or the clinical procedures and equipment in our hospitals. Yesterday's comp.risks carried two postings about failures in computerized medical systems, reminding us that many of our traditional beliefs about what causes failures in such complex systems aren't consistent with research.

In 1990, James Reason wrote the excellent and very readable Human Error, describing how errors arise in practice and how they can be reduced. Richard Cook and the Cognitive Technologies Laboratory have created a Web site with many short (and some longer) articles helpful to those designing such systems. Nine Steps to Move Forward from Error offers food for thought for anyone considering how to make improvements after a significant system failure has occurred.

What's that got to do with us? Some of us manage people who create or use such systems; we need to know what good, current research tells us. Sometimes the pressures we put on others create the very problems we're trying to avoid.

Some of us may design such systems. We especially need to know what the research is discovering, for we will make many decisions and recommendations that affect safety and performance.

Some of us may simply use such systems. Normally, we may not need to know much of this. When systems fail, though, we may become part of a concerned and vocal populace that cries out for action. Knowing what research suggests about dealing with the aftermath of such failures lets us advocate for actions that actually improve matters, not ones that may create even bigger problems in the future.

These concerns don't apply only to life-critical systems; we're also involved in business and organizational systems daily. The penalty for failure in those systems may not be someone's death, but it could involve people's livelihoods.

By the way, comp.risks is a good way to stay abreast of the risks associated with the development and use of computer systems.
