BL-3 and BL-4: Mistakes Will Always Be Made

The difference between smart students and not-so-smart students is that the not-so-smart students will screw up in predictable ways, while the smart ones will screw up in completely novel and unexpected ways (of course, we never screwed anything up…). With that in mind, this bit from a Government Accountability Office (‘GAO’) report about high-containment laboratories (those that work with BL-3 and BL-4 pathogens) is chilling (pdf; boldface mine):

For example, while investigating a power outage incident in its recently constructed BSL-4 laboratory, the CDC later determined that, some time earlier, a critical grounding cable buried in the ground outside the building had been cut by construction workers digging at an adjacent site. The cutting of the grounding cable, which had hitherto gone unnoticed by CDC facility managers, compromised the electrical system of the facility that housed the BSL-4 laboratory. Given that grounding cables were cut, it is apparent that the building’s integrity as it related to adjacent construction was not adequately supervised. CDC officials stated in 2009 that standard procedures under local building codes did not require monitoring of the integrity of the new BSL-4 facility’s electrical grounding. This incident highlighted the risks inherent in relying on local building codes to ensure the safety of high-containment laboratories, as there are no building codes and testing procedures specifically for those laboratories.

This isn’t to say that we shouldn’t do any BL-3 or BL-4 research, but we need to be much more rigorous about weighing the costs and benefits of this research. Mistakes will be made. It’s worth keeping this in mind.


2 Responses to BL-3 and BL-4: Mistakes Will Always Be Made

  1. dr2chase says:

    Have you read Normal Accidents by Charles Perrow? If not, do so immediately.

  2. kaleberg says:

    Most industrial plant safety managers have a different attitude. This is from the introduction to Trevor Kletz’s classic “What Went Wrong?”:

    “If an incident that happened in your plant is described, you may notice that one or two details have been changed. Sometimes this has been done to make it harder for people to tell where the incident occurred. Sometimes this has been done to make a complicated story simpler but without affecting the essential message. Sometimes – and this is the most likely reason – the incident did not happen in your plant at all. Another plant had a similar incident.”

    In other words, the typical safety manager assumes that his or her plant is the hellhole, and that surely the other guys aren’t having to deal with all the crap and screw-ups that they have to deal with. Remember, these are guys who come in to work after hours to deal with an Olympic-swimming-pool-sized gasoline spill out in the welding yard and say, “At least the spill wasn’t something really dangerous.” (The gasoline hazard area is rarely larger than the spill itself. Vinyl chloride or some other flashing liquid means you have already been screwed.) You get a similar attitude among IT security managers, who also have hellish jobs.
