Does this sound familiar?
Students and postdocs “do not feel empowered to address their concerns with others within the lab or with the faculty adviser. They also do not believe that they can move forward to effect positive safety changes without negative or punitive consequences … .”
How about this?
“Principal investigators operate autonomously, exercising significant authority over the research and the research personnel in their individual laboratories, and in some cases may regard good safety practices, such as inspections by outsiders or following established safety procedures, as a barrier to research progress and a violation of their academic freedom.”
Many—perhaps most—graduate students and postdocs can undoubtedly identify with these descriptions, which are taken from Safe Science, an intelligent and informative report published 31 July by the National Academies’ National Research Council (NRC).
The report calls itself a response to recent “serious and sometimes fatal” injuries to rank-and-file lab workers—not only students and postdocs but also technicians. These front-line lab workers face the greatest risk of injury from lax safety practices and know lab conditions best—yet they are often powerless to make their working lives safer.
As is usual for NRC reports, the committee that wrote this one is made up mostly of influential people—academic leaders, distinguished faculty members, and leading experts—so it’s not surprising that most of its conclusions and recommendations take a top-down, long-term view. Two of the recommendations, though, could be implemented fairly quickly and would empower those voiceless researchers who do the bulk of experimental work to do more to help ensure their own safety.
Learning from mistakes
The crux of the committee’s advice is that universities need to build robust and pervasive safety cultures. The concept of safety culture, the report explains, emerged as experts searched for the causes of the 1986 Chernobyl nuclear power plant disaster. Safety culture encompasses “the organizational context in which all actions pertinent to safety occur.” In the sort of “strong, positive” safety culture the committee believes universities should develop—and that already exists in major industrial research labs—people work safely “not because of a set of rules, but because of a commitment to safety throughout an organization” that integrates “safety as an essential element in the daily work of laboratory researchers.” Life in labs with a strong safety culture “supports the free exchange of safety information, emphasizes learning and improvement, and assigns greater importance to identifying and solving problems rather than placing blame.” In such labs, safety has “high importance … all the time, not just when it is convenient or does not threaten personal or institutional productivity goals.”
Such a vision is alien to many universities and would take a lot of time and effort to instill. But work could get underway promptly on a national system for confidential reporting of lab-safety incidents and near misses. The report lists several organizations it says “should work together to establish and maintain [such a] system, building on industry efforts, for centralizing the collection of information about and lessons learned from incidents and near misses in academic laboratories, and linking these data to the scientific literature” for use in safety research and training. “Department chairs and university leadership should incorporate the use of this system into their safety planning. Principal investigators should require their students to” use it.
A second recommendation that could increase lab workers’ ability to improve their own safety calls for “the researcher and principal investigator [to] incorporate hazard analysis into laboratory notebooks prior to experiments, integrate hazard analysis into the research process, and ensure that it is specific to the laboratory and research topic area.” Making such discussions of risk and mitigation routine would increase the attention paid to these crucial issues in many labs.
Here’s an idea that isn’t included in the report: Safety-minded students and postdocs should read Safe Science and bring its informative analysis and useful suggestions to the attention of their lab chiefs, department chairs, deans, and other university officials.
As Safe Science notes, the “specialized and insular structure and hierarchical nature of academic research can pose challenges to the development” of safety culture. Among the “most recalcitrant” of these is “the attitude, unfortunately often reinforced by principal investigators, that safety practices are time-wasting inhibitions to research productivity.” “Efforts must be found to convince such people that working safely enhances, rather than inhibits, research productivity.”
One method of persuasion that a number of well-regarded safety experts and reports consider necessary is missing from the recommendations: making a PI’s safety record a criterion for funding. “When negligent or cavalier treatment of laboratory safety regulations jeopardizes everybody’s ability to obtain funding, a powerful incentive is created to improve laboratory safety,” states another NRC publication, the 2011 revision of the widely used Prudent Practices in the Laboratory. The Safe Science committee, however, considers the use of funding as a stick to enforce lab safety to be controversial and prefers to use a “carrot” to drive change. Some observers, though, question whether carrots alone can overcome the many formidable barriers blocking change in academic safety culture that the report amply enumerates.