Ten years ago this month, on 16 January 2009, 23-year-old lab technician Sheharbano “Sheri” Sangji died of the burns she sustained when an ill-prepared and risky experiment went calamitously wrong. This totally preventable waste of a young life is the most significant and revealing thing I’ve witnessed in more than 15 years covering early-career science issues, with the catastrophe’s dispiriting aftermath a close second. That’s because no responsibility that universities and faculty members shoulder—not making discoveries, not publishing articles, not winning funding, and certainly not advancing their own careers—matches the moral weight of safeguarding the students, postdocs, technicians, and others whose labor makes all those other things possible.
Working safely in the face of potential danger demands unrelenting, systematic attention to the reality of risk—thinking in advance about the specific hazards inherent in particular tasks—and determination to find ways to both lessen the chance that something will go wrong and protect oneself when it does. This approach explains the strikingly positive safety record of U.S. commercial aviation. University science, however, has generally failed to internalize this way of thinking and convey it to students, observes Harry Elston, editor of the Journal of Chemical Health and Safety. “The reason always revolves around two excuses: ‘We don’t have time in the curriculum’ and ‘We’re not safety experts.’” This month, to honor Sheri’s life, we consider steps that institutions, organizations, and individuals can take to counter this mindset.
Follow the money
Sheri was working on a project funded by the National Institutes of Health (NIH). The agency could have made a powerful impact on labs across the nation, changing incentives in favor of better safety standards, had it taken action after the fire. Instead, NIH did nothing except continue to fund her lab chief, professor Patrick Harran of the University of California, Los Angeles, even though he faced felony charges and served probation in connection with the conditions that caused the death.
In fiscal years 2010 and 2011, according to NIH’s RePORTER grants database, the agency renewed an existing R01 grant, awarding Harran more than $300,000 annually. Felony charges were brought against Harran in December 2011. Then, in fiscal year 2014—the same year that Harran accepted responsibility for the conditions leading to Sheri’s death as part of the settlement of the legal case against him—he was awarded two new R01 grants totaling more than $900,000 a year. Those grants have been renewed every year since.
“A lot of very talented researchers go unfunded every year,” says Sheri’s sister Naveen Sangji, who is now completing her training as a trauma surgeon specializing in burns. “If you don’t fund him, there will be a hundred others who will not be endangering their workers and their researchers, their students and their lab technicians.”
The U.S. Department of Homeland Security (DHS), on the other hand, offers an example of how a funding agency can create pressure for positive change. After a preventable explosion that occurred during a DHS-funded project maimed a graduate student at Texas Tech University in Lubbock in 2010, the DHS Centers of Excellence began requiring funding recipients to submit and maintain research safety plans. In DHS’s Scientific Leadership Award program, the research safety plan counts for 10% of a proposed project’s score in the internal review phase of the selection process. The plans must, among other things, identify “possible research hazards” and ensure that all procedures “conform to generally accepted safety principles,” with “independent review by subject matter experts of the safety protocols and practices.” They must also guarantee “faculty oversight of student researchers” and “education and training to develop a culture of safety.”
DHS provides only a small portion of the funding that supports academic research, however, so its policy’s influence is necessarily limited. The example set by NIH—which accounts for more than half of U.S. academic research funds—unfortunately speaks much louder.
The “onus is on the scientific community to … demand change and work for it within their own professional associations and their societies and their universities,” Naveen says. For example, Harran’s selection as a AAAS Fellow in 2015 sparked controversy and protests. (AAAS publishes Science Careers.) In September of last year, AAAS announced a new policy for revoking a “AAAS Fellow’s lifetime honor” for “proven scientific misconduct or serious breaches of professional ethics.” This “would apply to any case of serious scientific misconduct, such as, for example, the Harran case,” AAAS CEO Rush Holt told Science Careers by email.
The American Chemical Society goes even further. Having declared safety a core professional value, it includes a question about the nominee’s safety record in all nominations for its national honors.
Efforts and policies like these are a step in the right direction. As Naveen and many other safety advocates believe, serious safety incidents should permanently stain and, if appropriate, even end responsible scientists’ careers.
Universities across the country have also taken a variety of other steps to strengthen their safety cultures and heighten awareness and knowledge. A small sample of replicable ideas that I have heard of lately: a computerized registration system that connects both students and faculty members to mandatory safety training on the particular risks presented in their laboratory courses; “soft skills” training that helps environmental health and safety staffers work collaboratively with researchers; “Safe + Sound Week” observances that focus attention on an institution’s commitment to safety, for instance by awarding prizes to people “caught in the act” of working safely; and posters and safety fairs with attractive themes that offer chances for informal learning.
Student safety initiatives provide leadership opportunities while fostering engagement with safety. A group of Northwestern University students, for example, was inspired to start the Research Safety Student Initiative after visiting the Dow Lab Safety Academy. The volunteer organization carries out projects that range from voluntary lab walk-throughs that identify the safest research groups to ice cream socials where students can discuss safety topics with experts.
We can’t know how many disasters haven’t happened because people improved their safety practices—though, as Naveen says, “if we have managed to prevent even one additional incident,” the years of advocacy and effort will have been worthwhile. Reliable statistics on academic lab safety incidents have never been collected, so there is no rigorous way to track progress.
But there is anecdotal evidence that limbs—and maybe even lives—have indeed been saved. To take one example, in 2012, chemistry professor Ian Tonks of the University of Minnesota in Minneapolis experienced a “serious and, frankly, terrifying” mishap, he recalled recently. But thanks to “years of safety emphasis” that followed Sheri’s death, he continued, it ended noncatastrophically. “[M]y reaction was scaled down, I wore personal protective equipment, and emergency plans were in place and followed.”
Tonks admits that, given his training and experience, he had “thought it could never happen to me.” But so, of course, does everyone, which is why all academic scientists urgently need to follow Tonks’s example. Had Harran’s lab taken such precautions, we would not be observing this mournful anniversary.