It’s not every day that you realize you’re a data point in a scientific study—and a misrepresented data point at that. But that’s what happened to a number of current and former scientists—including me—while reading a study reporting that scientific careers have become significantly shorter in the past 50 years, published earlier this week in the Proceedings of the National Academy of Sciences (PNAS).
“Half of academic scientists leave the field within 5 years,” one headline trumpeted. “New study says scientists are leaving academic work at unprecedented rate,” read another. It’s a message that’s likely familiar to those who follow the plight of today’s early-career researchers, and many shared the paper on social media as yet more evidence that systemic change is urgently needed.
Meanwhile, Twitter also erupted with comments from people who thought that the methods—using paper authorship in a handful of journals as a proxy for academic activity—were flawed, relying on a too-narrow definition of who counts as a scientist. As a prolific associate professor of ecology at the University of Canterbury in Christchurch, New Zealand, tweeted, “Shocked to discover that I am an early career dropout with no lead authorship ever (according to authors’ definition)! Defining research active careers using 9 ecology journals is flawed – even if the message resonates.”
The study sought to figure out whether the “half-life” of a scientific cohort has changed over the past 50 or so years, spurred in part by growing concern about the dysfunction of the academic training model and job market. “We are reaching a point that doesn’t really look sustainable,” says Staša Milojević, the lead author of the study and an associate professor of informatics at Indiana University in Bloomington. Given the number of people who leave academia to pursue jobs elsewhere, Ph.D. programs should do more to recognize diverse career paths and provide appropriate training options, she says. There shouldn’t be a “single train for everybody.”
But many critics say that these limited data are not sufficient to draw the conclusions claimed in the study, with some offering their own publication record as evidence. The University of Canterbury ecologist wasn’t alone: an actively publishing ecosystem scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, wrote on Twitter, “I got a PhD in Ecology in 1988 and have never once published in the journals defined as in my field.”
The “Twitter-storm” of criticism primarily focused on ecology—the field I worked in as a scientist, and one that has evolved rapidly since the study start date in the 1960s. The same issues may not apply to the other disciplines Milojević examined. For astronomy, at least, there’s evidence that the results hold.
“I’m not going to defend their techniques,” says Yoachim, an astronomer. But “astronomy is a really small field” with a limited number of journals, so the PNAS study’s reliance on just six journals may not have biased the findings too badly, he notes. He agrees, though, that the approach could be problematic in other fields, such as ecology.
I’m no longer a practicing scientist—I’m a reporter and editor for Science Careers—but I was one of the 20,704 ecologists whose names were included in the Milojević database, and one whose career trajectory was misrepresented. I published a chapter of my master’s thesis in one of the “chosen” ecology journals—Oecologia—in 2007. That pegged me as an ecologist from the study’s perspective. (Milojević confirmed this.)
The researchers then tried to link that publication with any ecology publications I authored after that—in essence, to figure out how long I stuck around in academia (or, as the study authors phrased it, to figure out my “ultimate survival status in science”). Since that 2007 publication, I’ve earned a Ph.D. in ecology and published 14 peer-reviewed papers in 12 journals. My publication record extends through this month’s issue of Conservation Genetics. In theory my scientific career length—based on my publication record—should have been estimated to be 13 years. Yet the study’s methods rendered me a “transient”—an author with a single publication whose estimated career length was only 1 year—because none of my pre- or post-2007 papers were published in any of the chosen journals.
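The mechanics of that misclassification are easy to see. As described here, the method counts only papers in the chosen journal set when measuring a career, so everything published elsewhere is invisible. The following is a hypothetical sketch of that logic, not the PNAS authors’ actual code; the function, journal list, and sample record are all illustrative.

```python
# Hypothetical reconstruction of the career-length proxy described in
# this article -- not the PNAS authors' actual code or dataset.

CHOSEN_JOURNALS = {"Oecologia", "Ecology", "Journal of Ecology"}  # illustrative subset

def career_length(pubs, chosen=CHOSEN_JOURNALS):
    """Estimate career length in years, counting only papers that
    appear in the chosen journal set (last year - first year + 1).
    Returns None if the author never publishes in those journals."""
    years = sorted(year for year, journal in pubs if journal in chosen)
    if not years:
        return None  # the author never enters the dataset at all
    return years[-1] - years[0] + 1

# An author with a 13-year record (2007 through 2019) whose only paper
# in a chosen journal is the 2007 one scores as a 1-year "transient":
record = [(2007, "Oecologia"),
          (2012, "Conservation Genetics"),
          (2019, "Conservation Genetics")]
print(career_length(record))                                 # 1
print(career_length(record, chosen={j for _, j in record}))  # 13
```

Under this proxy, the gap between the two numbers is entirely an artifact of which journals happen to be in the chosen set.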
The study authors acknowledge in the paper that the dataset’s “incompleteness … may affect the determination of career length.” But they go on to write that it shouldn’t affect analyses looking at trends over time because there’s no reason to believe that the level of “incompleteness” would be different now than, say, 50 years ago.
Is that assumption reasonable? An education researcher isn’t so sure, pointing out that over the study’s timespan the set of journals being examined was likely a “shrinking” percentage of the total number of journals in each field.
To figure out whether that was the case for my (now former) discipline, I took a look at the 30 top-ranked ecology journals based on International Scientific Indexing impact factors. Most of the journals published their first issue in the 1980s or later, long after 1961—the year when the first cohort of ecologists analyzed in the PNAS study began publishing. (Of the “chosen” ecology journals, only two of the nine even existed in 1961.)
That means that an ecologist trying to publish their work in 1961 likely had fewer journals to choose from than an ecologist trying to publish their work in 2010. What’s more, there are more scientists trying to publish now, so competition is more severe. In other words, it’s plausible to me that some modern-day ecologists’ careers could look shorter simply because the PNAS study’s methods are not well equipped to capture the full range of journals they could be publishing in—an issue that may not apply as much to previous generations of ecologists.
For my part, I look forward to seeing follow-up studies and finding out whether Milojević’s conclusions hold—not only because reporting on scientific workforce issues is my day job, but also because it’s something I’ve experienced firsthand. I am one of the many publishing scientists who have purposefully (and happily) left academia to pursue a career elsewhere—and I agree with Milojević that Ph.D. programs should do a better job preparing people like me for nonacademic careers.
But until such a study comes out, I think we should avoid pointing to the PNAS study as a solid example of academia’s revolving door, at least for ecology. I’d rather point to a study where I’m an accurate data point.