For many researchers, integrity is akin to a note on a cluttered reminder board. They know it’s important, but other reminders — to run experiments, apply for grants, write papers, and so on — take precedence. Serious thinking about research integrity gets put off to another day.
If all researchers set high standards for responsible behavior in research, it might not be so important to pay attention to integrity. Unfortunately, more than a few do not, and the behavior of those who are willing to bend and sometimes deliberately break the rules can impact even the most principled researcher. Consequently, anyone who places integrity at the bottom of a to-do list does so at some personal risk.
When integrity is addressed, the focus is usually on the worst cases of misbehavior, commonly referred to as “research misconduct.” The U.S. government defines research misconduct as “fabrication, falsification, and plagiarism.” Allegations of research misconduct can end careers and cost institutions time and money. Misconduct cases can also create difficult personal situations. What would you do if you were co-author on a paper that had to be retracted because one of the other co-authors engaged in misconduct? Would you list the paper on your resumé to get credit for legitimate work you did, or would you take it off to avoid being associated with a misconduct case? How would your career be affected if a mentor’s or collaborator’s grant ended due to a misconduct finding? Research misconduct impacts the careers of both perpetrators and bystanders.
Misconduct is not, however, the first or even the most important test of integrity in scientific practice. More significant by far are the dozens of routine decisions scientists make every day. The relevance of these small choices to integrity may not be apparent. Because the decisions are small, their consequences are not obvious, which makes it easier to justify bending rules and cutting corners: What difference would it make if you described essentially the same research results in more than one publication without proper notification, added a few references to your notes that may or may not actually support your research, or used a few sentences from someone else’s methods section to describe what you have done? When the active consideration of integrity is put off to another day, it becomes easier to take the first compromising steps toward irresponsible research practices.
So, here are the first two items that should be on every scientist’s integrity to-do list:
1. Fully understand the rules of authorship and credit that apply to your research.
2. Don’t begin a research project until everyone involved agrees who will be listed as an author and in what order.
Laboratory management — especially record keeping — is another area in which researchers can easily get into trouble. Researchers have a responsibility to be good stewards of research funds, to keep meticulous records of the science done in the lab (i.e., keep complete lab notebooks, whether paper or electronic), and to comply with ethics and other regulations. (See Box). Studies suggest that one in three researchers fails to keep proper laboratory records. Irresponsible laboratory practices waste time and funds. They also harm careers when experiments have to be rerun, results cannot be replicated, and papers must be retracted. You can lose your claim to intellectual property if you cannot document when and how something was discovered. Misconduct can go undetected when colleagues fail to keep track of what is going on in their laboratories. Entire university research programs can be suspended if informed consent for a few projects is not properly documented.
To avoid these and other improper laboratory-management practices, two more items should be on every scientist’s integrity to-do list:
3. Take note of and understand the rules and regulations that apply to your research. Use them to guide your day-to-day decisions.
4. Develop a system or routine for reviewing how well you are doing in meeting your laboratory-management responsibilities. Integrity is judged by what you do, not what you intended to do.
In late July, 340 researchers from 51 countries gathered in Singapore for the 2nd World Conference on Research Integrity and collectively produced the Singapore Statement on Research Integrity, which expresses the following principles:
– Honesty in all aspects of research
– Accountability in the conduct of research
– Professional courtesy and fairness in working with others
– Good stewardship of research on behalf of others
Additionally, the statement lists 14 responsibilities of researchers and research institutions. The conference co-chairs, Steneck and Tony Mayer, drafted the Singapore Statement with Melissa Anderson, chair of the organizing committee of the upcoming 3rd World Conference on Research Integrity.
Finally, it’s a good idea to give some attention to the research environment (see the related article by Beryl Benderly). Studies have suggested that research environments influence how researchers behave. These influences seem to run deeper than “publish or perish.” Most researchers are subject to pressure, but most do not engage in misconduct or routinely cut corners. However, troublingly high numbers do bend the rules from time to time, and they seem to do so more when they don’t trust the integrity of the environment within which they are working. Accordingly, two final items should be on everyone’s integrity to-do list:
5. Ask yourself whether your research environment encourages you to set high standards for responsible behavior.
6. If it doesn’t, take steps to improve the situation or consider moving to a different work environment.
Researchers are responsible not only for their own integrity but also for the integrity of colleagues and for their collective area of research. Next time a question about proper behavior arises when you are working with colleagues or planning a research project, openly address it rather than put it off to another day. The right decision today could avoid getting into a no-win situation tomorrow.
P. Keith-Spiegel and G. P. Koocher, “The IRB Paradox: Could the Protectors Also Encourage Deceit?” Ethics and Behavior 15, 339 (2005).
B. C. Martinson, A. L. Crain, R. De Vries, and M. S. Anderson, “The Importance of Organizational Justice in Ensuring Research Integrity.” Journal of Empirical Research on Human Research Ethics 5, 67 (2010).