Are scientific editors reliable gatekeepers of the publication process?
September 20, 2019 | 5 min read
Por Richard B. Primack, Danielle Descoteaux, Vincent Devictor, Laurent Godet, Lucy Zipf
From left to right: Danielle Descoteaux (USA, publisher), Richard Primack (USA), Robin Pakeman (UK), Tracey Regan (Australia), Vincent Devictor (France), Richard Corlett (China), Liba Pejchar (USA), Bea Maas (Austria) and David Johns (USA).
The Biological Conservation editorial team often acts as an editorial gatekeeper, as shown in this 2018 photo in front of an actual gate.
A study of editorial consistency in peer review
The scientific community assumes the publication process is reliable and fair, with the best papers being published only after rigorous review. Scientific editors act as “gatekeepers” in this publishing process, deciding whether a paper is even sent out for peer review or, alternatively, “desk rejected”, that is, returned to the author without peer review. While the review process has been extensively investigated, for example to determine reviewer consistency and whether reviewers exhibit gender bias, the topic of editor consistency has, as far as we know, never been experimentally examined. Do editors make arbitrary decisions about which papers to send out for review and which to desk reject? Or is there consistency in what editors decide to do?
We addressed these questions for manuscripts submitted to the journal Biological Conservation, and the results of our study were recently published in the journal. We are reasonably confident, however, that the results are likely to apply to other scientific journals.
Study design
The handling process of Biological Conservation is structured so that all new submissions are sent directly to the Editor-in-Chief, who either desk rejects them or assigns them to an editor whose expertise is aligned with the subject, for further evaluation and/or to initiate the review process. For our study, conducted in 2018, we asked ten editors of the journal to each evaluate the same 40 manuscripts, all of which had been submitted to the journal and evaluated by an editor in 2017. All 40 manuscripts had been judged by the Editor-in-Chief in 2017 to be within the scope of the journal and of acceptable quality, and had then been delegated to handling editors that year. The manuscripts selected for the study represented a variety of topics and article types, included a mixture of male and female corresponding authors, and involved authors from many countries. Of these 40 papers, the handling editors in 2017 had sent 20 out for review and desk rejected the other 20.
The ten editors involved in the 2018 study did not know which manuscripts had actually been sent out for review (20 in total) and which had been desk rejected (20 in total) in 2017. In normal practice, editors are assigned papers only in their area of expertise, so in this experiment editors were sometimes evaluating papers outside their usual topic areas.
Consistency among editors
We were pleasantly surprised to find demonstrable consistency in the evaluations. While complete agreement among editors (that is, all ten editors coming to the same decision) was rare, for 73% (29 of 40) of the manuscripts, seven or more of the ten editors independently agreed on whether to send the submission out for review or to desk reject it.
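As a rough illustration of how such a consensus figure can be tallied, the sketch below counts, for a hypothetical 40 × 10 matrix of editor decisions, the manuscripts on which at least seven of ten editors made the same call. This is not the study's analysis code, and the randomly generated data are purely illustrative.

```python
import random

random.seed(0)
N_MANUSCRIPTS, N_EDITORS = 40, 10

# Hypothetical decision matrix: one row per manuscript, one 0/1 entry per editor
# (1 = send out for review, 0 = desk reject).
decisions = [[random.randint(0, 1) for _ in range(N_EDITORS)]
             for _ in range(N_MANUSCRIPTS)]

def has_consensus(votes, threshold=7):
    """True if at least `threshold` editors made the same decision."""
    review_votes = sum(votes)
    return max(review_votes, len(votes) - review_votes) >= threshold

n_consensus = sum(has_consensus(row) for row in decisions)
print(f"{n_consensus} of {N_MANUSCRIPTS} manuscripts "
      f"({100 * n_consensus / N_MANUSCRIPTS:.0f}%) reached 7-of-10 agreement")
```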
Agreement with past decisions
In 2018, editors tended to make decisions that echoed the actual decisions made in 2017. For manuscripts desk rejected in 2017, on average 70% of the editors in 2018 agreed they should be rejected “again”. Similarly, for manuscripts sent out for review in 2017, on average 67% of the editors in 2018 agreed with that decision.
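The agreement-with-past-decisions figures can be computed in a similar way: for each manuscript, take the share of 2018 editors whose decision matched the 2017 outcome, then average those shares within each group. The self-contained sketch below shows one way to do this; again, the data are hypothetical and this is not the study's code.

```python
import random

random.seed(1)
N_EDITORS = 10

# 2017 outcomes for the 40 manuscripts: 1 = sent out for review, 0 = desk rejected.
actual_2017 = [1] * 20 + [0] * 20
# Hypothetical 2018 votes: one 0/1 vote per editor for each manuscript.
votes_2018 = [[random.randint(0, 1) for _ in range(N_EDITORS)] for _ in actual_2017]

def mean_agreement(original_decision):
    """Average share of 2018 editors matching the 2017 decision, per group."""
    shares = [votes.count(original_decision) / N_EDITORS
              for votes, d in zip(votes_2018, actual_2017) if d == original_decision]
    return sum(shares) / len(shares)

print(f"Agreement with 2017 'send for review' decisions: {mean_agreement(1):.0%}")
print(f"Agreement with 2017 'desk reject' decisions:     {mean_agreement(0):.0%}")
```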
Importance of expertise
Editors were also asked whether each manuscript was in their area of expertise. Surprisingly, editors did not show a higher rate of desk rejection for manuscripts outside their area of expertise. We regard this as a positive sign that editors can assess each paper on its own merits, regardless of their individual background.
Overall, we found that editors are reasonably consistent in their decisions to send a paper out for review or to desk reject it, and that they largely agreed with past decisions. However, the remaining disagreements reveal the unsurprising degree of subjectivity that editors bring to the process. We are pleased that this study has demonstrated a significant degree of consistency in editorial decision-making, but we see some room for improvement. One way the process could be reinforced is by encouraging editors to seek the opinion of one or two additional editors before deciding on papers that are not obvious candidates for either review or rejection. While this would slow down the decision process for these papers, it seems reasonable that doing so would add an additional layer of rigor to the decision-making.
Transparency and fairness in publication
The editors of Biological Conservation are committed to improving transparency and fairness in the review and publication process. We have previously examined whether gender, nationality, or academic age affect acceptance rates for papers, how editors use reviews to make decisions, and the characteristics of our reviewers. We hope that the open communication of these results and of our practices will inspire other editors to critically examine their own gatekeeping activities, and in the process promote research that is both robust and just. We welcome comments and questions about the study and how to implement its findings to improve the rigor of peer review; please comment below.