The well-behaved journal

Science is running a poll titled "The Well-Behaved Scientist" this week, asking "how should we promote publication of data that can be replicated and reproduced?" Conspicuously absent from their list of ideas -- more funding from funding agencies, more rewards from institutions -- is the rather fundamental one: that the purpose of scientific journals, including Science, is to publish reproducible research.

A 2003 National Academy of Sciences report on community standards for scientific publication in the life sciences called scientific publication a quid pro quo: the journal acts as a proxy for the scientific community and awards priority and recognition, in return for the authors giving their result to the community in a form that can be verified, reproduced, and extended. That makes the publication process an important checkpoint for reproducible and extensible research, and it makes journals (their editors and reviewers) the community's gatekeepers. But Science, like some other journals, has too often been recalcitrant about upholding community standards for reproducibility and access to published results, and too slow to deal with biology's transition to a data-intensive science, where large datasets and software need to be made available electronically along with publications. To see them publish this poll, in which most of their ideas point to funders and institutions -- and thus away from themselves -- seems tone-deaf. It's the sort of thing that leads people to try to start fresh, with new journals like the HHMI/Wellcome/Max Planck journal eLife.

For example, here's a recent experience of mine with Nature Methods, which went essentially as follows. I receive a manuscript for review describing a new method implemented in software. The software, though, hasn't been made available for review, and isn't going to be made available to the community under a license that would allow it to be examined and extended. So I email the editor with my standard response to that sort of situation: I'd be happy to review the paper if the authors make their result available (which is in fact Nature Methods' written policy for software). No response. Weeks pass. Then I get an email asking me to re-review the revised manuscript. My brief email has been counted as review #1! I email a note back to the editor and decline, pointing out that since I never reviewed the manuscript in the first place, it would be a little odd for me to pitch in a review now that the authors have already made their revisions in response to reviews. Automated response: "Thank you for your review." (!!) Too often, journals have no adequate mechanism in place to handle the basic nuts and bolts of data and software accessibility, separate from the actual scientific review of the written paper.