
In January 2023, the journal eLife – long at the vanguard of the progressive movement in scientific publishing – took the radical step of adopting a “publish, review, curate” model. Here, with some background framing, is what it feels like to use it.
The scientific publishing landscape has altered dramatically in the last 20 years or so, but the mechanism of scientific publishing has been slower to change.
For many scientific authors, the process of getting a paper published in the digital age still resembles the time when manuscripts were submitted by post:
1. Prepare the manuscript for a target journal.
2. Send the manuscript to the target journal, with a cover letter explaining to the editor(s) why the authors have chosen to send it there.
3. The editor(s) decide whether or not to send the paper out for review.
4. If the paper is sent for review, the invited reviewers critique the manuscript and return their reports to the editor(s).
5. The authors revise their manuscript in light of the (hopefully constructive) criticism and resubmit it to the journal.
6. The editor(s) make a final decision on whether or not to accept the paper for publication.
A paper can be rejected by the journal at steps 3, 4, and 6, and if so, this means “go back to step 1, do not pass Go, do not collect 200 dollars”. Some journals may additionally insist on multiple rounds of review, so a paper can get trapped in a loop of steps 3-4-5 until a final decision is reached.
It’s a process in which scientific authors do not have control. They are supplicants, relying on the discernment and to some extent the kindness of the editors who act as gatekeepers to the electronic printing presses. It’s also a process that has received mounting scrutiny, and become a rate-limiting step in the research process.
In fact, it’s pretty common nowadays that by the time a paper is actually published its data are so old that the interest of the group itself has long since moved on to the next question in the research programme. And of course, while the paper remains unpublished the data remain unseen – they become time capsules, a memory of what experiments were being done several years ago.
That “time capsule” element has in the last 5-6 years been smashed by the widespread adoption of preprints in the biological sciences, which already marks a paradigm change in the way that scientific information is disseminated: scientists can now get their data into the public domain at the moment they believe it’s ready for publication. This has already restored a degree of control over the publication process to authors, and peer review of preprints (such as that offered by platforms like Review Commons) enables preprints to go an additional step along the publication process without journal involvement.
But the basic submit-review-publish publication model offered by journals remains the same, and an unreviewed preprint submitted to a journal still has to go through the same process as outlined above.
This is where the journal eLife has taken the radical step of changing its publication model entirely, adopting a publish-review-curate model.
- “Publish” means publish first – the manuscript must already be online and publicly available as a preprint.
- “Review” means the same Steps 1-4 as described above.
- “Curate” is the final gloss in which a published, peer-reviewed preprint becomes a Version of Record (with an accompanying DOI).
This inverts the traditional model in which an article is first “curated” through editorial and peer reviewer input before it is “published” and becomes available for consumption.
eLife is by no means the first platform to offer preprint peer review (as already noted, Review Commons is an established and user-friendly alternative), and it’s not the first journal to use a publish-review-curate approach (F1000Research has been blazing this trail for a while), but it arguably is the first major biological journal to switch to a publish-review-curate model, and as a champion of the progressive movement, it’s got the following and the clout to turbocharge that paradigm.
Not only that, but eLife even went a step further by pledging that authors – not editors, not reviewers – would determine what revisions should be made after peer review, and that all reviewed preprints would be published as Versions of Record in eLife if the authors requested it.
But what’s it like in practice?
Taking back control: the good points
My group’s most recent research paper (one of the very last, as the group folded in 2023) was published in eLife as a Version of Record in November 2024.
The paper was submitted to eLife in February 2024, having already been posted as a preprint on bioRxiv. There was a wait of about 10 days for an editorial decision (about average in terms of duration), and then the paper went out for review.
Plus point: There was an immediate and amazing sense of relief when this happened. From this point onward, the paper was guaranteed a home at eLife. Just from a stress and mental health perspective alone, this was a massive plus.
The reviews arrived 2 months after submission of the paper to eLife, and around 6 weeks after it went out for review (again, 6 weeks for peer review is about average). eLife has also long practised a consultative peer review model in which the reviewers and editors confer to align on the most important messages for the authors.
Plus point: I’ve long been a fan of consultative peer review, and its biggest gift to authors is that it avoids mixed messaging from the reviewers. This case was no exception, and the feedback we received was cordial, constructive, and fair.
Around 3 weeks after the reviews came in, the paper appeared online at eLife as a reviewed preprint. In theory, we could have gone directly to a Version of Record at that point, but – in common with most eLife authors – we wanted to take the reviewers’ input on board and the consensus decision was to implement as many of their recommendations as possible. After revising the paper we uploaded the revised preprint to bioRxiv, and then immediately sent the revised preprint to eLife.
Plus point: It was, yet again, a strange and wonderful feeling to know that we had full control over which of the reviewers’ comments to implement, and how.
4 weeks after sending in the revised preprint, we received a second round of reviews. This was unexpected but useful, in that it allowed the reviewers to assess the extent to which we had taken their feedback on board. The revised, peer-reviewed preprint appeared on eLife around 2 weeks after the reviews came in, and we then proceeded to the Version of Record.
Plus points summary: eLife’s new model definitely does what it promises and gives authors control over the peer review process. My group has previously obtained preprint peer review via Review Commons and while that also works very smoothly, there definitely isn’t the same relief involved because the journal submission process is still to come and there’s no way of knowing whether the paper will find a home.
Not taking back control: the bad points
While the preprint peer review process was a delight, getting the Version of Record published – in theory, a mere formality – was a massive headache.
In all, getting from the revised, peer-reviewed preprint to the published Version of Record took 3 months, around 33% of the total time from submission to publication. Admittedly, this was undoubtedly slowed by the fact that I had already left academia and was in full-time employment in the private sector, so the time I had available to work on it was extremely limited.
The problem? The source data. eLife turns out to have the most extreme requirements of any journal I’ve yet published in when it comes to submitting not just your figures but also the accompanying raw data that underpins them.
I get that this is the way things are heading right now and transparency is no bad thing, but there’s a world of difference between having your data organised so that you can provide anything on demand, and preparing it in a way that makes it intuitively accessible for everyone. Our paper also had a lot of gel and immunoblot data, and eLife’s source data requirements for these methods are especially demanding.
Recommendation: Given that presumably most of the people seeking preprint peer review at eLife are intending to publish a Version of Record there, it would be no bad thing if the journal informed authors of its source data requirements and other procedural steps as soon as the preprint goes out for review. That would give the authors time to organise everything well in advance. Now that many journals (including eLife) accept format-free submissions, there’s no guarantee that authors will have read every line of the “instructions to authors” in advance. Please, help lessen authors’ stress by giving them a heads-up well in advance – a simple checklist of requirements alone would be enormously helpful.
What made this process even more frustrating was that the requirements turned out to be unevenly implemented. We were in a long exchange with the journal about whether or not we were supposed to provide the several gigabytes of raw imaging data in addition to all our raw gel/blot data, and common sense fortunately prevailed. Conversations with some other authors who’d recently published in eLife echoed this frustration – it took loads of effort to provide something that in all likelihood nobody will ever look at, and one senior author informed me that they wouldn’t publish in eLife again solely because of the excessive post-acceptance requirements compared to other journals.
Recommendation: Rather than compelling authors to spend hours or days assembling files that are unlikely to ever be accessed, it would be better if authors were made to sign a pledge that they will make any and all raw data for their paper available upon request, with failure to comply with any such request resulting in an Expression of Concern and potentially even a retraction of the paper. This compels compliance with the same set of transparency rules, but avoids the possibly futile expenditure of time preparing and assembling source data for “all-eyes” consumption.
Bad points summary: While eLife’s new model undoubtedly made us feel in control of the peer review process, the publication of the Version of Record was a reversion to the old system of a journal dictating terms to authors. It’s worth noting that the eLife editorial staff were uniformly friendly and extremely helpful, but the rules they’re being asked to enforce are unpopular and constitute a huge time investment for questionable gain.
In summary then, publish-review-curate certainly puts authors in control of the peer review process, but the process leading to publication of the Version of Record in eLife still leaves something to be desired. Would I submit to eLife again? Absolutely, but it would have helped if I’d been informed about some of the post-review processes in advance.
Brooke,
A delight to read you, as always. No wonder you have moved on to a writing+ job – I am also very glad that you are enjoying it so much 🙂
About this post: one thing that I find very off-putting about the eLife model is that, while they give authors all control, they also give up the responsibility of ‘declaring’ a paper ‘finished’.
I do not see much difference, in ‘added value’, between a Version of Record in eLife and a revised preprint in bioRxiv accompanied by the corresponding Review Commons referee comments. The ‘confidence’ for the reader is the same, I would say.
What is it that you are paying for, when you publish in eLife? You can format your paper nicely yourself for free with editing software; generative AI can give a very decent proofreading for free; and raw data archiving is also available for free in Zenodo and other repositories. Peer review, you can also get through Review Commons, and in any case this is not something that the journal gives… they just orchestrate it with minor admin input, while the bulk of that process is given for free by the community.
This is a well-trodden matter: one is paying to attach the prestige of a journal to the paper. A seal of quality. This is, of course, artificial currency, and part of the problem, but for eLife authors, I think this is gone. The only point where the journal takes ‘responsibility’ is when they decide to send papers for review. So, instead of convincing an editor and a small group of reviewers over a protracted period (as you would need to in other journals), the only seal of quality depends on convincing a single journal editor after one read (maybe with some editorial board discussion). I do not think this is the same.
The real-world consequences of this may be that the career value of having a paper in eLife has greatly diminished. As an experiment, I think it is great, as we will learn a lot whether it works or not. If it does not, we need to think about what else needs to change to make academia more functional and less damaging for academics. But if it does, I would really need someone cleverer than me walking me through all the reasons why we shouldn’t, from then on, simply stop scientific publishing at posting preprints with reviews. I think the taxpayer should not be paying just for branded formatting.
I would love your perspective on this!
Hi Joaquín, thanks as always for taking the time to write, massively appreciated! Lots of super-interesting points here… First, a confession: I support preprints, but I (almost) never read them myself – I really appreciate what they add to the research process and how valuable they are in providing ECRs with a citable output, but I actually really value the formatting that professional journals provide. If there really is off-the-shelf formatting software that can readily produce the same results, I would LOVE to hear about it.
Second, the role that I see journals playing in this post-preprint ecosystem is that they become arbiters of quality. This is not the same as prestige! With preprints enabling immediate and wide-ranging dissemination of content, and with the scientific workforce continuing to grow and to increase in productivity, I think we actually need quality journals more than ever. “Prestige” is the ugly process whereby a cabal of quality journals try to set themselves apart from the rest (usually on grounds of sensationalism, unfortunately), but as any good cell biologist will tell you, the best cell biology papers can be found in JCB and not in Nature. [I blogged about this distinction a while back: https://totalinternalreflectionblog.com/2021/09/26/top-quality/].
Third, and it’s a minor but non-trivial point: the Version of Record element. Journals provide that final stamp marking a work as completed, and that’s something that preprints – even if peer reviewed – don’t quite replicate, or at least not automatically (although it would be possible to achieve a similar degree of rigour outside the journal system).
It’s entirely possible that in a few years we’ll both look back on this thread and laugh at how conservative I was, but at least, that’s where I stand for now. A timid progressive, I guess… 🙂