A few years back, Scholarly Kitchen editor-in-chief David Crotty informally polled a dozen biologists about the burden of peer review. He found that most reviewed around 3 papers per month. For senior scientists, that number can reach 15 papers per month.
And yet, no matter how much time they spend reviewing, the credit they get is the same, and it looks like this on their CV:
“Service: Reviewer for Physical Review B and PLOS ONE.”
What if your work could be counted as more than just “service”? After all, peer review is dependent upon scientists doing a lot of intellectual heavy lifting for the benefit of their discipline.
And what if you could track the impacts your peer reviews have had on your field? Credit–in the form of citations and altmetrics–could be included in your CV to show the many ways that you’ve contributed intellectually to your discipline.
The good news? You can get credit for your peer reviews. By participating in Open Peer Review and making reviews discoverable and citable, researchers across the world have begun to get the credit they deserve for changing science for the better.
But this practice isn’t yet widespread. So, we’ve compiled a short guide to getting started with getting credit for your peer reviews.
1. Participate in Open Peer Review
Open Peer Review is a radical notion predicated on a simple idea: that by making author and reviewer identities public, more civil and constructive peer reviews will be submitted, and peer reviews can be put into context.
Here’s how it works, more or less: reviewers are assigned to a paper, and they know the author’s identity. They review the paper and sign their name. The reviews are then submitted to the editor and author (who now knows their reviewers’ identities, thanks to the signed reviews). When the paper is published, the signed reviews are published alongside it.
Sounds simple enough, but if you’re reviewing for a traditional journal, this might be a challenge. Open Peer Review is still rarely practiced by most traditional publishers.
For a very long time, publishers favored private, anonymous (‘blinded’) peer review, under the assumption that it would reduce bias and that authors would prefer for criticisms of their work to remain private. Turns out, their assumptions weren’t backed up by evidence.
Blinded peer review is argued to be beneficial for early career researchers, who might find themselves in a position where they’re required to give honest feedback to a scientist who’s influential in their field. Anonymity would protect these ECR-reviewers from their colleagues, who could theoretically retaliate for receiving critical reviews.
Yet many have pointed out that it can be easy for authors to guess the identities of their reviewers (especially in small fields, where everyone tends to know what their colleagues/competitors are working on, or in lax peer review environments, where all one has to do is ask!). And as Mick Watson argues, any retaliation that could theoretically occur would be considered a form of scientific misconduct, on par with plagiarism–and therefore off-limits to scientists with any sense.
In any event, a consequence of this anonymous legacy system is that you, as a reviewer, can’t take credit for your work. Sure, you can say you’re a reviewer for Physical Review B, but you’re unable to point to specific reviews or discuss how your feedback made a difference. (Your peer reviews go into the garbage can of oblivion once the article’s been published, as illustrated below.) That means that others can’t read your reviews to understand your intellectual contributions to your field, which–in the case of some reviews–can be enormous.
Image CC-BY Kriegeskorte N from “Open evaluation: a vision for entirely transparent post-publication peer review and rating for science” Front. Comput. Neurosci., 2012
So, if you want to get credit for your work, you can choose to review for journals that already offer Open Peer Review. A number of forward-thinking journals allow it (BMJ, PeerJ, and F1000 Research, among others).
To find others, use Cofactor’s excellent journal selector tool:
Head over to the Cofactor journal selector tool
Click “Peer review,”
Select “Fully Open,” and
Click “Search” to see a full list of Open Peer Review journals
Some stand-alone peer review platforms also allow Open Peer Review. Faculty of 1000 Prime is probably the best known example. Publons is the largest platform that offers Open Peer Review. Dozens of other platforms offer it, too.
Once your reviews are attributable to you, the next step is making sure others can read them.
2. Make your reviews (and references to them) discoverable
You might think that discoverability goes hand in hand with Open Peer Review, but you’d only be half-right. Thing is: URLs break every day. Persistent access to an article over time, on the other hand, will help ensure that those who seek out your work can find it, years from now.
Persistent access often comes in the form of identifiers like DOIs. Having a DOI associated with your review means that, even if your review’s URL were to change in the future, others can still find your work. That’s because DOIs are set up to resolve to an active URL when other URLs break.
Persistent IDs also have another major benefit: they make it easy to track citations, mentions on scholarly blogs, or new Mendeley readers for your reviews. Tracking citations and altmetrics (social web indicators that tell you when others are sharing, discussing, saving, and reusing your work online) can help you better understand how your work is having an impact, and with whom. It also means you can share those impacts with others when applying for jobs, tenure, grants, and so on.
There are two main ways you can get a DOI for your reviews:
Publish your review with a journal or platform that automatically issues DOIs for reviews, or
Archive your review in a repository that issues DOIs, like Figshare
Once you have a DOI, use it! Include it on your CV (more on that below), as a link when sharing your reviews with others, and so on. And encourage others to always link to your review using the DOI resolver link (these are created by putting “http://dx.doi.org/” in front of your DOI; here’s an example of what one looks like: http://dx.doi.org/10.7287/peerj.603v0.1/reviews/2).
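The construction above is just string concatenation, and can be sketched in a couple of lines of Python (the `doi_to_url` helper name is ours, purely for illustration; the example DOI is the PeerJ review DOI shown above):

```python
def doi_to_url(doi: str) -> str:
    """Build a DOI resolver link by prepending the resolver prefix to a bare DOI."""
    return "http://dx.doi.org/" + doi.lstrip("/")

# The PeerJ review DOI from the example above:
print(doi_to_url("10.7287/peerj.603v0.1/reviews/2"))
# http://dx.doi.org/10.7287/peerj.603v0.1/reviews/2
```

Linking with the resolver URL rather than the journal's own page means the link keeps working even if the hosting site reorganizes its URLs.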
DOIs and other unique, persistent identifiers help altmetrics aggregators like Impactstory and PlumX pick up mentions of your reviews in the literature and on the social web. And when we’re able to report on your citations and altmetrics, you can start to get credit for them!
3. Help shape a system that values peer review as a scholarly output
Peer review may be viewed primarily as a “service” activity, but things are changing–and you can help change ‘em even more quickly. Here’s how.
As a reviewer, raise awareness by listing and linking to your reviews on your CV, adjacent to any mentions of the journals you review for. By linking to your specific reviews (using the DOI resolver link we talked about above), anyone looking at your CV can easily read the reviews themselves.
You can also illustrate the impacts of Open Peer Review for others by including citations and altmetrics for your reviews on your CV. An easy way to do that is to include on your CV a link to the review on your Impactstory or PlumX profile. You can also include other quantitative measures of your reviews’ quality, like Peerage of Science’s Peerage Essay Quality scores, Publons’ merit scores, or a number of other quantitative indicators of peer-review quality. Just be sure to provide context to any numbers you include.
If you’re a decision-maker, you can “shape the system” by making sure that tenure & promotion and grant award guidelines at your organization acknowledge peer review as a scholarly output. Actively encouraging early career researchers and students in your lab to participate in Open Peer Review can also go a long way. The biggest thing you can do? Educate other decision-makers so they, too, respect peer review as a standalone scholarly output.
Finally, if you’re a publisher or altmetrics aggregator, you can help “shape the system” by building products that accommodate and reward new modes of peer review.
Publishers can partner with standalone peer review platforms to accept their “portable peer reviews” as a substitute for (or addition to) in-house peer reviews.
Altmetrics aggregators can build systems that better track mentions of peer reviews online, or–as we’ve recently done at Impactstory–connect directly with peer review platforms like Publons to import both the reviews and metrics related to the reviews. (See our “PS” below for more info on this new feature!)
How will you take credit for your peer review work?
Do you plan to participate in Open Peer Review and start using persistent identifiers to link to and showcase your contributions to your field? Will you start advocating for peer review as a standalone scholarly product to your colleagues? Or do you disagree with our premise, believing instead that traditional, blinded peer review–and our means of recognizing it as service–are just fine as-is?
We want to hear your thoughts in the comments below!
Further reading:
- “What is Open Peer Review?” by F1000Research’s Eva Amsen
- “Where did our Peer Review mojo go?” by ScienceOpen’s Alexander Grossman
ps. Impactstory now showcases your open peer reviews!
Starting today, there is one more great way to get credit for your peer reviews, in addition to those above: on your Impactstory profile!
We’re partnering with Publons, a startup that aggregates Open and anonymous peer reviews written for PeerJ, GigaScience, Biology Direct, F1000 Research, and many other journals.
Have you written Open reviews in these places? Want to feature them on your Impactstory profile, complete with viewership stats? Just sign up for a Publons account and then connect it to your Impactstory profile to start showing off your peer reviewing awesomeness :).