Don’t have time to stay on top of the most important Open Science and Altmetrics news? We’ve gathered the very best of the month in this post. Read on!
Funding agencies denying payments to scientists in violation of Open Access mandates
Want to actually get paid from those grants you won? If you haven’t made publications about your grant-funded research Open Access, you may be in violation of funders’ public access mandates–and could lose funding as a result.
Richard Van Noorden of Nature News reports,
The London-based Wellcome Trust says that it has withheld grant payments on 63 occasions in the past year because papers resulting from the funding were not open access. And the NIH…says that it has delayed some continuing grant awards since July 2013 because of non-compliance with open-access policies, although the agency does not know the exact numbers.
Post-enforcement, compliance rates increased 14% at the Wellcome Trust and 7% at the NIH. However, both agencies are still a ways from seeing full compliance with their mandates.
And that’s not the only shakeup happening in the UK: the higher ed funding bodies warned researchers that any article or conference paper accepted after April 1, 2016 that doesn’t comply with their Open Access policy can’t be submitted to the UK Research Excellence Framework, the exercise that determines universities’ eligibility for research funding.
That means institutions now have a big incentive to make sure their researchers are following the rules–if their researchers are found out of compliance, the institutions’ funding will be in jeopardy.
Post-publication peer review getting a lot of comments
Post-publication peer review via social media was the topic of Dr. Zen Faulkes’ “The Vacuum Shouts Back” editorial, published in Neuron earlier this month. In it, he points out:
Postpublication peer review can’t do the entire job of filtering the scientific literature right now; it’s too far from being a standard practice….[it’s] an extraordinarily valuable addition to, not a substitute for, the familiar peer review process that journals use before publication. My model is one of continuous evaluation: “filter, publish, and keep filtering.”
So what does that filtering look like? Comments on journal and funder websites, publisher-hosted social networks, and post-pub peer review websites, to start with. But Faulkes argues that “none of these efforts to formalize and centralize postpublication peer review have come close to the effectiveness of social media.” To learn why, check out his article on Neuron’s website.
New evidence supports Faulkes’ claim that post-publication peer review via social media can be very effective. A study by Paul S. Brookes, published this month in PeerJ, found that post-publication peer review via blogs makes corrections to the literature an astounding eight times as likely to happen as corrections reported to journal editors in the traditional (private) manner.
For more on post-publication peer review, check out this classic Frontiers in Computational Neuroscience special issue, Tim Gowers’ influential blog post, “How might we get to a new model of mathematical publishing?,” or Faculty of 1000 Prime, the highly respected post-pub peer review platform.
Recent altmetrics-related studies of interest
What disciplines have the highest presence of altmetrics? Hint: it’s not the ones you think. Turns out, a higher percentage of humanities and social science articles have altmetrics than articles in the biomedical and life sciences. The researchers also found that only 7% of all papers indexed in Web of Science had Altmetric.com data.
Video abstracts lead to more readers: For articles in the New Journal of Physics, video abstract views correlate to increased article usage counts, according to a study published this month in the Journal of Librarianship and Scholarly Communication.
New data sources available for Impactstory & Altmetric.com
New data sources include post-publication peer review sites Publons and PubPeer, and microblogging site Sina Weibo (the “Chinese Twitter”). Since we get data from Altmetric, that means Impactstory will be reporting this data soon, too!
And another highly requested data source will be opening up in the near future: Zotero. The Sloan Foundation has funded research and development that will eventually help Zotero, the open source reference management software, build “a preliminary public API that returns anonymous readership counts when fed universal identifiers (e.g. ISBN, DOI).” So, some day soon, we’ll be able to report Zotero readership information alongside Mendeley stats in your profile–a feature that many of you have been asking us about for a long time.
Altmetric.com offering new badges
Altmetric.com founder Euan Adie announced that for those who want to de-emphasize numeric scores on content, the famous “donut” badges will now be available sans Altmetric score–a change heralded by many in the altmetrics research community as a welcome step away from “one score to rule them all.”
Must-read blog posts about ORCID and megajournals
We’ve been on a tear publishing about innovations in Open Science and altmetrics on the Impactstory blog. Here are two of our most popular posts for the month:
Do you blog on altmetrics or Open Science and want to share your posts with us? Let us know on our Twitter, Google+, Facebook, or LinkedIn pages. We might just feature your work in next month’s roundup!
And if you don’t want to miss next month’s news, remember that you can sign up to get these posts and other Impactstory news delivered straight to your inbox.