Open Science & Altmetrics Monthly Roundup (November 2014)

In this month’s roundup: OpenCon 2014, “the crappy Gabor paper” snafu, sexier scientific graphics and 7 other ways November was a big month for Open Science and altmetrics. Read on!

“Generation Open” organizes itself at OpenCon 2014

An international group of students and early career researchers met at the first annual OpenCon meeting in Washington DC. While we couldn’t attend in person, we followed the conference via Twitter and livestream, as did many others from around the world. It was excellent.

Among the participants were Patrick Brown (PLOS), Victoria Stodden (Stanford), Erin McKiernan (Wilfrid Laurier University), and Pete Binfield (PeerJ), all of whom shared two important messages with attendees: 1) even as a junior scientist, you can make choices that support open access, but 2) don't feel bad if you're not yet an OA revolutionary; it's the responsibility of more senior scientists to pave the way and to align career incentives with the OA movement's aims.

You can read a great summary of OpenCon on the Absolutely Maybe blog, and watch OpenCon sessions for yourself on YouTube.

“The crappy Gabor paper” citation snafu

A recently published journal article gained some unwanted attention in November thanks to a copyediting error: the authors’ note-to-self to cite “the crappy Gabor paper” was left in the version of the article that made it to press, which someone found and shared on Twitter.

Because of the increased social media attention, the article's altmetrics shot through the roof, prompting cynics to argue that altmetrics must be flawed: they were measuring attention the paper received not for its quality but for a silly citation mistake.

We have a different perspective. Altmetrics aggregators like Altmetric.com are useful precisely because they can capture and share data about this silly mistake. After all, these services expose the underlying qualitative data that shows exactly what people are saying when they mention a paper:

[Screenshot: Altmetric.com details page showing individual mentions of the paper]

That exposure is crucial: it helps readers understand why a paper has been mentioned. Compare that to how traditional citation indices compile a paper's citations: simple numbers, zero context.

Towards sustainable software at WSSSPE2

The 2nd Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2) met in New Orleans to discuss what barriers exist to the long-term sustainability of code. Turns out, there are many, and they’re mostly rooted in a lack of proper incentives for software creators.

Many interesting papers were presented at the meeting. Links to all of the conference papers, along with detailed notes from the meeting, can be found on the WSSSPE2 website.

Other altmetrics & open science news

  • Are you a science blogger? Paige needs you! Paige Brown Jarreau (also known as Twitter’s @fromthelabbench) is studying science blogging practices for her PhD research. Respond to her “Survey for Science Bloggers” here.

  • Major UK research funder describes how they use altmetrics to gauge public engagement for the studies they fund: the Wellcome Trust recently published an article in PLOS Biology that outlines how the group uses altmetrics to learn about the societal impacts of the work they support. They also suggest that altmetrics may play an important role in research evaluation in the years to come. Read the full article here.

  • Meet Impactstory’s Advisor of the Month, Open Access superadvocate Keita Bando: We chatted with Keita about his work founding MyOpenArchive, a precursor to Figshare. Keita also shared his perspective on librarians’ role in helping scientists navigate the often-complicated options they have for practicing Open Science, and gave us a look at his app that makes it much easier for researchers to sync their Mendeley and ORCID profiles. Read the full interview here on the Impactstory blog.

  • How a rip-off website is polluting Google Scholar profiles with spam citations: the Retraction Watch blog reports that a website created to illegally resell content to Chinese academics is inadvertently adding fake publications to some scientists’ Google Scholar profiles. Individual scholars can remove spam citations from their profiles themselves. Read more on Retraction Watch.

  • You can now batch upload publications to your ORCID profile: researcher identification service ORCID is great at finding your publications and other scholarly outputs automatically, but sometimes you need to add content manually. Luckily, ORCID just added a BibTeX import option, meaning you can upload many documents at once using a file that’s easily exported from Mendeley, ADS, and many other reference management applications.
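If you haven’t worked with BibTeX before: it’s a plain-text format, and one file can hold any number of entries, which is what makes it handy for batch uploads. A minimal entry looks something like this (the author, title, and citation key below are made up for illustration):

```bibtex
% One entry per publication; the key (smith2014example) is arbitrary
@article{smith2014example,
  author  = {Smith, Jane and Doe, John},
  title   = {An Example Article Title},
  journal = {Journal of Examples},
  year    = {2014},
  volume  = {12},
  pages   = {1--10}
}
```

Export a file like this from your reference manager, then upload it to your ORCID record via the new BibTeX import option.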

  • Standards for scientific graphics from 1914 could make modern Open Science better: Jure Triglav recaps some important reports from the early 1900s that have since been largely forgotten. In them, some of science’s brightest minds set rules for creating scientific graphics, including no-brainers like “Display the data alongside the graph,” which for some reason we don’t regularly practice today. Triglav also offers a proof-of-concept showing how open data can be used to create better, more informative graphics in the modern era. Read the full article here.

  • Older articles more likely to be cited, now that they’re increasingly online: Google Scholar reports that there’s been a 28% growth in citations to older articles between 1990 and 2013 due to their increased availability online. It’s a great argument for “opening up” your older scholarship, isn’t it?

What was your favorite open science or altmetrics happening from November?

Ours was probably the Impactstory November Impact Challenge, but we couldn’t fit it into this roundup! (See what we did there?) Share your favorite news in the comments below.
