What’s our impact? (Dec. 2014)

Back in August, we started sharing our outreach and growth statistics, warts and all. That’s because we’re committed to radical transparency, and had a hunch that our users–who are interested in quantitative measures of impact–would be curious to see the numbers.

After using this format to share our stats for five months, we’ve decided to move away from blogging them in favor of a centralized, easier-to-read format: a simple Google Spreadsheet, open to all.

Below, we share our numbers in blog format for the final time, and provide a link to the Google Spreadsheet where we’ll share our stats from here on out.

Here are our outreach numbers for December 2014*.

impactstory.org traffic

  • Visitors: 3,504 total; 2,189 unique
  • New Users: 195
  • Conversion rate: 8.9% (% of visitors who signed up for a trial account; see the sketch after the Twitter stats for how we derive these percentages)

Blog stats

  • Unique visitors: 6,911
  • Clickthrough rate: 0.8% (% of people who visited Impactstory.org from the blog)
  • Conversion rate: 17.5% (% of visitors to Impactstory.org from blog who then signed up for a trial Impactstory account)
  • Percent of new user signups: 5.1%

Twitter stats

  • New followers: 262
  • Increase in followers from November: 5.3%
  • Mentions: 173 (We’re tracking this to answer the question, “How engaged are our followers?”)
  • Tweet reach: ~398,800 (We’re tracking this–the number of people who potentially saw a tweet mentioning Impactstory or our blog–to understand our brand awareness)
  • Clickthroughs: 131
  • Conversions: 14
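In case you want to reproduce these percentages yourself, here’s a minimal sketch of the arithmetic (our own illustration, not a script we actually run). It assumes the conversion rate is new signups divided by unique site visitors, and that the clickthrough rate is blog-to-site referrals divided by unique blog visitors.

    # Rough sketch of how the rates above can be derived (assumed definitions).
    def rate(numerator, denominator):
        """Return a percentage rounded to one decimal place."""
        return round(100.0 * numerator / denominator, 1)

    unique_visitors = 2189   # impactstory.org, December 2014
    new_users = 195
    print(rate(new_users, unique_visitors))   # -> 8.9, the conversion rate above

    blog_visitors = 6911
    # The referral count isn't reported above; a 0.8% clickthrough rate on
    # 6,911 unique blog visitors implies roughly 55 visits to impactstory.org.
    print(round(0.008 * blog_visitors))       # -> 55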

What does it all mean?

We’re seeing a familiar, seasonal dip in traffic for our blog, Twitter, and impactstory.org as the semester winds down and researchers take leave from the lab to celebrate the holidays. However, impactstory.org traffic is up 25% from the same period last year, our Twitter followers have more than doubled since late January, and we’ve seen steady growth in traffic to our blog over the past twelve months.

We’re pleased with our growth to date, and look forward to sharing our future growth here.

Thanks for joining us in this experiment in radical transparency! And Happy New Year!

* These numbers were recorded at 10 am MDT on Dec. 31st, 2014.

Open Science & Altmetrics Monthly Roundup (November 2014)

In this month’s roundup: OpenCon 2014, “the crappy Gabor paper” snafu, sexier scientific graphics and 7 other ways November was a big month for Open Science and altmetrics. Read on!

“Generation Open” organizes itself at OpenCon 2014

An international group of students and early career researchers met at the first annual OpenCon meeting in Washington DC. While we couldn’t attend in person, we followed the conference via Twitter and livestream, as did many others from around the world. It was excellent.

Among the participants were Patrick Brown (PLOS), Victoria Stodden (Stanford), Erin McKiernan (Wilfrid Laurier University), and Pete Binfield (PeerJ), all of whom shared two important messages with the attendees: 1) even as a junior scientist, you can make choices that support open access, but 2) don’t feel bad if you’re not yet an OA revolutionary–it’s the responsibility of more senior scientists to help pave the way and align your career incentives with the OA movement’s aims.

You can read a great summary of OpenCon on the Absolutely Maybe blog, and watch OpenCon sessions for yourself on YouTube.

“The crappy Gabor paper” citation snafu

A recently published journal article gained some unwanted attention in November thanks to a copyediting error: the authors’ note-to-self to cite “the crappy Gabor paper” was left in the version of the article that made it to press, which someone found and shared on Twitter.

Because of the increased social media attention, the article’s altmetrics shot through the roof, prompting cynics to argue that altmetrics must be flawed because they measure the attention a paper receives not for its quality but for a silly citation mistake.

We have a different perspective. Altmetrics aggregators like Altmetric.com are useful precisely because they can capture and share data about this silly mistake. After all, these tools expose the underlying, qualitative data that shows exactly what people are saying when they mention a paper:

[Screenshot: Altmetric.com’s qualitative mention data for the article]

That exposure is crucial to allowing viewers to better understand why a paper’s been mentioned. Compare that to how traditional citation indices tend to compile citation numbers of a paper: simple numbers, zero context.

Towards sustainable software at WSSSPE2

The 2nd Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2) met in New Orleans to discuss what barriers exist to the long-term sustainability of code. Turns out, there are many, and they’re mostly rooted in a lack of proper incentives for software creators.

Many interesting papers were presented at the meeting. Links to all conference papers and detailed notes from the meeting can be found on the WSSSPE2 website.

Other altmetrics & open science news

  • Are you a science blogger? Paige needs you! Paige Brown Jarreau (also known as Twitter’s @fromthelabbench) is studying science blogging practices for her PhD research. Respond to her “Survey for Science Bloggers” here.

  • Major UK research funder describes how they use altmetrics to gauge public engagement for the studies they fund: the Wellcome Trust recently published an article in PLOS Biology that outlines how the group uses altmetrics to learn about the societal impacts of the work they support. They also suggest that altmetrics may play an important role in research evaluation in the years to come. Read the full article here.

  • Meet Impactstory’s Advisor of the Month, Open Access superadvocate Keita Bando:  We chatted with Keita about his work founding MyOpenArchive, a precursor to Figshare. Keita also shared his perspective on librarians’ role in helping scientists navigate the often-complicated options they have for practicing Open Science, and gave us a look at his app that makes it much easier for researchers to sync their Mendeley and ORCID profiles. Read the full interview here on the Impactstory blog.

  • How a rip-off website is polluting Google Scholar profiles with spam citations: the Retraction Watch blog reports that a website created to illegally resell content to Chinese academics is inadvertently adding fake publications to some scientists’ Google Scholar profiles. Individual scholars can remove spam citations from their profiles themselves. Read more on Retraction Watch.

  • You can now batch upload publications to your ORCID profile: researcher identification service ORCID is great at finding your publications and other scholarly outputs automatically, but sometimes you need to add content manually. Luckily, ORCID just added a BibTeX import option, meaning you can upload many documents at once using a file that’s easily exported from Mendeley, ADS, and many other reference management applications (see the short BibTeX sketch after this list).

  • Standards for scientific graphics from 1914 could make modern Open Science better: Jure Triglav recaps some important reports from the early 1900s that have largely been forgotten. In them, some of science’s brightest minds set rules for creating scientific graphics, including no-brainers like “Display the data alongside the graph”–which for some reason we don’t regularly practice today. Triglav also offers a proof-of-concept for how open data can be used to create better, more informative graphics in the modern era. Read the full article here.

  • Older articles more likely to be cited, now that they’re increasingly online: Google Scholar reports that there’s been a 28% growth in citations to older articles between 1990 and 2013 due to their increased availability online. It’s a great argument for “opening up” your older scholarship, isn’t it?
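If you’re wondering what such a file contains, here’s a minimal sketch that writes a one-entry BibTeX file by hand; the entry itself is invented for illustration, and in practice you’d simply export a .bib file from Mendeley or another reference manager and upload it through ORCID’s BibTeX import.

    # Build a tiny BibTeX file of the kind ORCID's BibTeX import accepts.
    # The entry below is entirely made up for illustration; a real .bib file
    # would normally be exported straight from Mendeley, ADS, etc.
    entry = """@article{doe2014example,
      author  = {Doe, Jane},
      title   = {An Example Article},
      journal = {Journal of Illustrative Results},
      year    = {2014},
      doi     = {10.1234/example.5678}
    }
    """

    with open("works.bib", "w") as bibfile:
        bibfile.write(entry)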

What was your favorite open science or altmetrics happening from November?

Ours was probably the Impactstory November Impact Challenge, but we couldn’t fit it into this roundup! (See what we did there?) Share your favorite news in the comments below.

Why Nature’s “SciShare” experiment is bad for altmetrics

Early last week, Nature Publishing Group announced that 49 titles on Nature.com will be made free to read for the next year. They’re calling this experiment “SciShare” on social media; we’ll use the term as a shorthand for their initiative throughout this post.

Some have credited Nature for taking an incremental step towards embracing Open Access. Others have criticized the company for diluting true Open Access and encouraging scientists to share DRM-crippled PDFs.

As staunch Open Access advocates ourselves, we agree with our board member John Wilbanks: this ain’t OA. “Open” means open to anyone, including laypeople searching Google, who don’t have access to Nature’s Magic URL. “Open” also means open for all types of reuse, including tools to mine and build next-generation value from the scholarly literature.

But there’s another interesting angle here, beyond the OA issue: this move has real implications for the altmetrics landscape. Since we live and breathe altmetrics here at Impactstory, we thought it’d be a great time to raise some of these issues.

Some smart people have asked, “Is SciShare an attempt by Nature to ‘game’ their altmetrics?” That is, is SciShare an attempt to force readers to view content on Nature.com, thereby increasing total pageview statistics for the company and their authors?

Postdoc Ross Mounce explains:

If [SciShare] converts some dark social sharing of PDFs into public, trackable, traceable sharing of research via non-dark social means (e.g. Twitter, Facebook, Google+ …) this will increase the altmetrics of Nature relative to other journals and that may in-turn be something that benefits Altmetric.com [a company in which Macmillan, Nature’s parent company, is an investor].

No matter Nature’s motivations, SciShare, as it’s implemented now, will have some unexpected negative effects on researchers’ ability to track altmetrics for their work. Below, we describe why, and point to some ways that Nature could improve their SciShare technology to better meet researchers’ needs.

How SciShare works

SciShare is powered by ReadCube, a reference manager and article rental platform that’s funded by Macmillan via their science start-up investment imprint, Digital Science.

Researchers with subscription access to an article on Nature.com copy and paste a special, shortened URL (e.g. http://rdcu.be/bKwJ) into email, Twitter, or anywhere else on the Web.

Readers who click on the link are directed to a version of the article that they can freely read and annotate in their browser, thanks to ReadCube. Readers cannot download, print, or copy from the ReadCube PDF.

The ReadCube-shortened URL resolves to a Nature-branded, hashed URL that looks like this:

[Screenshot: the long, hashed Nature.com URL that the short link resolves to]

The resolved URL doesn’t include a DOI or other permanent identifier.

In the ReadCube interface, users who click on the “Share” icon see a panel that includes a summary of Altmetric.com-powered altmetrics (seen here in the lower left corner of the screen):

[Screenshot: the ReadCube share panel, with the Altmetric.com summary in the lower left corner]

The ReadCube-based Altmetric.com metrics do not include pageview numbers. And because ReadCube doesn’t work with assistive technology like screen readers, it also prevents tracking the small portion of traffic that visually impaired readers might account for.

That said, the potential for tracking new, ReadCube-powered metrics is interesting. ReadCube allows annotations and highlighting of content, and could potentially report both raw numbers and also describe the contents of the annotations themselves.

The number of redirects from the ReadCube-branded, shortened URLs could also be illuminating, especially when reported alongside direct traffic to the Nature.com-hosted version of the article. (Such numbers could provide hard evidence of the proportion of OA vs. toll-access use of Nature journal articles.) And sources of Web traffic give a lot of context to the raw pageview numbers, as we’ve seen from publishers like PeerJ:

[Screenshot: referral-source breakdown for an article on PeerJ]

After all, referrals from Reddit usually mean something very different than referrals from PubMed.

Digital Science’s Timo Hannay hints that Nature will eventually report download metrics for their authors. There’s no indication as to whether Nature intends to disclose any of the potential altmetrics described above, however.

So, now that we know how SciShare works and the basics of how they’ve integrated altmetrics, let’s talk about the bigger picture. What does SciShare mean for researchers’ altmetrics?

How will SciShare affect researchers’ altmetrics?

Let’s start with the good stuff.

Nature authors will probably reap a big benefit thanks to SciShare: they’ll likely have higher pageview counts for the Nature.com-hosted version of their articles.

Another positive aspect of SciShare is that it provides easy access to Altmetric.com data. That’s a big win in a world where not all researchers are aware of altmetrics. Thanks to ReadCube’s integration of Altmetric.com, now more researchers can find their article’s impact metrics. (We’re also pleased that Altmetric.com will get a boost in visibility. We’re big fans of their platform, as well as customers–Impactstory’s Twitter data comes from Altmetric.com).

SciShare’s also been implemented in such a way that the ReadCube DRM technology doesn’t affect researchers’ ability to bookmark SciShare’d articles on reference managers like Mendeley. Quick tests with the Pocket and Delicious bookmarking services also suggest they work well. That means that social bookmarking counts for an author’s work will likely not decline. (I point this out because when I attempted to bookmark a ReadCube.com-hosted article using my Mendeley browser bookmarklet on Thursday, Dec. 4th, I was prevented from doing so, and actually redirected to a ReadCube advertisement. I’m glad to say this no longer seems to be true.)

Those are the good things. But there are also a few issues to be concerned about.

SciShare makes your research metrics harder to track

The premise of SciShare is that you’ll no longer copy and paste an article’s URL when sharing content. Instead, they encourage you to share the ReadCube-shortened URL. That can be a problem.

In general, URLs are difficult to track: they contain weird characters that sometimes break altmetrics aggregators’ search systems, and they go dead often. In fact, there’s no guarantee that these links will be live past the next 12 months, when the SciShare pilot is set to end.

Moreover, neither the ReadCube URL nor the long, hashed Nature.com-hosted URL that it resolves to contains the article’s DOI. DOIs are one of the main ways that altmetrics tracking services like ours at Impactstory can find mentions of your work online. They’re also preferable to use when sharing links because they’ll always resolve to the right place.

So what SciShare essentially does is introduce two new messy URLs that will be shared online, and that have a high likelihood of breaking in the future. That means there’s a bigger potential for messier data to appear in altmetrics reports.
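To make that concrete, here’s a minimal sketch (our illustration, not code from Nature or any altmetrics aggregator) of how a tracker might pull a DOI out of a shared link. The nature.com DOI link below is invented for the example, while the rdcu.be link uses the shortener format described above.

    import re

    # Crossref-style DOIs start with "10.", a registrant code, and a suffix.
    DOI_PATTERN = re.compile(r'10\.\d{4,9}/[^\s"<>]+')

    def extract_doi(url):
        """Return the first DOI-like string found in a URL, or None."""
        match = DOI_PATTERN.search(url)
        return match.group(0) if match else None

    # A DOI-bearing link (hypothetical article) is easy to match...
    print(extract_doi("http://dx.doi.org/10.1038/nature99999"))  # -> 10.1038/nature99999

    # ...but the SciShare short link carries no DOI, so simple matching like
    # this finds nothing to aggregate mentions against.
    print(extract_doi("http://rdcu.be/bKwJ"))                    # -> None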

SciShare’s metrics aren’t as detailed as they could be

The Altmetric.com-powered altmetrics that ReadCube exposes are fantastic, but they lack two important metrics that other data providers expose: citations and pageviews.

On a standard article page on Nature.com, there’s an Article Metrics tab. The Metrics page includes not only Altmetric.com data but also citation counts from CrossRef, Web of Science, and Scopus, plus pageview counts. And completely separate systems like Impactstory.org and PlumX expose still more citation data, sourced from Wikipedia and PubMed. (We’d provide pageview data if we could. But that’s currently not possible. More on that in a minute.)

ReadCube’s deployment of Altmetric.com data also decontextualizes articles’ metrics. They have chosen to show only the summary view of the metrics, with a link out to the full Altmetric.com report:

[Screenshot: the summary-only Altmetric.com view in ReadCube]

Compare that to what’s available on Nature.com, where the Metrics page showcases the Altmetric.com summary metrics plus Altmetric.com-sourced Context statements (“This article is in the 98th percentile compared to articles published in the same journal”), snippets of news articles and blog posts that mention the article, a graph of the growth in pageviews over time, and a map that points to where your work was shared internationally:

[Screenshot: the full Article Metrics page on Nature.com]

More data and more context make metrics far more valuable when you’re presenting them. So, we think this is a missed opportunity for the SciShare pilot.

SciShare isn’t interoperable with all altmetrics systems

Let’s assume that the SciShare experiment results in a boom in traffic to your article on Nature.com. What can you do with those pageview metrics?

Nature.com–like most publishers–doesn’t share its pageview metrics via an API. That means you have to manually look up, copy, and paste those numbers each time you want to record them. Not an insurmountable barrier to data reuse, but still–it’s a pain.

Compare that to PLOS. They freely share article view and download data via API, so you can easily import those numbers to your profile on Impactstory or PlumX, or export them to your lab website, or parse them into your CV, and so on. (Oh, the things you can do with open altmetrics data!)
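For example, here’s a rough sketch of pulling those counts from the PLOS Article-Level Metrics (ALM) API. The endpoint and parameters shown are our best recollection of the v5 API and the DOI is a placeholder, so treat the details as assumptions and check the current ALM documentation before relying on them.

    import json
    import urllib.parse
    import urllib.request

    # Assumed endpoint and parameters for the PLOS ALM API (v5, circa 2014);
    # an API key may also be required. The DOI below is a placeholder.
    doi = "10.1371/journal.pone.0000000"
    query = urllib.parse.urlencode({"ids": doi, "info": "summary"})
    url = "http://alm.plos.org/api/v5/articles?" + query

    with urllib.request.urlopen(url) as response:
        data = json.load(response)

    # Inspect the response; the summary typically rolls usage up into buckets
    # such as "viewed", "saved", "discussed", and "cited".
    print(json.dumps(data, indent=2))

Once those numbers are in hand, they can be re-displayed, archived, or dropped into a CV without any manual copying and pasting.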

You also cannot use the ReadCube or hashed URLs to embed the article full-text into your Impactstory profile or share it on ResearchGate, meaning that it’s as difficult as ever to share the publisher’s version of your paper in an automated fashion. It’s also unclear whether the “personal use” restriction on SciShare links means that researchers will be prohibited from saving links publicly on Delicious, posting them to their websites, and so on.

How to improve SciShare to benefit altmetrics

We want to reiterate that we think SciShare’s great for our friends at Altmetric.com, due to their integration with ReadCube. And the greater visibility that their integration brings to altmetrics overall is important.

That said, there’s a lot that Nature can do to improve SciShare for altmetrics. The biggest and most obvious idea is to do away with SciShare altogether and simply make their entire catalogue Open Access. But it looks like Nature (discouragingly) is not ready to do this, and we’re realists. So, what can Nature do to improve matters?

  • Open up their pageview metrics via API to make it easier for researchers to reuse their impact metrics however they want
  • Release ReadCube resolution, referral traffic and annotation metrics via API, adding new metrics that can tell us more about how content is being shared and what readers have to say about articles
  • Add more context to the altmetrics data they display, so viewers have a better sense of what the numbers actually mean
  • Do away with hashed URLs and link shorteners, especially the latter, which make it difficult to track all mentions of an article on social media

We’re hopeful that SciShare overall is an incremental step towards full OA for Nature. And we’ll be watching how the SciShare pilot changes over time, especially with respect to altmetrics.

Update: Digital Science reports that the ReadCube implementation has been tested to ensure compatibility with most screen readers.

What’s our impact? (September & October 2014)

As a platform for users interested in data, we want to share some data-driven stories about our successes (and challenges) in spreading the word about Impactstory throughout September and October.

Here are our numbers for September and October 2014.

Organic site traffic

We saw an unusual spike in traffic in August and September, due to announcements we made about Impactstory’s changing business model, which became subscription-based on September 15. October’s traffic is a return to pre-announcement levels.

  • Unique Visitors: 12,303 total
    • 4263 visitors in September
    • 3249 visitors in October
  • New Users: 748
  • Conversion rate: ~9.9% (% of visitors who signed up for a new user account)

Blog stats

  • Unique visitors: 13,077
    • 5866 visitors in September – up 33.9% from August
    • 7211 visitors in October – up 22.9% from September
  • Clickthrough rate: 2.01% total (% of people who visited Impactstory.org from the blog)
    • 2.4% in September
    • 1.6% in October
  • Conversion rate: 20.3% (9.8% for August) (% of visitors to impactstory.org from blog who signed up for an Impactstory.org account)
  • Percent of new user signups: 6.9% (1.8% for August)

Twitter stats

  • New followers: 482 (215 for August)
  • Increase in followers from August: 11.19%
    • 4.29% growth in Sept
    • 5.38% growth in Oct
  • Mentions: 688 (346 for August) (We’re tracking this to answer the question, “How engaged are our followers?”)
  • Tweet reach: 8,703,155 (3,543,827 for August) (We’re tracking this–the number of people who potentially saw a tweet mentioning Impactstory or our blog–to understand our brand awareness)
  • Referrals to Impactstory.org: 618 (271 for August)
  • Signups: 43 (32 for August)

What does it all mean?

Not all of these numbers are directly comparable, as they track growth over a 60-day period rather than the 30-day period our August numbers are based on.

  • Impactstory.org: Our conversion rates on Impactstory.org went down from August, meaning that the site design changes we enacted in September had a different effect than we anticipated. We’re continuing to roll out changes that will improve conversion rates in the coming months.

  • Blog: We’re continuing to write blogs that our users find worthy of sharing and reading, as evidenced by our double-digit readership growth throughout September and October. (Looks like we really nailed the content marketing goals we set for ourselves in the last post.) Our conversion rates went down from September to October, however, perhaps due to the fact that we didn’t blog about Impactstory much during October–meaning that new readers had little reason to click through and sign up for an Impactstory account. In addition to our November Impact Challenge, we’ll probably do more blogging in November about Impactstory itself.

  • Twitter: Our Twitter followers and mentions continue to grow at the same pace as in August, while our tweet reach grew slightly. That indicates that we’re doing about the same amount of engagement, while also writing tweets that our followers find more worthy of mentioning to others via retweets. We have yet to crack the 5,000 follower mark, meaning we missed the goal we set for ourselves in August. We’re aiming to meet that goal by the end of November instead.

We’re going to continue to blog our progress over the next few months, while also keeping our eyes peeled for ways to share this data in a more automated fashion. If you have questions or comments, we welcome them in the comments below.

Updated 12/15/2014 and 12/31/2014 to fix a data reporting error in impactstory.org visits (previously reported values reflected total visitors, not unique visitors) and conversion rates for visitors to impactstory.org from blog.

Open Science & Altmetrics Monthly Roundup (October 2014)

Open Access Week dominated Open Science conversation this month, along with interesting UK debates on metrics and several valuable studies being released. Read on for more on all of it!

UK debates use of metrics in research evaluation

Academia’s biggest proponents and critics of altmetrics descended on the University of Sussex on October 7 for the event, “In Metrics We Trust?”.

Some of the most interesting finds shared at the meeting?

  • REF peer reviewers admitted they spend less than 15 minutes reviewing papers for quality, due to the sheer volume of products that need evaluation,

  • Departmental h-indices tend to correlate with REF/RAE evaluations, leading some to argue that time and money could be saved by replacing future REF exercises with metrics, and

  • Leading bibliometrics researchers disagree on whether altmetrics could be used for evaluation. Some said they cannot, no matter what; others said that they can, because altmetrics measure different impacts than citations measure.

The meeting ended with no clear answer as to whether metrics are definitely right (or wrong) for use in the next REF. We’ll have to wait for the HEFCE metrics review committee’s recommendations when they issue their report in June.

Until then, check out the full debate in our Storify of the event, as well as Ernesto Priego’s archive of related tweets.

OA Week 2014 recap

The Impactstory team is still recovering from Open Access Week 2014, which saw us talking to over 100 researchers and librarians in 9 countries over 5 days. A full recap of our talks can be found on the Impactstory blog, along with “The Right Metrics for Generation Open: a Guide to getting credit for Open Science”, based on our most popular webinar from the week.

Interest in Open Access and Open Science has risen over the past year, making this year’s Open Access Week successful according to all reports. As Heather Morrison has documented on her blog, the past year has seen an increase in the availability of OA documents and data–ArXiv.org alone has grown by 11%!

Other altmetrics & Open Science news

Did we miss anything?

What was your favorite event or new study released this month? Share it in the comments below, or on Twitter (you can find us @Impactstory).

Open Access Week 2014 – a look back and ahead


Much like Lil Bub, we’re bushed. Our Open Access Week 2014 was very eventful–we spoke with more than 100 researchers and librarians in 9 countries over 5 days. Here’s how we spent our time.

Throughout the week, Stacy hosted several sessions of “The Right Metrics for Generation Open: a guide to getting credit for Open Science,” where she talked about how Generation Open’s needs are evolving beyond those of previous generations of scientists. Altmetrics are particularly well-suited to meet those needs. You can view her slides on Slideshare, and read a long-form blogpost based on the presentation here on our blog.

Tuesday saw Stacy talking with faculty and librarians at the University of Alberta and the University of Memphis, where she explained “Why Open Research is Critical to your career”. (tl;dr: Change is coming in scholarly communication–so you should get on board and start making the most of the great opportunities that Open Science and altmetrics can offer you.) Check out her slides on Google Docs.

On Wednesday, Stacy had the pleasure of hangin’ virtually with librarians and library students at the University of Wisconsin, where they talked about the fact that “Altmetrics are here: are you ready to help your faculty?” After all, who’s a better neutral third-party to help faculty navigate this new world of altmetrics than librarians? Slides from that presentation are available on Google Docs.

Jason gave his popular talk, “Altmetrics & Revolutions: How the web is transforming the measure and practice of science” to researchers at the University of New Brunswick on Thursday. His slides are available on Google Docs.

Stacy rounded out the week by chatting with researchers and librarians at the University of Cape Town on Friday. Her presentation on the basics of altmetrics and how to use them–”Altmetrics 101: How to make the most of supplementary impact metrics”–is available for viewing on Google Docs.

Heather’s going to be the one in need of a nap over the next two weeks–she’ll be presenting on open data and altmetrics throughout Australia. Here are the events ahead:

  • Melbourne, Mon, 27 Oct, 9am–12.30pm:  Creating your research impact story:  Workshop at eResearch Australasia. Also featuring: Pat Loria, CSU, and Natasha Simons, ANDS. (sold out)

  • Melbourne, Wed, 29 Oct, 10–11am: Keynote presentation at eResearch Australasia. Register for conference

  • Brisbane, Mon, 3 Nov, 1–4.30pm: Uncovering the Impact Story of Open Research and Data. Also featuring Paula Callan, QUT, and Ginny Barbour, PLOS. QUT: Owen J Wordsworth Room. Level 12, S Block, QUT Gardens Point. (sold out)

  • Sydney, Wed, 5 Nov, 1.30–4.30pm: An afternoon of talks featuring Heather Piwowar. Also featuring Maude Frances, UNSW, and Susan Robbins, UWS. ACU MacKillop Campus, North Sydney: The Peter Cosgrove Centre, Tenison Woods House, 8-20 Napier Street, North Sydney. (sold out)

Are you ready, Oz?!


Open Science & Altmetrics Monthly Roundup (September 2014)

September 2014 saw Elsevier staking its claim in altmetrics research, one scientist’s calculations of the “opportunity cost” of practicing Open Science, and a whole lot more. Read on!

Hundreds attend 1am:London Conference

PLOS’s Jennifer Lin presents at 1am:London (photo courtesy of Mary Ann Zimmerman)

Researchers, administrators, and librarians from around the world convened in London on September 25 and 26 to debate and discover at 1am:London, a conference devoted exclusively to altmetrics.

Some highlights: Sarah Callaghan (British Atmospheric Data Centre), Salvatore Mele (CERN) and Daniel Katz (NSF) discussed the challenges of tracking impacts for data and software; Dan O’Connor (Wellcome Trust) outlined the ethical implications of performing altmetrics research on social media, and our Director of Marketing & Research, Stacy Konkiel, shared where Impactstory has been in the past year, and where we’re headed in the next (check out her slides here).

As you might expect, 1am:London got a lot of social media coverage! Check out the Twitter archive here, watch videos of all the sessions here, and read recaps of the entire meeting over on the conference blog.

Elsevier announces increased focus on altmetrics

Elsevier is pledging increased organizational support for altmetrics research initiatives across the company in the coming year. According to their Editors Update newsletter, the publishing monolith will begin experimenting with the display of Altmetric.com data on journal websites. (Likely related: this altmetrics usability study, for which Elsevier is offering participants US$100 honoraria; sign up here to participate.) The company also recently announced that Mendeley will soon integrate readership data into authors’ dashboards.

NISO survey results reveal more concern with definitions than gaming

The American information standards organization, NISO, surveyed researchers to determine the most important “next steps” for altmetrics standards and definitions development. Interestingly, one of the most common concerns related to the use of altmetrics in assessment–gaming–ranked lower than setting definitions. Promoting the use of persistent identifiers and determining the types of research outputs that are best to track altmetrics for also ranked highly. Check out the full results over on the NISO site.

Other Open Science & Altmetrics news

  • California becomes first US state to pass an Open Access bill: The California Taxpayer Access to Publicly Funded Research Act (AB609) was signed into law by Gov. Jerry Brown in late September, making California the first state in the nation to mandate Open Access for state-funded research. Specifically, the bill requires researchers funded by the CA Department of Public Health to make copies of resulting articles available in a publicly accessible online database. Let’s hope the saying, “As California goes, so goes the nation” proves true with respect to Open Access! Read more about the bill and related news coverage on the SPARC website.

  • Nature Communications is going 100% Open Access: the third-most cited multidisciplinary journal in the world will go fully Open Access in October 2014. Scientists around the world cheered the news on Twitter, noting that Nature Communications will offer CC-BY as the default license for articles. Read more over on Wired UK.

  • “Science” track proposals announced for Mozilla Festival 2014: The proposals include killer Open Science events like “Open Science Badges for Contributorship,” “Curriculum Mapping for Open Science,” and “Intro to IPython Notebook.” The Festival will occur in London on October 24-26. To see the full list of proposed Science sessions and to register, visit the Mozilla Festival website.

  • Impactstory launches new features, sleek new look: last month, we unveiled cool new functionalities for Impactstory profiles, including the ability to add new publications to your profile just by sending an email. The redesigned site also better showcases the works and metrics you’re most proud of, with new “Selected Works” and “Key Metrics” sections on your profile’s homepage. Check out our blog for more information, or login to your Impactstory profile to discover our new look.

  • Research uncovers a new public impact altmetrics flavor–“good for teaching”: bibliometrician Lutz Bornmann has shown that papers tagged on F1000 as being “good for teaching” tend to have higher instances of Facebook and Twitter metrics–types of metrics long assumed to relate more to “public” impacts. Read the full study on ArXiv.

  • PLOS Labs announces Citation Hackathon: citations aren’t as good as they could be: they lack the structure needed to be machine-readable, making them less-than-useful for web-native publishing and citation tracking. PLOS is working to change that. Their San Francisco-based hackathon will happen on Saturday, October 18. Visit the PLOS Labs website for more information.

  • What’s the opportunity cost of Open Science? According to Emilio Bruna, it’s 35 hours and $690. In a recent blog post, Bruna calculates the cost–both in hours and cash–of making his research data, code, and papers Open Access. Read his full account on the Bruna Lab blog.

What was your favorite Open Science or altmetrics happening from September?

We couldn’t cover everything in this roundup. Share your news in the comments below!

Join Impactstory for Open Access Week 2014!

This year, we’re talking Open Science and altmetrics in an Open Access Week 2014 webinar, “The right metrics for Generation Open: a guide to getting credit for practicing Open Science.” We’re also scheduling a limited number of customizable presentations for universities around the world–read on to learn more!

Register for “The Right Metrics for Generation Open”

The traditional way to understand and demonstrate your impact–through citation counts–doesn’t meet the needs of today’s researchers. What Generation Open needs is altmetrics.

In this presentation, we’ll cover:

  • what altmetrics are and the types of altmetrics today’s researchers can expect to receive,
  • how you can track and share those metrics to get all the credit you deserve, and
  • real life examples of scientists who used altmetrics to get grants and tenure

Scientists and librarians across all time zones can attend, because we’re offering it throughout the week at times convenient for you.

Learn more and register here!

Schedule a customizable presentation on Open Science and altmetrics for your university

We’re offering a limited number of customizable, virtual presentations for researchers at institutions around the world on the following topics during Open Access Week 2014 (Oct. 20-26, 2014):

  • The right metrics for Generation Open: a guide to getting credit for practicing Open Science
  • Altmetrics 101: how to make the most of supplementary impact metrics
  • Why Open Research is critical to your career

Learn more about our webinars and schedule one for your department or university here.

What are you doing for Open Access Week?

Will you be attending one of our webinars? Presenting to your department or lab on your own Open Science practices? Organizing a showing of The Internet’s Own Boy with students? Leave your event announcements in the comments below, and over at OpenAccessWeek.org, if you haven’t already!

What Open Science Framework and Impactstory mean to these scientists’ careers

Yesterday, we announced three winners in the Center for Open Science’s random drawing to win a year’s subscription to Impactstory for users who connected their Impactstory profile to their Open Science Framework (OSF) profile: Leonardo Candela (OSF, Impactstory), Rebecca Dore (OSF, Impactstory), and Calvin Lai (OSF, Impactstory). Congrats, all!

We know our users would be interested to hear from other researchers practicing Open Science, especially how and why they use the tools they use. So, we emailed our winners who graciously agreed to share their experiences using the OSF (a platform that supports project management with collaborators and project sharing with the public) and Impactstory (a webapp that helps researchers discover and share the impacts of all their research outputs). Read on!

What’s your research focus?

Leonardo: I’m a computer science researcher. My research interests include Data Infrastructures, Virtual Research Environments, Data Publication, Open Science, Digital Library [Management] Systems and Architectures, Digital Libraries Models, Distributed Information Retrieval, and Grid and Cloud Computing.

Rebecca: I am a PhD student in Developmental Psychology. Broadly, my research focuses on children’s experiences in pretense, fiction and fantasy. How do children understand these experiences? How might these experiences affect children’s behaviors, beliefs and abilities?

Calvin: I’m a doctoral student in Social Psychology studying how to change unconscious or automatic biases. In their most insidious forms, unconscious biases lead to discrepancies between what people value (e.g., egalitarianism) and how people act (e.g., discriminating based on race). My interest is in understanding how to change these unconscious thoughts so that they’re aligned with our conscious values and behavior.

How do you use the Open Science Framework in the course of your research?

Leonardo: Rather than an end user of the system for supporting my research tasks, I’m interested in analysing and comparing the facilities offered by such an environment and the concept of Virtual Research Environments.

Rebecca: At this stage, I use the OSF to keep all of the information about my various projects in one place and to easily make that information available to my collaborators–it is much more efficient to stay organized than constantly exchanging and keeping track of emails. I use the wiki feature to keep notes on what decisions were made and when and store files with drafts of materials and writing related to each project. Version control of everything is very convenient.

Calvin: For me, the OSF encompasses all aspects of the research process – from study inception to publication. I use the OSF as a staging ground in the early stages for plotting out potential study designs and analysis plans. I will then register my study shortly before data collection to gain the advantage of pre-registered confirmatory testing. After data collection, I will often refer back to the OSF as a reminder of what I did and as a guide for analyses and manuscript-writing. Finally, after publication, I use the OSF as a repository for public access to my data and study materials.

What’s your favorite Impactstory feature? Why?

Leonardo: I really appreciate the effort Impactstory is putting into collecting metrics on the impact my research products have on the web. I like its integration with ORCID and the recently supported “Key profile metrics,” since it gives a nice overview of a researcher’s impact.

Rebecca: I had never heard of ImpactStory before this promotion, and it has been really neat to start testing out. It took me 2 minutes to copy my publication DOIs into the system, and I got really useful information that shows the reach of my work that I hadn’t considered before, for example shares on Twitter and where the reach of each article falls relative to other psychology publications. I’m on the job market this year and can see this being potentially useful as supplementary information on my CV.

Calvin: Citation metrics can only tell us so much about the reach of a particular publication. For me, Impactstory’s alternative metrics have been important for figuring out where else my publications are having impact across the internet. It has been particularly valuable for pointing out connections that my research is making that I wasn’t aware of before.

Thanks to all our users who participated in the drawing by connecting their OSF and Impactstory profiles! Both of our organizations are proud to be working to support the needs of researchers practicing Open Science, and thereby changing science for the better.

To learn more about our open source non-profits, visit the Impactstory and Open Science Framework websites.

Open Science & Altmetrics Monthly Roundup (August 2014)

August was a huge month for open science and altmetrics. Here are some of the highlights:

AAAS shrugs off scientists voicing concern

More than 100 scientists signed a letter of concern addressed to AAAS regarding their new “open access” journal, Science Advances–specifically, the journal’s exorbitant publication fees and restrictive licensing requirements.

As Liz Allen over on the ScienceOpen blog reports, the AAAS issued a “classic PR” piece in response. AAAS’s post doesn’t directly address the letter, and doubles down on their commitment to keeping Science Advances prohibitively expensive to publish in and difficult to remix and reuse for the benefit of science.

After a private phone call between AAAS’s Marcia McNutt and Jon Tennant and no indication that AAAS would reconsider their stance, Erin McKiernan and Jon Tennant penned a final article expressing their disappointment in the organization.

Be sure to follow Jon Tennant and Erin McKiernan, who spearheaded the effort to write the letters of concern and are talking candidly on Twitter about further developments (or lack thereof).

International Research, Science and Education Organizations tell STM Publishers: No New Licenses!

In early August, a similar kerfuffle emerged over the issue of content licensing for scientific publications. Creative Commons licenses have been the de facto standard for scientific publishing for years, due to their simplicity and wide recognition, but the Association of Scientific, Technical, and Medical Publishers has released a suite of specialized licenses that some say intentionally confuse authors.

From the PLOS website:

The Association of Scientific, Technical and Medical Publishers has recently released a set of model licenses for research articles. In their current formulation, these licenses would limit the use, reuse and exploitation of research. They would make it difficult, confusing or impossible to combine these research outputs with other public resources and sources of knowledge to the benefit of both science and society. There are many issues with these licenses, but the most important is that they are not compatible with any of the globally used Creative Commons licenses. For this reason, we call on the STM Association to withdraw them and commit to working within the Creative Commons framework. [Click to read the full letter.]

The Association of STM Publishers issued a response, which unfortunately dismissed the concerns raised. Catriona MacCallum and Cameron Neylon at PLOS continue to coordinate outreach on the issue; check out the PLOS Opens blog for the most up-to-date information, and consider contacting your favorite journals’ editorial boards to voice support for Creative Commons licenses.

Other Altmetrics and Open Science News

  • Shape the way publishers credit academic labor and expertise in scientific author lists: What can we do about honorary authorships and uncredited work in academia? CASRAI and ORCID have an idea: create a taxonomy for scientific author roles to help clarify who gets credited (and for what) on the byline of academic articles. Head over to the F1000Research blog to learn more and offer your feedback before Sept. 12.

  • Some scientists don’t find the Kardashian Index very funny: Since we covered a parody impact measure called the Kardashian Index in last month’s roundup, many have weighed in. Turns out, not everyone thought it was very funny, and many (rightly) called out the article for its belittling of scientists who engage others via social media. To read the responses and share your thoughts, visit the LSE Impact Blog.

  • More scholars post, discuss, and comment on research on Twitter than academic social networks like ResearchGate: Nature News surveyed more than 3,000 scientists on their use of social networks, and some of the results were surprising. For example, scientists are primarily on Academia.edu and ResearchGate for the same reason they’re on LinkedIn: just in case they’re contacted. And they more often share their work and follow conversations on Twitter than academic social networking sites. Check out the rest of the reported results over on Nature News.

  • Impactstory, PlumX, and Altmetric add new altmetric indicators: August saw an increase in the types of metrics reported by altmetrics aggregators. Impactstory recently added viewership statistics letting users know how often their embedded content has been viewed on Impactstory.org. PlumX rolled out GoodReads metrics, increasing altmetrics coverage for books. And Altmetric.com now tracks mentions of research articles in policy documents–a big win for understanding how academic research influences public policy.

  • “GitHub for research” raising $1.5 million in funding: The creators of PubChase, Zappy Lab, are seeking funding for Protocols.io, a scientific protocols sharing and reuse repository. In addition to the private, “angel” funding they’ve raised to date, they’re also pursuing crowdfunding via Kickstarter. Check it out today.

  • You can now share your work directly on Impactstory: we’ve gotten a lot of love this week for our newest feature: embeddable content. It’s just one of the many we’re rolling out before September 15. Here’s how embedded articles, slides, and code look on others’ profiles; login to your profile and start sharing your work!

Stay connected

Do you love these updates? You should follow us!

We share altmetrics and Open Science news as it happens on our Twitter, Google+, Facebook, and LinkedIn pages. And if you don’t want to miss next month’s news roundup, remember that you can sign up to get these posts and other Impactstory news delivered straight to your inbox.