Why Nature’s “SciShare” experiment is bad for altmetrics

Early last week, Nature Publishing Group announced that 49 titles on Nature.com will be made free to read for the next year. They’re calling this experiment “SciShare” on social media; we’ll use the term as a shorthand for their initiative throughout this post.

Some have credited Nature for taking an incremental step towards embracing Open Access. Others criticize the company for diluting true Open Access and encouraging scientists to share DRM-crippled PDFs.

As staunch Open Access advocates ourselves, we agree with our board member John Wilbanks: this ain’t OA. “Open” means open to anyone, including laypeople searching Google, who don’t have access to Nature’s Magic URL. “Open” also means open for all types of reuse, including tools to mine and build next-generation value from the scholarly literature.

But there’s another interesting angle here, beyond the OA issue: this move has real implications for the altmetrics landscape. Since we live and breathe altmetrics here at Impactstory, we thought it’d be a great time to raise some of these issues.

Some smart people have asked, “Is SciShare an attempt by Nature to ‘game’ their altmetrics?” That is, is SciShare an attempt to force readers to view content on Nature.com, thereby increasing total pageview statistics for the company and their authors?

Postdoc Ross Mounce explains:

If [SciShare] converts some dark social sharing of PDFs into public, trackable, traceable sharing of research via non-dark social means (e.g. Twitter, Facebook, Google+ …) this will increase the altmetrics of Nature relative to other journals and that may in-turn be something that benefits Altmetric.com [a company in which Macmillan, Nature’s parent company, is an investor].

No matter Nature’s motivations, SciShare, as it’s implemented now, will have some unexpected negative effects on researchers’ ability to track altmetrics for their work. Below, we describe why, and point to some ways that Nature could improve their SciShare technology to better meet researchers’ needs.

How SciShare works

SciShare is powered by ReadCube, a reference manager and article rental platform that’s funded by Macmillan via their science start-up investment imprint, Digital Science.

Researchers with subscription access to an article on Nature.com copy and paste a special, shortened URL (i.e. http://rdcu.be/bKwJ) into email, Twitter, or anywhere else on the Web.

Readers who click on the link are directed to a version of the article that they can freely read and annotate in their browser, thanks to ReadCube. Readers cannot download, print, or copy from the ReadCube PDF.

The ReadCube-shortened URL resolves to a Nature-branded, hashed URL that looks like this:

[Screenshot: the resolved, hashed Nature.com URL]

The resolved URL doesn’t include a DOI or other permanent identifier.

In the ReadCube interface, users who click on the “Share” icon see a panel that includes a summary of Altmetric.com-powered altmetrics (seen here in the lower left corner of the screen):

[Screenshot: the ReadCube “Share” panel, with the Altmetric.com summary in the lower left corner]

The ReadCube-based Altmetric.com metrics do not include pageview numbers. And because ReadCube doesn’t work with assistive technology like screen readers, it also prevents tracking of the small portion of traffic that visually impaired readers might account for.

That said, the potential for tracking new, ReadCube-powered metrics is interesting. ReadCube allows annotations and highlighting of content, and could potentially report both raw numbers and also describe the contents of the annotations themselves.

The number of redirects from the ReadCube-branded, shortened URLs could also be illuminating, especially when reported alongside direct traffic to the Nature.com-hosted version of the article. (Such numbers could provide hard evidence as to the proportion of OA vs. toll-access use of Nature journal articles.) And sources of Web traffic give a lot of context to the raw pageview numbers, as we’ve seen from publishers like PeerJ:

[Screenshot: PeerJ’s referral-traffic breakdown for an article]

After all, referrals from Reddit usually mean something very different from referrals from PubMed.

Digital Science’s Timo Hannay hints that Nature will eventually report download metrics for their authors. There’s no indication as to whether Nature intends to disclose any of the potential altmetrics described above, however.

So, now that we know how SciShare works and the basics of how they’ve integrated altmetrics, let’s talk about the bigger picture. What does SciShare mean for researchers’ altmetrics?

How will SciShare affect researchers’ altmetrics?

Let’s start with the good stuff.

Nature authors will probably reap one big benefit thanks to SciShare: they’ll likely have higher pageview counts for the Nature.com-hosted version of their articles.

Another positive aspect of SciShare is that it provides easy access to Altmetric.com data. That’s a big win in a world where not all researchers are aware of altmetrics. Thanks to ReadCube’s integration of Altmetric.com, now more researchers can find their article’s impact metrics. (We’re also pleased that Altmetric.com will get a boost in visibility. We’re big fans of their platform, as well as customers–Impactstory’s Twitter data comes from Altmetric.com).

SciShare’s also been implemented in such a way that the ReadCube DRM technology doesn’t affect researchers’ ability to bookmark SciShare’d articles in reference managers like Mendeley. Quick tests with the Pocket and Delicious bookmarking services also suggest they work well. That means that social bookmarking counts for an author’s work will likely not decline. (I point this out because when I attempted to bookmark a ReadCube.com-hosted article using my Mendeley browser bookmarklet on Thursday, Dec. 4th, I was prevented from doing so and instead redirected to a ReadCube advertisement. I’m glad to say this no longer seems to be the case.)

Those are the good things. But there are also a few issues to be concerned about.

SciShare makes your research metrics harder to track

The premise of SciShare is that you’ll no longer copy and paste an article’s URL when sharing content. Instead, they encourage you to share the ReadCube-shortened URL. That can be a problem.

In general, URLs are difficult to track: they contain weird characters that sometimes break altmetrics aggregators’ search systems, and they often go dead. In fact, there’s no guarantee that these links will be live past the next 12 months, when the SciShare pilot is set to end.

Moreover, neither the ReadCube URL nor the long, hashed, Nature.com-hosted URL it resolves to contains the article’s DOI. DOIs are one of the main ways that altmetrics tracking services like ours at Impactstory can find mentions of your work online. DOI links are also preferable when sharing because they’ll always resolve to the right place.

So what SciShare essentially does is introduce two new messy URLs that will be shared online, and that have a high likelihood of breaking in the future. That means there’s a bigger potential for messier data to appear in altmetrics reports.
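
To make the problem concrete, here’s a minimal sketch (our own illustration, using a made-up DOI, not code from any real aggregator) of the kind of DOI extraction an altmetrics tool might attempt on links it finds online. It works for DOI-based links, but there’s simply nothing to extract from a ReadCube-shortened or hashed URL:

```python
import re

# Very rough DOI pattern: the "10." prefix, a registrant code, a slash, then a suffix.
DOI_RE = re.compile(r'10\.\d{4,9}/[^\s"<>]+')

def extract_doi(url):
    """Return the DOI embedded in a shared link, or None if there isn't one."""
    match = DOI_RE.search(url)
    return match.group(0) if match else None

# A DOI-based link can be matched back to the article it identifies...
print(extract_doi("http://dx.doi.org/10.1234/example.doi"))  # -> 10.1234/example.doi

# ...but a SciShare-style link carries no identifier an aggregator can recognize.
print(extract_doi("http://rdcu.be/bKwJ"))                    # -> None
```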

SciShare’s metrics aren’t as detailed as they could be

The Altmetric.com-powered altmetrics that ReadCube exposes are fantastic, but they lack two important metrics that other data providers expose: citations and pageviews.

On a standard article page on Nature.com, there’s an Article Metrics tab. The Metrics page includes not only Altmetric.com data but also citation counts from CrossRef, Web of Science, and Scopus, plus pageview counts. And completely separate systems like Impactstory.org and PlumX expose still more citation data, sourced from Wikipedia and PubMed. (We’d provide pageview data if we could. But that’s currently not possible. More on that in a minute.)

ReadCube’s deployment of Altmetric.com data also decontextualizes articles’ metrics. They have chosen only to show the summary view of the metrics, with a link out to the full Altmetric.com report:

[Screenshot: the Altmetric.com summary view and link in ReadCube]

Compare that to what’s available on Nature.com, where the Metrics page showcases the Altmetric.com summary metrics plus Altmetric.com-sourced Context statements (“This article is in the 98th percentile compared to articles published in the same journal”), snippets of news articles and blog posts that mention the article, a graph of the growth in pageviews over time, and a map that points to where your work was shared internationally:

[Screenshot: the Article Metrics page on Nature.com]

More data and more context are very valuable to have when presenting metrics. So, we think this is a missed opportunity for the SciShare pilot.

SciShare isn’t interoperable with all altmetrics systems

Let’s assume that the SciShare experiment results in a boom in traffic to your article on Nature.com. What can you do with those pageview metrics?

Nature.com–like most publishers–doesn’t share their pageview metrics via API. That means you have to manually look up and copy and paste those numbers each time you want to record them. Not an insurmountable barrier to data reuse, but still–it’s a pain.

Compare that to PLOS. They freely share article view and download data via API, so you can easily import those numbers to your profile on Impactstory or PlumX, or export them to your lab website, or parse them into your CV, and so on. (Oh, the things you can do with open altmetrics data!)
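
For example, here’s a rough sketch of pulling those numbers from the PLOS ALM (Lagotto) API. We’re assuming the v5 endpoint and summary response that PLOS documented at the time, and using a made-up DOI, so treat the details as illustrative rather than authoritative:

```python
import json
import urllib.parse
import urllib.request

# Made-up DOI, used purely for illustration.
doi = "10.1371/journal.pone.0000000"

# Lagotto-style v5 endpoint; the summary view rolls source-level data
# up into viewed/saved/discussed/cited totals.
url = ("http://alm.plos.org/api/v5/articles?ids={}&info=summary"
       .format(urllib.parse.quote(doi, safe="")))

with urllib.request.urlopen(url) as response:
    payload = json.load(response)

for article in payload.get("data", []):
    print(article.get("title"))
    print("viewed:   ", article.get("viewed"))     # page views and downloads
    print("saved:    ", article.get("saved"))      # Mendeley and CiteULike bookmarks
    print("discussed:", article.get("discussed"))  # tweets, Facebook activity, etc.
    print("cited:    ", article.get("cited"))      # citation counts
```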

You also cannot use the ReadCube or hashed URLs to embed the article full-text into your Impactstory profile or share it on ResearchGate, meaning that it’s as difficult as ever to share the publisher’s version of your paper in an automated fashion. It’s also unclear whether the “personal use” restriction on SciShare links means that researchers will be prohibited from saving links publicly on Delicious, posting them to their websites, and so on.

How to improve SciShare to benefit altmetrics

We want to reiterate that we think that SciShare’s great for our friends at Altmetric.com, due to their integration with ReadCube. And the greater visibility that their integration brings to altmetrics overall is important.

That said, there’s a lot that Nature can do to improve SciShare for altmetrics. The biggest and most obvious idea is to do away with SciShare altogether and simply make their entire catalogue Open Access. But it looks like Nature (discouragingly) is not ready to do this, and we’re realists. So, what can Nature do to improve matters?

  • Open up their pageview metrics via API to make it easier for researchers to reuse their impact metrics however they want (see the sketch after this list)
  • Release ReadCube resolution, referral traffic and annotation metrics via API, adding new metrics that can tell us more about how content is being shared and what readers have to say about articles
  • Add more context to the altmetrics data they display, so viewers have a better sense of what the numbers actually mean
  • Do away with hashed URLs and link shorteners, especially the latter, which make it difficult to track all mentions of an article on social media
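
To illustrate the first suggestion: if Nature exposed something like the sketch below (a purely hypothetical endpoint and response format, invented here for illustration), researchers and tools could fold Nature pageview counts into profiles, CVs, and lab websites automatically:

```python
import json
import urllib.request

def nature_pageviews(doi):
    """Fetch pageview counts for an article from an imagined open Nature metrics API."""
    # Hypothetical endpoint and response format -- invented for this illustration only.
    url = "https://api.nature.com/metrics/pageviews/" + doi
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Imagined response: {"doi": "...", "html_views": 1234, "pdf_views": 567, "readcube_views": 89}
views = nature_pageviews("10.1038/xxxxx")  # made-up DOI
print(sum(count for key, count in views.items() if key.endswith("_views")))
```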

We’re hopeful that SciShare overall is an incremental step towards full OA for Nature. And we’ll be watching how the SciShare pilot changes over time, especially with respect to altmetrics.

Update: Digital Science reports that the ReadCube implementation has been tested to ensure compatibility with most screen readers.

Impact Challenge Day 4: Connect with other researchers on Mendeley.com

Next up for our Impact Challenge is Mendeley.

Are you surprised? While there was pushback against Mendeley after it was unexpectedly bought by Elsevier a few years ago, and it is marketed more as a reference manager than a social network, Mendeley remains popular with many academics and librarians. It offers ways to connect with other researchers that you can’t find on other platforms.

Mendeley Web (the online counterpart to the desktop reference management software) is similar to Google Scholar in several ways. What’s distinctive about Mendeley is that it offers better opportunities to interact with other researchers and get your research in front of communities that might be interested in it, in a context where they’re largely interacting with scholarship they intend to actually read and cite.

Moreover, Mendeley’s Readership Statistics can tell you a lot about the demographics that have bookmarked your work–an important indicator of who’s reading your work and who might cite it in the future.

We’re also going to talk in this post about Zotero, which is quite similar to Mendeley. We’re big supporters of Zotero because it’s an open-source non-profit, and we see that as a killer feature for science tools. However, although it really shines as a reference manager, Zotero’s community features are less powerful–mostly because they have less activity. So we’ll provide links and information on how to do some of these steps in Zotero, but not in as much detail.

Step 1: Create a profile

Log on to Mendeley.com and click the “Create a free profile” button. Create a login and, on the next screen, enter your general field of study and your academic status (student, professor, postdoc, etc).

As you advance to the next screen, beware: Mendeley Desktop will automatically start downloading to your computer. (You’ll need it to make the next step a bit easier on yourself, but you can also make do without it. Your call.) Download it and install it if you plan to use it for the next step–importing your publications.

Zotero alternative: Log on to Zotero.org, click “Register” in the upper right-hand corner, and register for an account. Once you’ve validated your new account, click your username in the upper right-hand corner (where it says, “Welcome, username!”) and then click the “Edit Profile” link on the next screen to head to the Profile section of your Zotero settings. There, you can create a profile.

Step 2: Import your publications

If you didn’t install Mendeley Desktop, here’s how to add your references manually using Mendeley Web:

  • Click the “My Library” tab, then the “Add Document” icon.

  • On the “Add New Document” dialog box that appears, select “My Publications” from the “Add to” drop-down menu, then use the “Type” drop-down menu to specify what type of document you’re adding to your “My Publications” list (article, book section, thesis, etc).

  • The dialog box will automatically expand, giving you many fields to fill out with descriptive information for that publication. Complete as many as possible, so others can find your publication more easily. If an Open Access link to the full-text of your publication exists, provide it in the URL box. And be sure to add a DOI, if you’ve got one. Click “Save” when finished.

  • Rinse and repeat as necessary, until all your articles are added to your profile.

If you’ve got Mendeley Desktop installed, your job is much easier. Export your publications in .bib format from Google Scholar (which we covered in yesterday’s challenge), and then:

  • Fire up Desktop and select “My Publications” from the “My Library” panel in the upper left corner of the screen.

  • Click File > Import > BibTeX (.bib) on the main menu.

  • On your computer, find the citations.bib file you exported from Google Scholar, select it, and click “Open”. Mendeley will begin to import these publications automagically.

  • In the dialog box that appears, confirm that you are the author of the documents that you’re importing, and that you have the rights to share them on Mendeley. Click “I agree.”

  • Click the “Sync” button at the top of the Desktop screen to sync your local Mendeley library with your Mendeley Web library.

That’s it! You’ve just added all your publications to your Mendeley profile. And you know how to add any missing publications that didn’t auto-import, to boot.

Here’s what your profile page will look like, now that you’ve added publications to your My Publications library:

[Screenshot: a Mendeley profile page listing the newly added publications]

Zotero alternative: to auto-import your publications from a BibTeX file, follow these instructions. To manually add publications, follow these instructions.

Step 3: Follow other researchers

Now you’re ready to connect with other researchers. Consider this step akin to introducing yourself at a conference over coffee: informal, done in passing, and allowing others to put a face to a name.

First, you’ll need to find others to follow. Search for colleagues or well-known researchers in your field by name, using the Mendeley search bar in the upper right-hand corner of Mendeley Web:

[Screenshot: the Mendeley search bar, with “People” selected from the drop-down menu]

Be sure to select “People” from the drop-down menu, so you search for profiles and not for papers that they’ve authored.

When you find their profile, click on their name in the search results, and then click the “Follow” button on the right-hand side of the profile:

[Screenshot: the “Follow” button on a Mendeley researcher profile]

That’s it! Now you’ll receive updates on your Mendeley homepage when they’ve added a new publication to their profile or done something else on the site, like join a group.

Zotero alternative: Zotero works in a very similar way. Search for your colleague, find their profile, and click the red “Follow” button at the top-right of their profile to begin following them.

Step 4: Join groups relevant to your research

If Step 3 was like introducing yourself during a conference coffee break, Step 4 is like joining a “Birds of a Feather” group over lunch, to talk about common interests and get to know each other a bit better.


Mendeley groups are places where researchers interested in a common topic can virtually congregate to post comments and share papers. It’s a good place to find researchers in your field who might be interested in your publications. And it’s also the single best place on the platform to learn about work that’s recently been published and is being talked about in your discipline.

To find a group, search for a subject using the search toolbar you used for Step 3, making sure to select “Groups” from the drop-down menu. Look through the search results and click through to group pages to determine whether the group is still active (some groups were abandoned long ago).

If so, join it! And then sit back and enjoy all the new knowledge that your fellow group members will drop on you in the coming days, which you can view from either the group page or your Mendeley homescreen.

And you can feel free to drop some knowledge on them, too. Share your articles, if relevant to the group’s scope. Pose questions and answer others’ questions. Openly solicit collaborators if you’ve got an interesting project in the pot that you need help on, like Abbas here has:

[Screenshot: a Mendeley group post openly soliciting collaborators]

Use groups like you would any other professional networking opportunity: as a place to forge new connections with researchers you might not have a chance to meet otherwise.

Zotero alternative: Zotero works in a very similar way. Search for a group topic, find a group you want to join, and click the red “Join Group” button at the top of the page.

Step 5: Learn who’s bookmarking your work

Once your work is on Mendeley, you can learn some basic information about who’s saving it in their libraries via Mendeley’s Readership Statistics. And that’s interesting to know because Mendeley bookmarks are a leading indicator for later citations.

To see the readership demographics for your publications, head to the article’s page on Mendeley. On the right side of the screen, you’ll see a small Readership Statistics panel:

[Screenshot: the Readership Statistics panel on a Mendeley article page]

Readership Statistics can tell you how many readers you have on Mendeley (meaning, how many people have bookmarked your publication), what discipline they belong to, their academic status, and their country. Very basic information, to be sure, but it’s definitely more than you’d know about your readers if you were looking at the number of readers alone.

Zotero alternative: Zotero doesn’t yet offer readership statistics or any other altmetrics for publications on their site, but they will soon. Stay tuned!

Limitations

Perhaps the biggest limitation of Mendeley is its association with Elsevier. Mendeley was acquired by the publishing behemoth in early 2013, while the ghastly, Elsevier-backed Research Works Act debacle was still fresh in many academics’ minds.

As danah boyd points out, even after Elsevier dropped support for the RWA and the “#mendelete” fracas ended, Elsevier was (and is) still doing a lot that’s not researcher-friendly. And yet, some of us continue to eat at McDonald’s knowing what goes into their chicken nuggets. Like any big organization, Elsevier does some stuff right and some stuff wrong, and it’s up to researchers to decide how it all balances out; there’s lots of room for reasonable folks to disagree. For what it’s worth: at Impactstory, one of us is a Zotero early adopter and code-contributor, one of us has switched from Mendeley to Zotero, and one of us uses both 🙂

Drawbacks to the platform itself? You can’t easily extract readership information for your publications unless you use Mendeley’s open API (too high a barrier for many of us to pass). So, you’ll need to cut-and-paste that information into your website, CV, or annual review, just as you would when using Google Scholar. (It’s relatively easy to extract readership numbers using third-party services like Impactstory, on the other hand. More on that in the days to come.)
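
For the curious, here’s roughly what pulling readership numbers via the API involves (a sketch assuming Mendeley’s catalog search endpoint and its “stats” view as documented at the time; you’d first need to register an app and obtain an OAuth access token, and the exact field names may differ):

```python
import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"      # obtained by registering an app with Mendeley
doi = "10.1371/journal.pone.0000000"   # made-up DOI, for illustration only

url = ("https://api.mendeley.com/catalog?doi={}&view=stats"
       .format(urllib.parse.quote(doi, safe="")))
request = urllib.request.Request(url, headers={
    "Authorization": "Bearer " + ACCESS_TOKEN,
    "Accept": "application/vnd.mendeley-document.1+json",
})

with urllib.request.urlopen(request) as response:
    documents = json.load(response)

for doc in documents:
    print("Readers:", doc.get("reader_count"))
    print("By academic status:", doc.get("reader_count_by_academic_status"))
    print("By country:", doc.get("reader_count_by_country"))
```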

A final drawback: if you want to add new publications, you’ll have to do it yourself. Mendeley doesn’t auto-add new publications to your profile like Google Scholar or other platforms can.

Homework

First, complete your profile by manually adding any works that the BibTeX import from Google Scholar didn’t catch.

Next, build your network by following at least five other researchers in your field, and joining at least two groups. On each of the groups you’ve joined, share at least one publication, whether it’s one you’ve authored or one written by someone else. Remember, make sure they’re relevant to the group, or else you’ll be pegged as a spammer.

Over the next few days, log onto Mendeley Web (or Zotero Web) at least one more time, and become acquainted with your homescreen timeline to stay abreast of new research that’s been added to groups or your colleagues’ profiles.

Finally, learn how to export your publications–and the rest of your library–from Mendeley, so you don’t have to reinvent the wheel attempting to set up a profile for your publications on another platform. Here’s how to get your library out of Mendeley in BibTeX format:

  • In Mendeley Desktop, select all publications you want to export.

  • From the main menu, click File > Export.

  • In the dialog box that appears, choose BibTeX from the drop-down menu, rename your bibliography if you want, and choose a safe place to store the .bib file. Click “Save” and you’re done!

Are you hangin’ in there?

You’ve now completed your Day 4 challenge, meaning you’re over halfway finished with Week 1, and over 10% finished with the entire month. That’s some free math, from us to you 🙂


Tomorrow, we’ll master LinkedIn. Get ready!

Are you ready to take the November Impact Challenge?

Graphic for the challenge shows a boxer poised, ready for a bout.

In a hugely competitive research landscape, scientists can no longer afford to just publish and hope for the best. To leave a mark, researchers have to take their impact into their own hands.

But where do you start? There are so many ways to share, promote, and discuss your research, especially online. It’s tough to know where to begin.

Luckily, we’ve got your back.

Drawing on years of experience measuring and studying research impact, we’ve created a list of 30 effective steps for you to make sure your hard work gets out there, gets attention, and makes a difference–in your field and with the public.

We’ll share one of these a day in November, and we challenge you to follow along and give each one a try.

If you’re up to the challenge, we guarantee that by the end of the month, your research will get a boost in exposure and you’ll also have made important connections with other scientists around the world.

Join us here on our blog on Monday, November 3rd for the Impact Challenge kickoff, or follow along via email.

What Jeffrey Beall gets wrong about altmetrics

Not long ago, Jason received an email from an Impactstory user, asking him to respond to the anti-altmetrics claims raised by librarian Jeffrey Beall in a blogpost titled, “Article-Level Metrics: An Ill-Conceived and Meretricious Idea.”

Beall is well-known for his blog, which he uses to expose predatory journals and publishers that abuse Open Access publishing. This has been valuable to the OA community, and we commend Beall’s efforts. But we think his post on altmetrics was not quite so well-grounded.

In the post, Beall claims that altmetrics don’t measure anything of quality. That they don’t measure the impact that matters. That altmetrics can be easily gamed.

He’s not alone in making these criticisms; they’re common. But they’re also ill-informed. So, we thought that we’d make our responses public, because if one person is emailing to ask us about them, others must have questions, too.

Citations and the journal impact factor are a better measure of quality than altmetrics

Actually, citations and impact factors don’t measure quality.

Did I just blow your mind?

What citations actually measure

Although early theorists emphasized citation as a dispassionate connector of ideas, more recent research has repeatedly demonstrated that citation actually has more complex motivations, including often as a rhetorical tool or a way to satisfy social obligations (just ask a student who’s failed to cite their advisor). In fact, Simkin and Roychowdhury (2002) estimate that as few as 20% of citers even read the paper they’re citing. That’s before we even start talking about the dramatic disciplinary differences in citation behavior.

When it comes down to it, because we can’t identify citer motivations from a citation count alone (and, to date, efforts to use sentiment analysis to understand citation motivations have failed to be widely adopted), the only bulletproof way to understand the intent behind citations is to read the citing paper.

It’s true that some studies have shown that citations correlate with other measures of scientific quality like awards, grant funding, and peer evaluation. We’re not saying they’re not useful. But citations do not directly measure quality, which is something that some scientists seem to forget.

What journal impact factors actually measure

We were surprised that Beall holds up the journal impact factor as a superior way to understand the quality of individual papers. The journal impact factor has been repeatedly criticized throughout the years, and one issue above all others renders Beall’s argument moot: the impact factor is a journal-level measure of impact, and therefore irrelevant to the measure of article-level impact.

What altmetrics actually measure

The point of altmetrics isn’t to measure quality. It’s to better understand impact: both the quantity of impact and the diverse types of impact.

And when we supplement traditional measures of impact like citations with newer, altmetrics-based measures like post-publication peer review counts, scholarly bookmarks, and so on, we have a better picture of the full extent of impact. Not the only picture. But a better picture.

Altmetrics advocates aim to make everything a number. Only peer review will accurately get at quality.

This criticism is only half-wrong. We agree that informed, impartial expert consensus remains the gold standard for scientific quality. (Though traditional peer-review is certainly far from bullet-proof when it comes to finding this.)

But we take exception to the charge that we’re only interested in quantifying impact. In fact, we think that the compelling thing about altmetrics services is that they bring together important qualitative data (like post-publication peer reviews, mainstream media coverage, who’s bookmarking what on Mendeley, and so on) that can’t be summed up in a number.

The scholarly literature on altmetrics is growing fast, but it’s still early. And altmetrics reporting services can only improve over time, as we discover more and better data and ways to analyze it. Until then, using an altmetrics reporting service like our own (Impactstory), Altmetric.com or PlumX is the best way to discover the qualitative data at the heart of diverse impacts. (More on that below.)

There’s only one type of important impact: scholarly impact. And that’s already quantified in the impact factor and citations.

The idea that “the true impact of science is measured by its influence on subsequent scholarship” would likely be news to patients’ rights advocates, practitioners, educators, and everyone else that isn’t an academic but still uses research findings. And the assertion that laypeople aren’t able to understand scholarship is not only condescending, it’s wrong: cf. Kim Goodsell, Jack Andraka, and others.

Moreover, who are the people and groups that argue in favor of One Impact Above All Others, measured only through the impact factor and citations? Often, it’s the established class of scholars, most of whom have benefited from being good at attaining a very particular type of impact and who have no interest in changing the system to recognize and reward diverse impacts.


Even if we were to agree that scholarly impact were of paramount importance, let’s be real: the impact factor and citations alone aren’t sufficient to measure and understand scholarly impact in the 21st century.

Why? Because science is moving online. Mendeley and CiteULike bookmarks, Google Scholar citations, ResearchGate and Academia.edu pageviews and downloads, dataset citations, and other measures of scholarly attention have the potential to help us define and better understand new flavors of scholarly attention. Citations and impact factors by themselves just don’t cut the mustard.

I heard you can buy tweets. That proves that altmetrics can be gamed very easily.

There’s no denying that “gaming” happens, and it’s not limited to altmetrics. In fact, journals have recently been banned from Thomson Reuters’ Journal Citation Reports due to impact factor manipulation, and papers have been retracted after a “citation ring” was busted. And researchers have proven just how easy it is to game Google Scholar citations.

Most players in the altmetrics world are pretty vigilant about staying one step ahead of the cheaters. (Though, to be clear, there’s not much evidence that scientists are gaming their altmetrics, since altmetrics aren’t yet central to the review and rewards systems in science.) Some good examples are SSRN’s means for finding and banning fraudulent downloaders, PLOS’s “Case Study in Anti-Gaming Mechanisms for Altmetrics,” and Altmetric.com’s thoughts on the complications of rooting out spammers and gamers. And we’re seeing new technology debut monthly that helps us uncover bots on Twitter and Wikipedia, fake reviews and social bookmarking spam.

Crucially, altmetrics reporting services make it easier than ever to sniff out gamed metrics by exposing the underlying data. Now, you can read all the tweets about a paper in one place, for example, or see who’s bookmarking a dataset on Delicious. And by bringing together that data, we help users decide for themselves whether that paper’s altmetrics have been gamed. (Not dissimilar from Beall’s other blog posts, which bring together information on predatory OA publishers in one place for others to easily access and use!)

Altmetrics advocates just want to bring down The Man

We’re not sure what that means. But we sure are interested in bringing down barriers that keep science from being as efficient, productive, and open as it should be. One of those barriers is the current incentive system for science, which is heavily dependent upon proprietary, opaque metrics such as the journal impact factor.

Our true endgame is to make all metrics–including those pushed by The Man–accurate, auditable, and meaningful. As Heather and Jason explain in their “Power of Altmetrics on a CV” article in the ASIS&T Bulletin:

Accurate data is up-to-date, well-described and has been filtered to remove attempts at deceitful gaming. Auditable data implies completely open and transparent calculation formulas for aggregation, navigable links to original sources and access by anyone without a subscription. Meaningful data needs context and reference. Categorizing online activity into an engagement framework helps readers understand the metrics without becoming overwhelmed. Reference is also crucial. How many tweets is a lot? What percentage of papers are cited in Wikipedia? Representing raw counts as statistically rigorous percentiles, ideally localized to domain or type of product, makes it easy to interpret the data responsibly.
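
To make the percentiles point concrete, here’s a small sketch (our own illustration, not code from the article) of turning a raw count into a percentile rank against a reference sample of similar papers:

```python
def percentile_rank(raw_count, reference_counts):
    """Percentage of reference papers with a count at or below raw_count."""
    if not reference_counts:
        raise ValueError("need a non-empty reference sample")
    at_or_below = sum(1 for count in reference_counts if count <= raw_count)
    return 100.0 * at_or_below / len(reference_counts)

# Tweet counts for a made-up sample of papers from the same field and year.
sample = [0, 0, 1, 1, 2, 3, 3, 5, 8, 21]

# "12 tweets" means little on its own; "90th percentile for its field" is interpretable.
print(percentile_rank(12, sample))  # -> 90.0
```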

That’s why we incorporated as a non-profit: to make sure that our goal of building an Open altmetrics infrastructure–which would help make altmetrics accurate, auditable, and meaningful–isn’t corrupted by commercial interests.

Do you have questions related to Beall’s–or others’–claims about altmetrics? Leave them in the comments below.

Starting today, Impactstory profiles will cost $5/month. Here’s why that’s a good thing.

Starting today, Impactstory profiles cost $5 per month.

Why? Because our goal has always been for Impactstory to support a second scientific revolution, transforming how academia finds, shares, understands, and rewards research impact. That’s why we’re a nonprofit, and always will be. But (news flash), that transformation is not going to happen overnight. We need a sustainability model that can grow with us, beyond our next year of Sloan and NSF funding. This is that model.

So what does five bucks a month buy you? It buys you the best place in the world to learn about and share your scholarly impact. It buys you a profile not built on selling your personal data, cramming your page with ads, our ability to hustle up more funding, or a hope that Elsevier buys us (nonprofits don’t get acquired).

Five bucks buys you a profile built on a simple premise: we’ll deliver real, practical value to real researchers, every day. And we’ll do it staying a nonprofit that’s fiercely committed to independence, openness, and transparency. Want to fork our app and build a better one? Awesome, here’s all our code. Want access to the data behind your profile? Of course: it’s one click away, in JSON or CSV, as open as we can make it. And that ain’t changing. It’s who we are.

We’ve talked to a lot of users that feel $5/month is a fair deal. Which is great; we agree. But we know some folks may feel differently, and that’s great too. Because if you’re in that second group, we want to hear from you. We’re passionate about building the online profile you do think is worth $5 a month. In fact, we’re doing a huge round of interviews right now…if you’ve got ideas, drop us a line at team@impactstory.org and we’ll schedule a chat. Let’s change the world, together.

New signups will get a 14-day free trial. If you’re a user now, you’ll also get a 14-day trial; plus if you subscribe you’ll get a cool  “Impactstory: Early Adopter” sticker for your laptop. If you’re in a spot where you can’t afford five bucks a month, we understand.  We’ve got a no-questions-asked waiver; just drop us a line showing us how you’re linking to your Impactstory profile in your email signature and we’ll send you a coupon for a free account.

We’re nervous about this change in some ways; it’s not exactly what we’d imagined for Impactstory from the beginning. But we’re confident it’s the right call, and we’re excited about the future. We’re changing the world. And we’re delivering concrete value to users. And we’re not gonna stop.

Is your CV as good as you are?


When’s the last time you updated your CV?

Adding new papers to a CV is a real pain, and it gets worse as we start publishing more types of products more often — preprints, code, slides, posters, and so on. A stale CV reveals an incomplete, dated, less-good version of ourselves — at just the moment when we want to put our best foot forward.

Starting today, Impactstory helps you keep your online identity up to date — we’ve begun automatically finding and adding your new research products to your impact profile, so you don’t have to!

You can now connect your other online accounts to Impactstory in a few seconds. We’ll then watch those accounts; when new products appear there, they’ll automatically show up in your Impactstory profile, too.  Right now you can connect your GitHub, figshare, SlideShare, and ORCID accounts, but that’s just the beginning; we’ll be adding lots more in the coming months. We’re especially excited about adding ways to keep your scholarly articles up-to-date, like Google Scholar does.
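
As a rough illustration of the “watch those accounts” idea (not our actual implementation), here’s how a profile service might poll GitHub’s public API for repositories created since its last check; the username is made up:

```python
import json
import urllib.request
from datetime import datetime, timezone

def new_github_repos(username, last_checked):
    """Return names of public repos the user created after last_checked."""
    url = "https://api.github.com/users/{}/repos?sort=created".format(username)
    with urllib.request.urlopen(url) as response:
        repos = json.load(response)
    fresh = []
    for repo in repos:
        created = datetime.strptime(repo["created_at"], "%Y-%m-%dT%H:%M:%SZ")
        created = created.replace(tzinfo=timezone.utc)
        if created > last_checked:
            fresh.append(repo["name"])
    return fresh

# Anything created since the profile was last synced would get added automatically.
last_sync = datetime(2014, 1, 1, tzinfo=timezone.utc)
print(new_github_repos("example-user", last_sync))  # "example-user" is made up
```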

Do you want to fill the gaps in your CV with an up-to-date, comprehensive picture of your research and its impact? There’s no better way than with an Impactstory profile. Our signup process is smoother than ever, so give it a go!

Top 5 altmetrics trends to watch in 2014

Last year was an exciting one for altmetrics. But it’s history. We were recently asked: what’s 2014 going to look like? So, without further ado, here are our top 5 trends to watch:

Openness: This is just part of a larger trend toward open science–something altmetrics is increasingly (and aptly) identified with. In 2013, it became clearer than ever before that we’re winning the fight for universal OA. Since metrics are qualitatively more valuable when we can verify, share, remix, and build on them, we see continued progress toward making both traditional and novel metrics more open. But closedness still offers quick monetization, and so we’ll see continued tension here.

Acquisitions by the old guard: Last year saw the big players start to move in the altmetrics space, with EBSCO getting Plum Analytics, and Elsevier grabbing Mendeley. In 2014 we’ll likely see more high-profile altmetrics acquisitions, as established megacorps attempt to hedge their bets against industry-destabilizing change.  We’re not against this, per se; it’s a sign that altmetrics are quickly coming of age. But we also think it underscores the importance of having a nonprofit, scientist-run altmetrics provider, too.

More complex modelling: Lots of money got invested in altmetrics in 2013. This year it’ll get spent, largely on improving the descriptive power of altmetrics tools. We’ll see more network-awareness (who tweeted or cited your paper? how authoritative are they?), more context mining (is your work cited from methods or discussion sections?), more visualization (show me a picture of all my impacts this month), more digestion (are there three or four dimensions that can represent my “scientific personality?”), more composite indices (maybe high Mendeley plus low Facebook is likely to be cited later, but high on both not so much). The low-hanging altmetrics fruit–things like simply counting tweets–are increasingly plucked. In 2014 we’ll see the beginning of what comes next.

Growing interest from administrators and funders: We gave multiple invited talks at the NSF, NIH, and White House this year to folks highly placed in the research funding ecosystem. These leaders are keenly aware of the shortcomings of traditional impact assessment, and eager to supplement it with new data. Administrators too want to tell more meaningful, textured stories about faculty impact. So in 2014, we’ll see several grant, hiring, and T&P guidelines suggest applicants include altmetrics when relevant.

Empowered scientists: But this interest from the scholarly communications superstructure is tricky. Historically, metrics of scholarly impact have often been wielded as technologies of control: reductionist, Taylorist management tools. There’s been concern that more metrics will only tighten this control. That’s not misplaced. But nor is it the only story: we believe 2014 will also see the emergence of the opposite trend. As scientists use tools like Impactstory to gather, analyze, and share their own stories, comprehensive metrics become a way for them to articulate more textured, honest narratives of impact in decisive, authoritative terms. Altmetrics will give scientists growing opportunities to show they’re more than their h-indices.

And there you have it, our top five altmetrics trends for 2014. Are we missing any? Let us know in the comments!

What level of Open Access scholar are you?

Today is a feast for Open Access fans at Impactstory!

Your scholarship is more valuable when it’s available to everyone: free to be widely read, discussed, and used.  Realizing this, funders increasingly mandate that articles be made freely available, and OA journals and repositories make it increasingly easy.

And today at Impactstory, we make it visible!

Where your articles have free fulltext available somewhere online, your Impactstory profile now links straight to it (we’ve found many of these automatically, but you can add links manually, too). Now along with seeing the impacts of your work, folks checking out your profile can read the papers themselves.

But openness is more than just a handy bonus: it’s an essential qualification for a modern scholar. That’s why there’s growing interest in finding good ways to report on scholars’ openness–and it’s why we’re proud to be rolling out new Open Access awards. If 10% of your articles are OA (gold or green), you get an Open Access badge at the top of your profile. For the more dedicated, there are Bronze (30% OA) and Silver (50%) award levels. The elite OA vanguard with over 80% OA articles get the coveted Gold-level award. So…which award did you get? How open are you? Check Impactstory to find out!
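
The award levels boil down to a simple threshold check on the share of your articles with free fulltext. Here’s a minimal sketch of that logic, paraphrasing the rules above (not our production code):

```python
def oa_award(num_articles, num_oa_articles):
    """Map the share of OA (gold or green) articles to an award level."""
    if num_articles == 0:
        return None
    share = num_oa_articles / num_articles
    if share >= 0.8:
        return "Gold"
    if share >= 0.5:
        return "Silver"
    if share >= 0.3:
        return "Bronze"
    if share >= 0.1:
        return "Open Access badge"
    return None

print(oa_award(20, 12))  # 60% OA -> "Silver"
```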

To celebrate the launch, we’re giving away an awesome “i ♥ OA” t-shirt, featuring the now-classic OA icon and our new logo, to one randomly drawn Bronze-or-higher-level OA scholar on Monday.

Don’t have a Bronze level award yet? Want to see some more of those “unlocked” icons on your profile? Great! Just start uploading those preprints to improve your OA level and get your chance at that t-shirt. 🙂

Finally, we’ve saved the most exciting Impactstory OA news for last: we’ll also be sending one of these new t-shirts to Heather Joseph, Executive Director of SPARC.  Why? Well, partly because she is and has been one of the OA movement’s most passionate, strategic, and effective leaders. But, more to the point, because we’re absolutely thrilled to be welcoming Heather to Impactstory’s Board of Directors.  Heather joins John Wilbanks on our board, filling the vacancy left by Cameron Neylon as his term ends.  Welcome Heather!

Bringing article fulltext to your Impactstory profile

Your work’s impact helps define your identity as a scientist; that’s why we’re so excited about being the world’s most complete impact profile. But of course the content of your work is a key part of your identity, too.

This week, we’re launching a feature that’ll bring that content to your Impactstory profile: if there’s a free fulltext version of one of your articles, we’ll find it and automatically link to it from your profile.

We’ll be automatically checking tons of places to find where an article’s freely available (a rough sketch of this cascade follows the list):

  • Is the article in PMC?
  • Is it published in a journal listed in the Directory of Open Access Journals?
  • Is it published in a journal considered “open access” by Mendeley?
  • Does PubMed LinkOut point to a free, full-text version?
  • If it’s in none of these, is it in our custom-built list of other open sources (including arXiv, figshare, and others)?
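
Conceptually, that’s a simple cascade: try each source in order and stop at the first hit. Here’s a minimal sketch of the idea (the checker functions are made-up stand-ins for the individual lookups, not our production code):

```python
def find_free_fulltext(article, checks):
    """Run each fulltext lookup in order; return the first URL found, or None.

    `checks` is a list of callables (one per source above), each taking the
    article's metadata and returning a fulltext URL or None.
    """
    for check in checks:
        url = check(article)
        if url:
            return url
    return None

# Toy run with stub checkers, purely to show the shape of the cascade.
article = {"doi": "10.1234/example"}  # made-up DOI
checks = [
    lambda a: None,                               # e.g. not found in PMC
    lambda a: None,                               # e.g. journal not listed in DOAJ
    lambda a: "http://arxiv.org/abs/0000.00000",  # e.g. preprint found on arXiv (made-up link)
]
print(find_free_fulltext(article, checks))
```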

Of course, even with all these checks, we’re going to miss some sources–especially self-archiving in institutional repositories. So we’ll be improving our list over time, and you’ll be able to easily add your own linkouts when we miss them.

We’re excited to pull all this together; it’s another big step toward making your Impactstory profile a great place to share your scholarly identity online. Look for the release in a few days!

Welcome to our new hire: Stacy Konkiel!

We’re thrilled to announce that Stacy Konkiel will be joining Impactstory as our Director of Marketing and Research.

Stacy has been in the altmetrics vanguard from the beginning, contributing to PLOS’s early article-level metrics working group, and championing the use of altmetrics in libraries as part of her recent role as Science Data Management Librarian at Indiana University.

Stacy’s communication skills, her credibility in the open science and altmetrics communities, and–most importantly–her real passion for improving scholarly communication make her the perfect fit at Impactstory. We’re elated to have Stacy as our first hire.

More details to come when Stacy starts in March… we’re just really excited and couldn’t wait to share the news 🙂