Impactstory Advisor of the Month: Chris Chan (January 2015)

[Photograph of Chris Chan]

The first Impactstory Advisor of the Month for 2015 is Chris Chan, Head of Information Services at Hong Kong Baptist University Library.

We interviewed Chris to learn more about his crucial role in implementing ORCID identifiers for HKBU faculty, and also why he’s chosen to be an Impactstory Advisor. Below, he also describes his vision for the role librarians can play in bringing emerging scholarly communication technologies to campus–a vision with which we wholeheartedly agree!

Tell us a bit about your role as the Head of Information Services at the Hong Kong Baptist University Library.

My major responsibilities at HKBU Library include overseeing our instruction and reference services, and advising the senior management team on the future development and direction of these services. I’m fortunate to work with a great team of librarians and paraprofessionals, and never tire of providing information literacy instruction and research help to our students and faculty.

Scholarly communication is a growing part of my duties as well. As part of its strategic plan, the Library is exploring how it can better support the research culture at the University. One initiative that has arisen from this strategic focus is our Research Visibility Project, for which I am the coordinator.

Why did you initially decide to join Impactstory?

Scholarly communication and bibliometrics have been of great interest to me ever since I first encountered them as a newly-minted academic librarian. Furthermore, the strategic direction that the Library is taking has made keeping up to date with the latest developments in this area a must for our librarians.

When I came across Impactstory, I was struck by how it presented multiple altmetrics in an attractive, easy-to-understand way (even in that early incarnation). At the time, I had just been discussing with some of our humanities faculty how poorly served they were by traditional citation metrics, and I immediately saw in Impactstory one way to address this issue.

Why did you decide to become an Advisor?

As mentioned above, in the past year or so I have become heavily involved in our scholarly communication efforts. When the call for applications to be an Advisor came out, I saw it as an opportunity to get the inside scoop on one of the tools that I am most enthusiastic about.

What’s your favorite Impactstory feature?

I would have to say that my favourite feature is the ability to add an ORCID iD to the Impactstory profile! More on why that is below.

You’ve been hard at work recently implementing ORCID at HKBU. (I especially like this video tutorial you produced!) How do you envision the library working in the future to support HKBU researchers using ORCID and other scholarly communication technologies?

Academic libraries around the world are re-positioning themselves to ensure that their collections and services remain relevant to their members. The scholarly communication environment is incredibly dynamic, and I think that librarians have an opportunity to provide tremendous value to our institutions by serving as guides to, and organizers of, emerging scholarly communication technologies.

Our ORCID initiative at HKBU is a good example of this. We have focused heavily on communicating the benefits of having an ORCID iD, and on how, in the long run, it will streamline research workflows and ensure scholars receive proper credit for their work. Another guiding principle has been to make adoption as painless as possible for our faculty. They will be able to create an ORCID iD, connect it with our system, and automatically populate it with their last five years of research output (painstakingly checked for accuracy by our team), all in just a few minutes.

I believe that as information professionals, librarians are well-positioned to take on such roles. Also, in contrast to some of our more traditional responsibilities, these services bring us into close contact with faculty, raising the visibility of librarians on campus. These new relationships could open doors to further collaborations on campus.

Thanks, Chris!

As a token of our appreciation for Chris’s outreach efforts, we’re sending him an Impactstory travel mug from our Zazzle store.

Chris is just one part of a growing community of Impactstory Advisors. Want to join the ranks of some of the Web’s most cutting-edge researchers and librarians? Apply to be an Advisor today!

Open Science & Altmetrics Monthly Roundup (December 2014)

In this month’s roundup: a university allegedly attempts to hire its way to the top of the rankings, NISO’s altmetrics initiative enters its next phase, and seven other ways December was an interesting month for Open Science and altmetrics. Read on!

An altmetrics-flavored look back at 2014

What were the most popular articles of 2014? Altmetric.com let us know with their year-end roundup, which detailed the 100 most shared and discussed scholarly articles of 2014. At the top of the list was the controversial “emotional contagion” study co-authored by Facebook researchers. See more highlights on the full list, and download the full altmetrics data on Figshare.

Did our altmetrics predictions for 2014 come true? Back in February, we made some bets on how the field would evolve throughout 2014, and as expected we got some right and some wrong. Probably our biggest win? That more researchers would become empowered to show the world how they’re winning by using altmetrics. Our biggest miss? Sadly, our bet that altmetrics would become more “open”; they didn’t.

University criticized for alleged attempts to hire its way to the top of rankings

The Daily Cal reports that staff from King Abdulaziz University in Saudi Arabia recently contacted several University of California professors with opportunities to be hired as “distinguished adjunct professors.” Respected researchers are regularly contacted with job opportunities, but this was different, according to the article:

“KAU offered [Jonathan Eisen] $72,000 per year and free business-class airfare and five-star hotel stays for him to visit KAU in Jeddah, Saudi Arabia…In exchange, Eisen was told he would be expected to work on collaborations with KAU local researchers and also update his Thomson Reuters’ highly cited researcher listing to include a KAU affiliation. He would also be expected to occasionally publish some scientific journal articles with the Saudi university’s name attached.”

Eisen and other scientists interviewed suggest that their high citation rates are at the heart of KAU’s interest, as their affiliation with KAU would boost the university’s international rankings. Read more on The Daily Cal.

NISO votes to create standards for altmetrics

NISO approved Phase 2 of the organization’s altmetrics initiative in December, which will include the creation of standards and recommended practices on the following:

Phase 2 of the project will be to develop standards or recommended practices in the prioritized areas of definitions, calculation methodologies, improvement of data quality, and use of persistent identifiers in alternative metrics. As part of each project, relevant use cases and how they apply to different stakeholder groups will be developed.

This should come as no surprise to those who’ve been following NISO-related altmetrics developments. In September, NISO released results from their community survey, which showed more concern with standards and definitions than issues like gaming.

Want to have a voice in altmetrics standards development? Join any of NISO’s four working groups before February 1, 2015. More information can be found on the NISO website.

We’ll be watching future developments with interest, as any standards and recommended practices developed will affect the way we and other altmetrics aggregators collect, display, and archive altmetrics data in Impactstory profiles.

Other altmetrics & open science news

  • arXiv hits the 1 million paper milestone: Nature News reports that one of the world’s most famous and respected preprint servers, arXiv, is now home to more than 1 million articles and receives 10 million download requests per month. Incredibly, arXiv manages to make this treasure trove of scholarly information freely available to the public at a cost of less than $10 per paper–much less than the reported $50 million per year it takes to operate Science. For an overview of arXiv’s history, check out Nature News.

  • New altmetrics studies confirm that citations don’t correlate with quality (or do they?), OA publications get more downloads & more: Five studies of interest to the altmetrics community were publicized in December. They included a study that shows a lack of correlation between citations and quality (as measured by expert peer review); another, conflicting study that may hold the “secret sauce” formula for turning citations into indicators of quality; a study that found–yet again–that Open Access publications receive more downloads; the results of one conference’s experiment with peer review, which showed that peer review is “close to random” in terms of what reviewers agree to accept and reject; and a paper on “negative links,” which may have future applications for context-aware altmetrics.

  • Meet Open Science champion and Impactstory Advisor Dr. Lorena Barba: We recently interviewed Lorena to learn more about her lab’s Open Science manifesto, her research in computational methods in aeronautics and biophysics, and George Washington University’s first Massive Open Online Course, “Practical Numerical Methods with Python”. To read more about her work, visit the Impactstory blog.

  • Altmetrics can help measure attention and influence for education-oriented journal articles: PLOS Computational Biology recently shared a thoughtful perspective on an editorial titled “An Online Bioinformatics Curriculum.” Judging by citations to the 2012 paper alone, you’d think it wasn’t a success–but you’d be wrong. PLOS’s article-level metrics show that the editorial has been viewed over 77,000 times, bookmarked more than 300 times, and has received a great deal of attention on social media. It’s just one more example of how altmetrics can measure the attention and influence of scholarship in ways beyond those traditionally valued.

  • Nature takes a lot of heat for its experiment in free-to-access–but not Open Access–journal articles: Nature Publishing Group announced its intent to offer free access to articles in many of its journals over the next year. The plan allows those with subscription access to an article to generate a link that will allow others to read the article for free–but not download or copy its content. Many scientists criticized the move, pointing out the many restrictions placed on the shared content. We also shared our concerns, particularly with respect to the negative effects the program could have on altmetrics. But many scientists also lauded Nature’s experiment, and shared their appreciation for NPG’s attempt to make content more accessible. To learn more, check out Timo Hannay’s blog and John Wilbanks’ thoughts on “Nature’s Shareware Moment.”

  • Impactstory’s “30-Day Impact Challenge” released as an ebook: To download a free copy of our new ebook based on the popular November Impact Challenge, visit our blog. You can also purchase a copy for your Kindle.

Want updates like these delivered to your inbox weekly? Sign up for our open science & altmetrics newsletter! In addition to open science and altmetrics news, you can also get news from us here at Impactstory, the only altmetrics non-profit.

Announcing Impactstory “office hours” for librarians

Over the next two months, we’re experimenting with providing increased support to librarians, many of whom serve as their campus’s go-to resource for all things altmetrics.

If you’re a librarian with questions about Impactstory, altmetrics, or just about anything else related to measuring and demonstrating impact, you can message Stacy on Skype during the following times:

  • 7 pm to 9 pm on Mondays (Mountain time here in the US; better for folks east of the Central European Time zone–India, Japan, China, Australia, New Zealand)

  • 9 am to 12 pm on Fridays (Mountain time here in the US; better for folks west of CET–USA, Western & Central Europe)

To confirm when Stacy will be online in your time zone, we recommend checking out EveryTimeZone.com.
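If you’d rather script the conversion, here’s a minimal sketch in Python (3.9+, using the standard library’s zoneinfo module); the zone names below are just example IANA identifiers, so swap in your own:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Monday office hours start at 7 pm Mountain time on January 12, 2015.
    start = datetime(2015, 1, 12, 19, 0, tzinfo=ZoneInfo("America/Denver"))

    # Swap in your own IANA zone name to find your local equivalent.
    for zone in ("Asia/Tokyo", "Australia/Sydney", "Europe/Berlin"):
        print(zone, start.astimezone(ZoneInfo(zone)).strftime("%a %H:%M"))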

To connect with Stacy, open Skype, go to the Contacts drop-down menu, select “Add contact,” search for Stacy by her name or username (stacy.konkiel), then click the green “Add contact” button. Please include a short note about who you are and why you want to chat when sending the invitation to connect; it helps keep the spammers away.

Stacy will be keeping these office hours from Monday, January 12 through March 20, 2015. Talk to you soon!

What’s our impact? (Dec. 2014)

Back in August, we started sharing our outreach and growth statistics, warts and all. That’s because we’re committed to radical transparency, and had a hunch that our users–who are interested in quantitative measures of impact–would be curious to see the numbers.

After using this format to share our stats for five months, we’ve decided to move away from blogging them in favor of a centralized, easier-to-read format: a simple Google Spreadsheet, open to all.

Below, we share our numbers in blog format for the final time, and provide a link to the Google Spreadsheet where we’ll share our stats from here on out.

Here are our outreach numbers for December 2014*.

impactstory.org traffic

  • Visitors: 3,504 total; 2,189 unique
  • New Users: 195
  • Conversion rate: 8.9% (% of visitors who signed up for a trial account)

Blog stats

  • Unique visitors: 6,911
  • Clickthrough rate: 0.8% (% of people who visited Impactstory.org from the blog)
  • Conversion rate: 17.5% (% of visitors to Impactstory.org from blog who then signed up for a trial Impactstory account)
  • Percent of new user signups: 5.1% (see the sketch below for how these figures fit together)
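For the curious, here’s how the percentages above fit together (a quick sketch; the arithmetic suggests rates are computed against unique visitors, with rounding accounting for small differences):

    # impactstory.org: new users as a share of unique visitors
    print(round(195 / 2189 * 100, 1))      # 8.9 -> the conversion rate above

    # Blog funnel: unique visitors -> clickthroughs -> trial signups
    clickthroughs = 6911 * 0.008           # ~55 visits to impactstory.org
    signups = clickthroughs * 0.175        # ~10 new trial accounts
    print(round(signups / 195 * 100, 1))   # ~5.0 -> the blog's share of new signups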

Twitter stats

  • New followers: 262
  • Increase in followers from November: 5.3%
  • Mentions: 173 (We’re tracking this to answer the question, “How engaged are our followers?”)
  • Tweet reach: ~398,800 (We’re tracking this–the number of people who potentially saw a tweet mentioning Impactstory or our blog–to understand our brand awareness)
  • Clickthroughs: 131
  • Conversions: 14

What does it all mean?

We’re seeing a familiar seasonal dip in traffic for our blog, Twitter, and impactstory.org as the semester winds down and researchers take leave from the lab to celebrate the holidays. However, impactstory.org traffic is up 25% from the same period last year, our Twitter followers have more than doubled since late January, and we’ve seen steady growth in traffic to our blog over the past twelve months.

We’re pleased with our growth to date, and look forward to sharing our future growth here.

Thanks for joining us in this experiment in radical transparency! And Happy New Year!

* These numbers were recorded at 10 am MST on Dec. 31, 2014.

Our 2014 predictions for altmetrics: what we nailed and what we missed

Back in February, we wagered some bets about how the altmetrics landscape would evolve throughout 2014. As you might expect, we got some right, and some wrong.

Let’s take a look back at how altmetrics as a field fared over the last year, through the framework of our 2014 predictions.

More complex modelling

“We’ll see more network-awareness (who tweeted or cited your paper? how authoritative are they?), more context mining (is your work cited from methods or discussion sections?), more visualization (show me a picture of all my impacts this month), more digestion (are there three or four dimensions that can represent my “scientific personality?”), more composite indices (maybe high Mendeley plus low Facebook is likely to be cited later, but high on both not so much).”

Visualizations were indeed big this year: we debuted our new Maps feature, which tells you where in the world your work has been viewed, bookmarked, or tweeted about. We also added “New this week” indicators to the Metrics page on your profile.

And both PlumX and Altmetric.com added new visualizations, too: the cool-looking “Plum Print” and the Altmetric bar visualization were introduced.

Impactstory also launched a new, network-aware feature that shows you which Twitter users gave you the most exposure when they tweeted your work. And we also debuted your profile’s Fans page, which tells you who’s talking about your work and how often, exactly what they’re saying, and how many followers they have.

And a step forward in context mining has come from the recently launched CRediT taxonomy. The taxonomy allows researchers to describe how co-authors on a paper have contributed–whether by creating the study’s methodology, cleaning and maintaining data, or in any of twelve other ways. The taxonomy will soon be piloted by publishers, funders, and other scholarly communication organizations like ORCID.

As for other instances of network-awareness, context mining, digestion, and composite indices? Most of the progress in these areas came from altmetrics researchers. Here are some highlights:

  • This study on ‘semantometrics’ posits that more effective means of determining impact can be found by looking at the full-text of documents, and by measuring the interdisciplinarity of the papers and the articles they cite.

  • A study on the size of research teams since 1900 found that a larger (and more diverse) number of collaborators generally leads to more impactful work (as measured in citations).

  • This preprint determined that around 9% of tweets about arXiv.org publications come from bots, not humans–which may have big implications for how scholars use and interpret altmetrics.

  • A study showed that papers tagged on F1000 as being “good for teaching” tend to have higher instances of Facebook and Twitter metrics–types of metrics long assumed to relate more to “public” impacts.

  • A study published in JASIST (green OA version here) found that mentions of articles on scholarly blogs correlate to later citations.

Growing interest from administrators and funders

“So in 2014, we’ll see several grant, hiring, and T&P guidelines suggest applicants include altmetrics when relevant.”

Several high-profile announcements from funding agencies confirmed that altmetrics was a hot topic in 2014. In June, the Autism Speaks charity announced that they’d begun using PlumX to track the scholarly and social impacts of the studies they fund. And in December, the Wellcome Trust published an article describing how they use altmetrics in a similar manner.

Are funders and institutions explicitly suggesting that researchers include altmetrics in their applications, when relevant? Not as often as we had hoped. But the NIH took a positive step in this direction with its new biosketch format, which asks applicants to list their most important publications or non-publication research outputs. It also prompts scientists to articulate why they consider those outputs to be important.

The NIH has said that by moving to this new biosketch format, it “will help reviewers evaluate you not by where you’ve published or how many times, but instead by what you’ve accomplished.” We applaud this move, and hope that other funders adopt similar policies in 2015.

Empowered scientists

“As scientists use tools like Impactstory to gather, analyze, and share their own stories, comprehensive metrics become a way for them to articulate more textured, honest narratives of impact in decisive, authoritative terms. Altmetrics will give scientists growing opportunities to show they’re more than their h-indices.”

We’re happy to report that this prediction came true. This year, we’ve heard from more scientists and librarians than ever before, all of whom have used altmetrics data in their tenure dossiers, grant applications and reports, and in annual reviews. And in one-on-one conversations, early career researchers are telling us how important altmetrics are for showcasing the impacts of their research when applying for jobs.

We expect that as more scientists become familiar with altmetrics in the coming year, we’ll see even more empowered scientists using their altmetrics to advance their careers.

Openness

“Since metrics are qualitatively more valuable when we verify, share, remix, and build on them, we see continued progress toward making both traditional and novel metrics more open. But closedness still offers quick monetization, and so we’ll see continued tension here.”

This is one area where we weren’t exactly wrong, but we weren’t 100% correct, either. Everything stayed more or less the same with regard to openness in 2014: Impactstory continued to make our data available via open API, as did Altmetric.com.

We hope that our prediction will come true in 2015, as the increased drive towards open science and open access puts pressure on those metrics providers that haven’t yet “opened up.”

Acquisitions by the old guard

“In 2014 we’ll likely see more high-profile altmetrics acquisitions, as established megacorps attempt to hedge their bets against industry-destabilizing change.”

2014 didn’t see any acquisitions per se, but publishing behemoth Elsevier made three announcements that hint the company may be positioning itself for such acquisitions soon: a call for altmetrics research proposals, the hiring of prominent bibliometrician (and co-author of the Altmetrics Manifesto) Paul Groth as Disruptive Technology Director at Elsevier Labs, and the launch of Axon, the company’s invitation-only startup network.

Where do you think altmetrics will go in 2015? Leave your predictions in the comments below.

What’s our impact? (November 2014)

As a platform for users interested in data, we want to share some stats about our successes (and challenges) in spreading the word about Impactstory.

Here are our outreach numbers for November 2014.

impactstory.org traffic

  • Visitors: 4,361 total; 2,754 unique
  • New Users: 247
  • Conversion rate: 8.9% (% of visitors who signed up for a trial account)

Blog stats

  • Unique visitors: 9,443 (31% growth from October)
  • Clickthrough rate: 0.75% (% of people who visited Impactstory.org from the blog)
  • Conversion rate: 19.7% (% of visitors to impactstory.org from blog who went on to sign up for a trial Impactstory account)
  • Percent of new user signups: 5.7%

Twitter stats

  • New followers: 318
  • Increase in followers from October: 6.7%
  • Mentions: 380 (We’re tracking this to answer the question, “How engaged are our followers?”)
  • Tweet reach: 840,174 (We’re tracking this–the number of people who potentially saw a tweet mentioning Impactstory or our blog–to understand our brand awareness)
  • Clickthroughs: 180
  • Conversions: 5

What does it all mean?

impactstory.org: Overall traffic to the site was down, consistent with patterns we’ve seen in years past. (An end-of-semester dip in traffic is common for academic sites.) Conversion rates on impactstory.org dipped slightly from October. We’re confident that the new landing pages and general homepage changes we’ll make in the coming months will improve conversion rates.

Blog: November saw an increase in unique visitors (another month of double-digit growth!), but what does that mean for our organization? Conversion rates actually went down from October, as did the blog’s share of new user signups for Impactstory. This points to a need to share more Impactstory-related content on the blog, and to experiment with unobtrusive sidebars, slide-ins, and other ways to point people to our main website.

That said, blogging doesn’t always result in direct signups, nor is it meant to. The primary aim of blogging is to educate people about open science and altmetrics (as a non-profit, we’re big on advocacy). And it helps familiarize people with our organization, too, which can result in indirect signups (i.e., readers might come back later and sign up for Impactstory).

Twitter: Our Twitter followers and mentions increased from October by about 1.5% and 25%, respectively. We’ll aim to continue that growth throughout December. (After all, we’re active on Twitter for the same reason we blog–as a form of outreach and advocacy.) We also passed an exciting benchmark: 5,000 Twitter followers!

We’ll continue to blog our progress, while also thinking about ways to share this data in a more automated fashion. If you have questions or comments, we welcome them in the comments below.

Updated 12/31/2014 to fix error in reporting conversion rates of impactstory.org visitors from blog.

Impactstory Advisor of the Month: Lorena Barba (December 2014)

[A photograph of Impactstory Advisor Lorena Barba]

2014’s final Impactstory Advisor of the Month is Lorena Barba. Lorena is an associate professor of mechanical and aerospace engineering at the George Washington University in Washington, DC, and an advocate for open source, open science, and open education initiatives.

We recently interviewed Lorena to learn more about her lab’s Open Science manifesto, her research in computational methods in aeronautics and biophysics, and George Washington University’s first Massive Open Online Course, “Practical Numerical Methods with Python” (aka “Numerical MOOC”).

Tell us a bit about your research.

I have a PhD in Aeronautics from Caltech and I specialized in computational fluid dynamics. From that launching pad, I have veered dangerously into applied mathematics (working on what we call fast algorithms), supercomputing (which gets you into really techy stuff like cache-aware and memory-avoiding computations, high-throughput and many-core computing), and various application cases for computer simulation.

Fluid dynamics and aerodynamics are mature fields and it’s hard to make new contributions that have impact. So I look for new problems where we can use our skills as computational scientists to advance a field. That’s how we got into biophysics: there are models that apply to interactions of proteins that use electrostatic theory and can be solved computationally with methods similar to ones used in aeronautics, believe it or not.

We have been developing models and software to compute electrostatic interactions between bio-molecules, first, and between bio-molecules and nano-surfaces, more recently. Our goal is to contribute simulation power for aiding in the design of efficient biosensors. And going back to my original passion, aerodynamics, we found an area where there is still much to be discovered: the aerodynamics of flying and gliding animals (like flying snakes).

Why did you initially decide to join Impactstory?

For a long time, I’ve been thinking that science and scientists need to take control of their communication channels and use the web deliberately to convey and increase our impact. I have been sharing my group’s research and educational products online for years, and we have a consistent publication policy that includes, for example, always uploading a preprint to the arXiv repository when we submit a paper for publication. If a journal does not have an arXiv-friendly policy, we don’t submit there; we look for another appropriate journal. We have been uploading data sets, figures, posters, and other research objects to the figshare repository since its beginning, and I’m also a figshare advisor.

Impactstory became part of my communications and impact arsenal immediately, because it aggregates links, views and mentions of our products. And with the latest profile changes, it also offers an elegant online presence.

Why did you decide to become an Advisor?

So many of my colleagues are apathetic to the dire control they put in the hands of for-profit publishers, and simply accept the status quo. I want to be an agent of change in regards to how we measure and communicate the importance of what we do. Part of it is simply being willing to do it yourself, and show by example how these new tools can work for us.

What’s your favorite Impactstory feature?

The automatic aggregation of research objects using my various online IDs, like ORCID, Google Scholar and GitHub. The map is pretty cool, too!

You’ve done a lot to “open up” education in computational methods to the public, in particular via your Numerical MOOC and YouTube video lectures. What have been your biggest successes and challenges in getting these courses online and accessible to all?

In my opinion, the biggest success is doing these things at the grassroots level, with hardly any funding (I had some seed funding for #numericalmooc, but none of the previous efforts had any) or institutional involvement. When I think of how the university, in each case, has been involved in my open education efforts, the most appropriate way to characterize it is that they have let me do what I wanted to do, staying out of the way. There have not been technologists or instructional designers or any of that involved; I just did it all myself.

The biggest challenge? Resources, I guess—time and money. My scarcest resource is time, and when I work to create open educational resources, I’m stealing time away from research. This gets me disapproving looks, thoughts and comments from my peers. Why am I spending time in open education? “This won’t get you promoted.” SIGH. As for money, I raised some funds for #numericalmooc, but it’s not a lot: merely to cover the course platform and the salary of my teaching assistants. Funding efforts in open education—as an independent educator, rather than a Silicon Valley start-up—is really tough.

Thanks, Lorena!

As a token of our appreciation for Lorena’s outreach efforts, we’re sending her an Impactstory item of her choice from our Zazzle store.

Lorena is just one part of a growing community of Impactstory Advisors. Want to join the ranks of some of the Web’s most cutting-edge researchers and librarians? Apply to be an Advisor today!

Open Science & Altmetrics Monthly Roundup (November 2014)

In this month’s roundup: OpenCon 2014, “the crappy Gabor paper” snafu, sexier scientific graphics and 7 other ways November was a big month for Open Science and altmetrics. Read on!

“Generation Open” organizes itself at OpenCon 2014

An international group of students and early career researchers met at the first annual OpenCon meeting in Washington DC. While we couldn’t attend in person, we followed the conference via Twitter and livestream, as did many others from around the world. It was excellent.

Among the participants were Patrick Brown (PLOS), Victoria Stodden (Stanford), Erin McKiernan (Wilfrid Laurier University), and Pete Binfield (PeerJ), all of whom shared two important messages with the attendees: 1) even as a junior scientist, you can make choices that support open access, but 2) don’t feel bad if you’re not yet an OA revolutionary–it’s the responsibility of more senior scientists to help pave the way and align career incentives with the OA movement’s aims.

You can read a great summary of OpenCon on the Absolutely Maybe blog, and watch OpenCon sessions for yourself on YouTube.

“The crappy Gabor paper” citation snafu

A recently published journal article gained some unwanted attention in November thanks to a copyediting error: the authors’ note-to-self to cite “the crappy Gabor paper” was left in the version of the article that made it to press, which someone found and shared on Twitter.

Because of the increased social media attention the article received, its altmetrics shot through the roof, prompting cynics to argue that altmetrics must be flawed because they measured the attention the paper received not for its quality, but for a silly copyediting mistake.

We have a different perspective. Altmetrics aggregators like Altmetric.com are useful precisely because they can capture and share data about this silly mistake. After all, such services expose the underlying, qualitative data that shows exactly what people are saying when they mention a paper:

[Screenshot: tweets about the paper, as captured by Altmetric.com]

That exposure is crucial to helping viewers understand why a paper’s been mentioned. Compare that to how traditional citation indices compile a paper’s citations: simple numbers, zero context.
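That underlying data isn’t locked up, either: anyone can pull the counts (and a link to the full report) from Altmetric.com’s public API. Here’s a minimal Python sketch; the DOI is a placeholder, and note that heavy or commercial use of the API requires a key under Altmetric’s terms:

    import json
    import urllib.request

    doi = "10.1234/example.5678"  # placeholder: substitute the paper in question
    url = "https://api.altmetric.com/v1/doi/" + doi

    with urllib.request.urlopen(url) as response:  # 404s if Altmetric has no record
        details = json.load(response)

    # Counts by source, plus a link to the report showing the actual posts.
    print(details.get("cited_by_tweeters_count", 0), "tweeters")
    print(details.get("details_url"))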

Towards sustainable software at WSSSPE2

The 2nd Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2) met in New Orleans to discuss what barriers exist to the long-term sustainability of code. Turns out, there are many, and they’re mostly rooted in a lack of proper incentives for software creators.

Many interesting papers were presented at the meeting; links to all of the conference papers, along with detailed notes, can be found on the WSSSPE2 website.

Other altmetrics & open science news

  • Are you a science blogger? Paige needs you! Paige Brown Jarreau (also known as Twitter’s @fromthelabbench) is studying science blogging practices for her PhD research. Respond to her “Survey for Science Bloggers” here.

  • Major UK research funder describes how they use altmetrics to gauge public engagement for the studies they fund: the Wellcome Trust recently published an article in PLOS Biology that outlines how the group uses altmetrics to learn about the societal impacts of the work they support. They also suggest that altmetrics may play an important role in research evaluation in the years to come. Read the full article here.

  • Meet Impactstory’s Advisor of the Month, Open Access superadvocate Keita Bando: We chatted with Keita about his work founding MyOpenArchive, a precursor to Figshare. Keita also shared his perspective on librarians’ role in helping scientists navigate the often-complicated options they have for practicing Open Science, and gave us a look at his app that makes it much easier for researchers to sync their Mendeley and ORCID profiles. Read the full interview here on the Impactstory blog.

  • How a rip-off website is polluting Google Scholar profiles with spam citations: the Retraction Watch blog reports that a website created to illegally resell content to Chinese academics is inadvertently adding fake publications to some scientists’ Google Scholar profiles. Individual scholars can remove spam citations from their profiles themselves. Read more on Retraction Watch.

  • You can now batch upload publications to your ORCID profile: researcher identification service ORCID is great at finding your publications and other scholarly outputs automatically, but sometimes you need to add content manually. Luckily, ORCID just added a BibTeX import option, meaning you can upload many documents at once using a file that’s easily exported from Mendeley, ADS, and many other reference management applications.

  • Standards for scientific graphics from 1914 could make modern Open Science better: Jure Triglav recaps some important reports from the early 1900s that have since been largely forgotten. In them, some of science’s brightest minds set rules for creating scientific graphics, including no-brainers like “display the data alongside the graph”–which for some reason we don’t regularly practice today. Triglav also offers a proof-of-concept for how open data can be used to create better, more informative graphics in the modern era. Read the full article here.

  • Older articles more likely to be cited, now that they’re increasingly online: Google Scholar reports that there’s been a 28% growth in citations to older articles between 1990 and 2013 due to their increased availability online. It’s a great argument for “opening up” your older scholarship, isn’t it?

What was your favorite open science or altmetrics happening from November?

Ours was probably the Impactstory November Impact Challenge, but we couldn’t fit it into this roundup! (See what we did there?) Share your favorite news in the comments below.

Why Nature’s “SciShare” experiment is bad for altmetrics

Early last week, Nature Publishing Group announced that 49 titles on Nature.com will be made free to read for the next year. They’re calling this experiment “SciShare” on social media; we’ll use the term as a shorthand for their initiative throughout this post.

Some have credited Nature for taking an incremental step toward embracing Open Access. Others have criticized the company for diluting true Open Access and encouraging scientists to share DRM-crippled PDFs.

As staunch Open Access advocates ourselves, we agree with our board member John Wilbanks: this ain’t OA. “Open” means open to anyone, including laypeople searching Google, who don’t have access to Nature’s Magic URL. “Open” also means open for all types of reuse, including tools to mine and build next-generation value from the scholarly literature.

But there’s another interesting angle here, beyond the OA issue: this move has real implications for the altmetrics landscape. Since we live and breathe altmetrics here at Impactstory, we thought it’d be a great time to raise some of these issues.

Some smart people have asked, “Is SciShare an attempt by Nature to ‘game’ their altmetrics?” That is, is SciShare an attempt to force readers to view content on Nature.com, thereby increasing total pageview statistics for the company and their authors?

Postdoc Ross Mounce explains:

If [SciShare] converts some dark social sharing of PDFs into public, trackable, traceable sharing of research via non-dark social means (e.g. Twitter, Facebook, Google+ …) this will increase the altmetrics of Nature relative to other journals and that may in turn be something that benefits Altmetric.com [a company in which Macmillan, Nature’s parent company, is an investor].

No matter Nature’s motivations, SciShare, as it’s implemented now, will have some unexpected negative effects on researchers’ ability to track altmetrics for their work. Below, we describe why, and point to some ways that Nature could improve their SciShare technology to better meet researchers’ needs.

How SciShare works

SciShare is powered by ReadCube, a reference manager and article rental platform that’s funded by Macmillan via their science start-up investment imprint, Digital Science.

Researchers with subscription access to an article on Nature.com copy and paste a special, shortened URL (e.g. http://rdcu.be/bKwJ) into email, Twitter, or anywhere else on the Web.

Readers who click on the link are directed to a version of the article that they can freely read and annotate in their browser, thanks to ReadCube. Readers cannot download, print, or copy from the ReadCube PDF.

The ReadCube-shortened URL resolves to a Nature-branded, hashed URL that looks like this:

[Screenshot: the resolved, Nature-branded hashed URL]

The resolved URL doesn’t include a DOI or other permanent identifier.
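You can check this yourself by following the short link’s redirects; here’s a quick Python sketch using the example link above (exactly what comes back depends on how ReadCube serves the page, but the point is that nothing DOI-like survives in the final address):

    import urllib.request

    short_url = "http://rdcu.be/bKwJ"  # the ReadCube-shortened example above

    with urllib.request.urlopen(short_url) as response:
        final_url = response.geturl()  # urlopen follows redirects automatically

    print(final_url)                 # a long, hashed, Nature-branded address
    print("10.1038/" in final_url)   # False: Nature's DOI prefix is nowhere in it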

In the ReadCube interface, users who click on the “Share” icon see a panel that includes a summary of Altmetric.com powered altmetrics (seen here in the lower left corner of the screen):

[Screenshot: the ReadCube “Share” panel, with Altmetric.com metrics in the lower-left corner]

The ReadCube-based Altmetric.com metrics do not include pageview numbers. And because ReadCube doesn’t work with assistive technology like screen readers, it also prevents tracking the small portion of traffic that visually-impaired readers might account for.

That said, the potential for tracking new, ReadCube-powered metrics is interesting. ReadCube allows annotations and highlighting of content, and could potentially report both raw numbers and also describe the contents of the annotations themselves.

The number of redirects from the ReadCube-branded, shortened URLs could also be illuminating, especially when reported alongside direct traffic to the Nature.com-hosted version of the article. (Such numbers could provide hard evidence of the proportion of OA vs. toll-access use of Nature journal articles.) And sources of Web traffic give a lot of context to the raw pageview numbers, as we’ve seen from publishers like PeerJ:

[Screenshot: referral sources for an article on PeerJ]

After all, referrals from Reddit usually mean something very different than referrals from PubMed.

Digital Science’s Timo Hannay hints that Nature will eventually report download metrics for their authors. There’s no indication as to whether Nature intends to disclose any of the potential altmetrics described above, however.

So, now that we know how SciShare works and the basics of how they’ve integrated altmetrics, let’s talk about the bigger picture. What does SciShare mean for researchers’ altmetrics?

How will SciShare affect researchers’ altmetrics?

Let’s start with the good stuff.

Nature authors will probably reap a big benefit thanks to SciShare: they’ll likely see higher pageview counts for the Nature.com-hosted versions of their articles.

Another positive aspect of SciShare is that it provides easy access to Altmetric.com data. That’s a big win in a world where not all researchers are aware of altmetrics. Thanks to ReadCube’s integration of Altmetric.com, now more researchers can find their article’s impact metrics. (We’re also pleased that Altmetric.com will get a boost in visibility. We’re big fans of their platform, as well as customers–Impactstory’s Twitter data comes from Altmetric.com).

SciShare’s also been implemented in such a way that the ReadCube DRM technology doesn’t affect researchers’ ability to bookmark SciShare’d articles in reference managers like Mendeley. Quick tests with the Pocket and Delicious bookmarking services also suggest they work well. That means that social bookmarking counts for an author’s work will likely not decline. (I point this out because when I attempted to bookmark a ReadCube.com-hosted article using my Mendeley browser bookmarklet on Thursday, Dec. 4, I was prevented from doing so, and actually redirected to a ReadCube advertisement. I’m glad to say this no longer seems to be the case.)

Those are the good things. But there are also a few issues to be concerned about.

SciShare makes your research metrics harder to track

The premise of SciShare is that you’ll no longer copy and paste an article’s URL when sharing content. Instead, they encourage you to share the ReadCube-shortened URL. That can be a problem.

In general, URLs are difficult to track: they contain weird characters that sometimes break altmetrics aggregators’ search systems, and they often go dead. In fact, there’s no guarantee that these links will stay live past the next 12 months, when the SciShare pilot is set to end.

Moreover, neither the ReadCube URL–nor the long, hashed, Nature.com-hosted URL that it resolves to–contains the article’s DOI. DOIs are one of the main ways that altmetrics tracking services like ours at Impactstory can find mentions of your work online. They’re also preferable when sharing links because they’ll always resolve to the right place.

So what SciShare essentially does is introduce two new messy URLs that will be shared online, and that have a high likelihood of breaking in the future. That means a bigger potential for messy data to appear in altmetrics reports.
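To make that concrete: aggregators typically spot papers by scanning shared links for a DOI pattern, roughly like the sketch below (the first link carries a made-up DOI; the regex is a simplified version of the commonly used Crossref-style pattern):

    import re

    # Simplified Crossref-style DOI pattern.
    DOI_PATTERN = re.compile(r"10\.\d{4,9}/[-._;()/:A-Za-z0-9]+")

    links = [
        "http://dx.doi.org/10.1038/srep00001",  # hypothetical DOI link: trackable
        "http://rdcu.be/bKwJ",                  # ReadCube short link: opaque
    ]

    for link in links:
        match = DOI_PATTERN.search(link)
        print(link, "->", match.group(0) if match else "no DOI found")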

SciShare’s metrics aren’t as detailed as they could be

The Altmetric.com-powered altmetrics that ReadCube exposes are fantastic, but they lack two important data types that other providers expose: citations and pageviews.

On a standard article page on Nature.com, there’s an Article Metrics tab. The Metrics page includes data not only from Altmetric.com, but also citation counts from CrossRef, Web of Science, and Scopus, plus pageview counts. And completely separate systems like Impactstory.org and PlumX expose still more citation data, sourced from Wikipedia and PubMed. (We’d provide pageview data if we could. But that’s currently not possible. More on that in a minute.)

ReadCube’s deployment of Altmetric.com data also decontextualizes articles’ metrics. They have chosen to show only the summary view of the metrics, with a link out to the full Altmetric.com report:

[Screenshot: the Altmetric.com summary view in ReadCube]

Compare that to what’s available on Nature.com, where the Metrics page showcases the Altmetric.com summary metrics plus Altmetric.com-sourced Context statements (“This article is in the 98th percentile compared to articles published in the same journal”), snippets of news articles and blog posts that mention the article, a graph of the growth in pageviews over time, and a map that points to where your work was shared internationally:

[Screenshot: the Article Metrics page on Nature.com]

More data and more context are very valuable when presenting metrics, so we think this is a missed opportunity for the SciShare pilot.

SciShare isn’t interoperable with all altmetrics systems

Let’s assume that the SciShare experiment results in a boom in traffic to your article on Nature.com. What can you do with those pageview metrics?

Nature.com–like most publishers–doesn’t share its pageview metrics via API. That means you have to look up, copy, and paste those numbers manually each time you want to record them. Not an insurmountable barrier to data reuse, but still a pain.

Compare that to PLOS. They freely share article view and download data via API, so you can easily import those numbers to your profile on Impactstory or PlumX, or export them to your lab website, or parse them into your CV, and so on. (Oh, the things you can do with open altmetrics data!)
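As a sketch of what that openness makes possible: the snippet below pulls summary counts for a single article. It assumes the ALM v5 endpoint, its “viewed”/“saved” summary fields, and a free PLOS API key, so treat it as illustrative and check the ALM documentation for the current interface:

    import json
    import urllib.request

    doi = "10.1371/journal.pone.0000000"  # placeholder DOI
    api_key = "YOUR_FREE_PLOS_API_KEY"    # placeholder key

    url = ("http://alm.plos.org/api/v5/articles"
           "?ids=" + doi + "&api_key=" + api_key)

    with urllib.request.urlopen(url) as response:
        article = json.load(response)["data"][0]

    # Summary counts as reported by PLOS's ALM (Lagotto) service.
    print("viewed:", article["viewed"])
    print("saved:", article["saved"])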

You also cannot use the ReadCube or hashed URLs to embed the article full-text into your Impactstory profile or share it on ResearchGate, meaning that it’s as difficult as ever to share the publisher’s version of your paper in an automated fashion. It’s also unclear whether the “personal use” restriction on SciShare links means that researchers will be prohibited from saving links publicly on Delicious, posting them to their websites, and so on.

How to improve SciShare to benefit altmetrics

We want to reiterate that we think SciShare is great for our friends at Altmetric.com, thanks to their integration with ReadCube. And the greater visibility that this integration brings to altmetrics overall is important.

That said, there’s a lot that Nature can do to improve SciShare for altmetrics. The biggest and most obvious idea is to do away with SciShare altogether and simply make their entire catalogue Open Access. But it looks like Nature (discouragingly) is not ready to do this, and we’re realists. So, what can Nature do to improve matters?

  • Open up their pageview metrics via API to make it easier for researchers to reuse their impact metrics however they want
  • Release ReadCube resolution, referral traffic and annotation metrics via API, adding new metrics that can tell us more about how content is being shared and what readers have to say about articles
  • Add more context to the altmetrics data they display, so viewers have a better sense of what the numbers actually mean
  • Do away with hashed URLs and link shorteners, especially the latter, which makes it difficult to track all mentions of an article on social media

We’re hopeful that SciShare overall is an incremental step towards full OA for Nature. And we’ll be watching how the SciShare pilot changes over time, especially with respect to altmetrics.

Update: Digital Science reports that the ReadCube implementation has been tested to ensure compatibility with most screen readers.

Impact Challenge Day 30: Create a comprehensive impact profile at Impactstory.org

Yesterday, we covered all the ways that you can dig up evidence of your impacts online. You learned that metrics for your research exist across more than 18 platforms all around the Web. That’s a lot of data to manage.

What you need now is a single place to view your metrics (and the underlying qualitative data). You also need a way to share your metrics with others. That’s where Impactstory comes in.

Impactstory is a non-profit webapp that compiles data from across the Web on how often (and by whom) your research is being shared, saved, discussed, cited and more.

We automate much of the work of collecting impact metrics, so you don’t have to. And we provide rich, contextualized, Open metrics alongside the underlying data, so you can learn a lot in one place (and reuse most of the metrics however you want).

In today’s challenge, you’ll explore creating a comprehensive impact profile on Impactstory.org. Let’s get started!

Step 1. Explore an Impactstory profile

[Screenshot: Holly Bik’s Impactstory profile]

One of our favorite Impactstory profiles belongs to genomics researcher Holly Bik. Her profile epitomizes all of the cool things you can do on Impactstory:

  • Discover metrics for your work from scholarly and popular social media
  • Import all of your papers, datasets, software, slide decks, and other scholarly products into a single profile
  • Highlight the scholarship and metrics you’re most proud of in your “Selected Works” and “Key Metrics” sections of your profile homepage
  • Learn who’s talking about your work and what they’re saying by drilling down into the metrics and underlying data
  • Connect your account to third-party services like Figshare, ORCID, and GitHub to get automatic updates & import your new research

Go ahead and poke around a bit on Holly’s profile. Take 5 minutes or so to explore. Go ahead, we’ll wait here.

Not everyone’s profile will look like Holly’s, to be sure. But no matter your career stage, chances are that an Impactstory profile will give you a lot of insight as to your many research impacts.

Step 2. Sign up for Impactstory

Now let’s get you set up with a free Impactstory trial.

You might have heard: we’re a subscription-based service ($60/year or $10/month). But we’re not going to make a hard pitch for you to subscribe.

Instead, you’re going to sign up for a free, 30-day trial, during which you’ll get a better chance to decide whether Impactstory is right for you (and worth paying for*). Just head to impactstory.org and sign up for a profile.

That’s it! Easy, huh?

Next, let’s walk through the simple steps it takes to get your scholarship onto Impactstory.

* We also offer fee waivers for anyone who can’t afford a subscription.

Step 3. Automate your Impactstory profile

[Screenshot: the “Master Import Controls” page]

Once you’ve signed up, you’ll land on the “Master Import Controls” page.

Next, you’ll be prompted to connect your accounts from across the Web. This will allow you to batch import many of your publications, software, data, and other scholarship that’s hosted elsewhere. And, once connected, we’ll automatically import your new scholarship, as it’s created.

As of this writing, you can connect Figshare, ORCID, GitHub, Publons, Slideshare, and Twitter for auto-importing of data and scholarship. You can also add a link to your Google Scholar profile and import those publications all at once using BibTeX.

We’ll use Figshare as an example for how to connect your Impactstory account to other services. To get started:

  • Click on the tile for the service you want to connect (in this case, Figshare)
  • Open a new browser window and get your Figshare author page URL (log in to Figshare, click your name and photo in the upper-right corner, click “My profile,” and then copy the URL from your browser’s address bar)
  • Switch back to the Impactstory browser window. In the Figshare pop-up, paste your Figshare author page URL into the box under “figshare author page URL”
  • Click the green “Connect to Figshare” button
  • You’re now connected!

Impactstory will then auto-import all of your public Figshare products and their metrics, and also update your Impactstory profile weekly with any new Figshare products and metrics.

The instructions above work for ORCID, GitHub, Publons, Slideshare, and Twitter, too. Just log in to the appropriate web service to get your URL, username, or ORCID iD, and click the corresponding tile on Impactstory’s “Master Import Controls” page to enter it.

Step 4. Import your other scholarship to Impactstory

[Screenshot: adding products individually by ID]

It’s possible that you’ve got scholarly products squirreled away in places we can’t automatically import from. Maybe you’ve contributed to a GitHub repository that you don’t own, have a standalone website devoted to a research project, or have a video abstract for one of your articles.

No matter what you want to add to your profile as an individual product, here’s how to do it.

From the “Master Import Controls” page:

  • Click the “Add products individually by ID” link
  • On the next page, paste the identifier(s) for the product(s) you want to track. If you are adding more than one product at a time, be sure to add only one identifier per line (see the example after these steps).
  • Once you’ve added the identifiers for all the products you want to track, click the blue “Import” button. The products will be added to your profile.
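For example, to add a paper, a GitHub repository you’ve contributed to, and a video abstract in one go, your batch might look like this (all three identifiers are made up):

    10.1234/example.5678
    https://github.com/example/shared-analysis-code
    http://example.com/my-video-abstract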

Step 5. Step back and admire your profile so far

Now you’ve got all your scholarly products on Impactstory. Let’s take a look at how they appear on the genre pages.

From your main profile page, click on the links in the left-hand navigation bar that correspond with the scholarly genre you want to explore.

For example, if you’ve got articles on your profile, go ahead and click on the “articles” link. Here’s what Holly Bik’s Articles page looks like:

[Screenshot: Holly Bik’s Articles page]

You can hover over any of the blue or green badges to see the underlying data that document your scholarly and public impacts:

[Screenshot: hovering over a badge reveals the underlying data]

Or you can click on any title to see an in-depth description of the article and a summary of its metrics. We auto-import as much information as possible, including your full citation and your abstract:

[Screenshot: an article page with full citation, abstract, and metrics summary]

Click on the “Full-text” icon to see an embedded version of your paper (and you can add a link to the full-text, Open Access version of your paper, if we didn’t auto-import it for you–more on that below).

Click on the “Metrics” icon to see a drill-down view of your paper’s metrics, along with important context that we provide in percentiles:

[Screenshot: the drill-down Metrics view with percentile context]

And you can click through any of the specific metrics to go to the data provider’s website, where you can explore the underlying data:

[Screenshot: the data provider’s page with the underlying data]

Back on your profile, you can also click the “Map” icon to learn where in the world your paper has been bookmarked on Mendeley, tweeted about, or viewed on Impactstory.org:

[Screenshot: the Map view of worldwide activity]

Hovering over any country gives you more information about the impacts that have happened in that country; you can also drill down into each country’s activity using the handy table at the lower left of the page.

Step 6. Add links to your open access work

Now that you’ve seen all the ways your Open Access work is being reused online, let’s get more of your OA work onto your Impactstory profile.

For any article, dataset, or other scholarly product that’s not already embedded in your Impactstory profile:

  • Go to the main item page
  • Click on the “Full-text” icon
  • You’ll see an option to “Share your article” by uploading a full-text copy of your work or providing a URL.
  • Upload your article or provide your URL, and you’re done!

Step 7. Pretty up your profile

Now it’s time to put the finishing touches on your entire profile.

On your main profile page, add a short bio and a photo of yourself.

On your product pages for your most important research, add keywords and abstracts that’ll help others find your work more easily.

To add the bio, keywords, and abstract, just click on the field you want to edit, type in what you want to add, and then click the blue checkmark icon to save it to your profile.

That’s it! You now have a beautiful, complete Impactstory profile! Congrats!

Step 8. Dig into your metrics & notification emails

Now that your profile is complete, you’ll have 30 days’ worth of free trial to discover new metrics that your work has received.

Impactstory updates your profile with new metrics (and imports new products) on a weekly basis. Any new metrics will be highlighted on your badges.

We’ll also send you weekly notification emails that highlight your top 10 “greatest hits” metrics for the week.

[Screenshot: a sample notification email]

Your notification emails will usually include milestone metrics (“You’ve just passed 2,000 views on your Slideshare slides!”) and sometimes incremental metrics for your less popular research materials (“You got 1 Figshare view for your 2001 dataset, ‘Datum Obscurus.’ That brings your total views up to 7.”)

These notifications include contextual information, such as your total number of metrics to date for that item, and what percentile your item’s in, relative to other research products created in the same year or published in the same discipline.

If you’d rather receive your Notification emails more frequently, less frequently, or not at all, you can change your settings at impactstory.org/settings/notifications.

Step 9. Share your success far and wide

Now that you’ve got your Open impact data, how will you use it?

Well, some researchers use altmetrics to document their impact for grant applications and tenure. We’ve also heard of scientists using them for promotion and annual reviews. Consider whether these scenarios would work for you. The latter in particular is a great way to test the waters and see whether your supervisors and colleagues are amenable to altmetrics.

You can also share altmetrics-inspired warm and fuzzies with your collaborators. Email your co-authors with a link to your articles on Impactstory, so they can check out the data for themselves. It’s a great feeling when you see in black and white the effect your work’s having on others. Share it! :)

We also suggest putting a link to your Impactstory profile on your website or blog, and in your email signature. These are all super-effective ways to quickly share both your research and your impact with your colleagues.

When sharing your Impactstory data and profile, keep in mind that numbers are only one useful part of the data. You can print out your impact map and include it in an annual review; quote from open peer reviews that praise the quality of your research in your tenure dossier; and learn who’s sharing your work so you can connect with them via social media.

But ultimately, it’s up to you to decide what uses will be the best for you, depending upon your academic environment. Once you decide, let us know! We love to hear how scientists are using their Impactstory profiles.

Limitations

Many popular data providers, including Google Scholar, Academia.edu, and ResearchGate, won’t share their data with us (or anyone else) via open API. So, unfortunately, we can’t import metrics from those profiles into your Impactstory account.

It’s also hard for us (and all other altmetrics aggregators) to track scholarly products by URL alone; there simply haven’t been great data sources for doing that since Topsy was bought by Apple. We’re continuing to look for ways to get you this data, but in the meantime, we encourage you to mint DOIs for your work so we can track it.

Homework

Now that you’ve got an Impactstory profile, make it awesome! Fill in the gaps in your publication history, add your most impactful work, connect your accounts, and so on. At the very least, information for all of your most important research products should be in your profile.

For your five most important products, add links to the Open Access versions of those works, if they’re available and you have the rights to post them. (Remember, publishers’ restrictions might prohibit you from posting certain versions of your articles online.)

Once everything’s imported, it’s time to clean up your profile data. We import and clean up a lot of dirty and duplicate data for you, but some things might fall through the cracks. Here’s what to look for:

  • Mislabeled products: add missing descriptive information (journal names, authors, abstracts, and keywords that can help others find your work). It’s as easy as clicking in the area that needs to be updated, adding the info, and then clicking the blue checkmark button to save it.

  • Duplicate products: choose which version you’d like to delete, tick the box next to it, and click the trashcan icon at the top of your profile to get rid of it.

  • Miscategorized products: sometimes, products will end up in the Webpages genre or in other inappropriate places on your profile, due to incomplete descriptive information. To move a product from one genre page to another, check the box next to the item(s) to be moved, then click the “Move” folder icon at the top of your profile, select the appropriate genre from the drop-down menu, and you’re done!

Your final, enjoyable task is to now dig into the data that your Impactstory profile provides. Find unexpected mentions or reuse of your work online. Think about how you might use that data in a professional context. And give yourself a big pat on the back for completing the final Impact Challenge.

Congratulations!


You’ve successfully made it through all 30 days of the Impact Challenge! We’re proud of you!

You’re now an Open, web-savvy scientist who’s made valuable connections online and in real life. You’re sharing more of your work than you were before, and have found many new ways to get your work to those who are interested. And you’re able to track the success of your efforts, and the real-time impact of your scholarship.

We’ve had a lot of fun writing these Impact Challenges and talking with all of you who’ve participated. Thanks for joining in! And feel free to reach out if you’ve got ideas for future Impact Challenges.

PS

If you’ve accomplished all 30 Impact Challenges, we’ve got a gift for you and all other FINISHERs!

[Image: the Impact Challenge FINISHER t-shirt]

The full rules for claiming your shirt can be found here.