
Article-level metrics: which service to choose?

26 Oct, 12 | by Claire Bower, Digital Comms Manager, @clairebower

Article-level metrics (or ALMs) were a hot topic at this week's HighWire publisher meeting in Washington. (HighWire hosts both the BMJ and its stable of 42 specialist journals.) From SAGE to eLife, publishers seem sold on the benefits of displaying additional context alongside articles, enabling readers to assess each article's impact. These statistics range from traditional indicators, such as usage statistics and citations, to alternative measures (or altmetrics) such as mentions on Twitter and in the mainstream media.

So, what services are available to bring this information together in one simple interface? There are quite a few contenders in this area, including Plum Analytics, the PLoS Article-Level Metrics application, Science Card, CitedIn and ReaderMeter. One system in particular has received a good deal of attention in the past few weeks: ImpactStory, a relaunched version of total-impact. It's a free, open-source web app that has been built with financial help from the Sloan Foundation (and others) "to help researchers uncover data-driven stories about their broader impacts".

To use ImpactStory, authors first need to point to their scholarly products: whether articles from Google Scholar Profiles or ORCID, presentations on SlideShare, or datasets on Dryad. The system then searches over a dozen web APIs to learn where these outputs are making an impact, including Facebook, PubMed, Wikipedia, Scopus, CiteULike, Mendeley and Dryad. The impacts are then categorised along two dimensions: audience (scholars or the public) and type of engagement with research (view, discuss, save, cite, and recommend). The analysis goes even further:
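The two-dimensional rollup described above can be sketched as a simple mapping from metric sources to (audience, engagement) cells. Note that the source-to-category mapping below is illustrative only, not ImpactStory's actual configuration:

```python
# Illustrative sketch of an ImpactStory-style categorisation:
# each metric source is assigned an audience and an engagement type,
# and raw counts are rolled up into the resulting grid.
# The mapping below is a hypothetical example, not ImpactStory's real one.

CATEGORIES = {
    # source: (audience, engagement type)
    "mendeley_readers":    ("scholars", "save"),
    "citeulike_bookmarks": ("scholars", "save"),
    "scopus_citations":    ("scholars", "cite"),
    "wikipedia_mentions":  ("public",   "discuss"),
    "facebook_shares":     ("public",   "discuss"),
    "pdf_views":           ("public",   "view"),
}

def categorise(raw_counts):
    """Roll raw per-source counts up into the audience x engagement grid."""
    grid = {}
    for source, count in raw_counts.items():
        audience, engagement = CATEGORIES[source]
        key = (audience, engagement)
        grid[key] = grid.get(key, 0) + count
    return grid

# Example: 17 Mendeley readers and 3 CiteULike bookmarks both count as
# scholars saving the work; Facebook shares count as public discussion.
print(categorise({"mendeley_readers": 17,
                  "citeulike_bookmarks": 3,
                  "facebook_shares": 4}))
# → {('scholars', 'save'): 20, ('public', 'discuss'): 4}
```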

In each dimension, they also figure out a percentile score compared to a baseline; in the case of articles, the baseline is “articles indexed in Web of Science that year.” If your 2009 paper has 17 Mendeley readers, for example, that puts you in the 87th-98th percentile of all WoS-indexed articles published in 2009 (we report percentiles as a range expressing the 95% confidence interval).
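The percentile-range idea in the quote above can be sketched as follows: score an article against a sample of the baseline population, then report the uncertainty that comes from sampling as an interval. This is a minimal illustration using a normal approximation to the binomial for the 95% interval; ImpactStory's actual method is not documented here, so treat the details as assumptions:

```python
import math

def percentile_range(value, baseline_sample, z=1.96):
    """Percentile of `value` within a sample drawn from a baseline
    population (e.g. same-year Web of Science articles), reported as a
    95% confidence interval via the normal approximation to the binomial.
    This is an illustrative sketch, not ImpactStory's actual algorithm."""
    n = len(baseline_sample)
    p = sum(1 for b in baseline_sample if b < value) / n  # observed fraction below
    half_width = z * math.sqrt(p * (1 - p) / n)           # sampling uncertainty
    lo = max(0.0, p - half_width)
    hi = min(1.0, p + half_width)
    return round(lo * 100), round(hi * 100)

# Toy baseline: Mendeley reader counts for 100 hypothetical 2009 articles.
baseline = list(range(100))
print(percentile_range(90, baseline))
# → (84, 96), i.e. "84th-96th percentile" for this toy sample
```

With a larger baseline sample the interval narrows, which matches the intuition that ranges like "87th-98th percentile" reflect sampling uncertainty rather than imprecise counting.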

In similar news, NPG announced yesterday that article-level metrics are now available on twenty of its journals, powered by Altmetric (another big player in this field). Nature.com users can view an article's citation data, page views, news mentions, blog posts and social shares on Facebook and Twitter.

“These metrics transparently show the impact and reach of articles published in our journals, enable evaluation, and help readers see what others find interesting and notable,” said Kira Anthony, Editorial Development Manager, NPG. “The interest in both traditional metrics and ‘altmetrics’ from the research community is clear, and NPG is pleased to offer this improved functionality and service to our readers and authors. Institutions, funders and those mining data are also beginning to look at this information, for example for the 2014 Research Excellence Framework (REF) evaluation in the UK. Some REF panels will look at article citations and consider other measures of tracking research impact.”

While publishers are increasingly convinced of the advantages of displaying such statistics to their readers and authors, the question now seems to be which service to choose from the plethora on offer. It will certainly be interesting to see whether any of these systems achieves market dominance among users and publishers.


BMJ Web Development Blog

Keep abreast of the technological developments being implemented on the BMJ journal websites.


