The Black Hole

Measuring the non-academic impact of your science

BY DAVID KENT | SEP 23 2013

Citations are the standard benchmark for scientists to assess the impact of their work. Highly cited papers have clearly influenced the field and few would dispute their importance. What citations do not measure, though, is the wider impact of a paper – do industrial projects build from these discoveries, are schoolchildren interested in them, do they inspire governments and funders to direct more resources into research efforts? Tools for measuring such impact are rare, but recently I was introduced to a quick and easy web-based program for getting a bird's-eye view of the non-academic impact of papers.

Altmetric is very simple and can quickly help determine the wider impact of a paper. The company has developed a free toolbar app that anyone can use, as well as several paid products for those interested in higher-level metrics. Out of curiosity, I ran the tool on some papers to see if anything interesting popped up and was impressed with the breadth of data captured as well as the easy-to-navigate web interface.

What does it measure?

From what I’ve seen so far, Altmetric collects data from Twitter, blogs, news agencies, Faculty of 1000, Google+, etc., and uses it to assess a paper’s impact relative to other articles. For some articles, it has sufficient data to compare them to other articles from the same journal or to all articles in its database. It then produces a score that represents the non-citation activity around an article.
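For the programmatically inclined, the same numbers can be pulled in a few lines. The sketch below assumes the free public Altmetric API endpoint at api.altmetric.com/v1/doi/&lt;doi&gt; and the response field names it returned at the time of writing; neither is described in this post, so treat both as assumptions rather than documentation.

```python
# Minimal sketch: fetch an Altmetric record for a DOI and summarize it.
# Assumes the free public endpoint https://api.altmetric.com/v1/doi/<doi>
# and these response field names; both are assumptions, not taken from the post.
import requests


def altmetric_summary(doi: str) -> dict:
    """Return the headline Altmetric score and a few per-source counts for a DOI."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    resp.raise_for_status()  # a 404 simply means Altmetric has no record for this DOI
    data = resp.json()
    return {
        "score": data.get("score"),                        # headline Altmetric score
        "tweets": data.get("cited_by_tweeters_count", 0),  # Twitter accounts mentioning it
        "blogs": data.get("cited_by_feeds_count", 0),      # blog posts
        "news": data.get("cited_by_msm_count", 0),         # mainstream news stories
    }


if __name__ == "__main__":
    # Hypothetical DOI, used purely for illustration.
    print(altmetric_summary("10.1234/example-doi"))
```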

Sometimes things agree quite well – I have a 2008 paper and a 2009 paper in the journal Blood. Citation tracking tells me that the latter has had the larger impact (86 vs. 24 citations), and the Altmetric scores point the same way (7 and 1, respectively). However, when I compared the 86-citation paper to a 2007 Cell Stem Cell paper I was part of, which also scored an Altmetric 7, it comes in substantially lower than the CSC paper’s 178 citations. Could this be down to journal source alone, or does it actually reflect a similar impact outside of peer-reviewed academic articles? Significantly more analysis of other articles would be needed to assess such possibilities.

What else do you get?

An excellent component of the Altmetric page is the “score” function, which shows how an article compares to other articles from the same journal and the same time period. For example, the paper we published in PLoS Biology this past June scores an Altmetric 16, placing it in the 79th percentile of all PLoS Biology articles, the 92nd percentile of articles of a similar age, and the 95th percentile of all articles in the database. Currently there are no papers citing this article, as it is only three months old, but when one is on the job hunt these additional metrics might be useful for hiring committees to determine the relative standing of a paper.
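If you want those percentile comparisons programmatically, the public API appears to return them in a "context" object alongside the score. The short sketch below assumes that structure (journal, similar-age and all-articles percentiles); the field names are my assumption about the API response, not something documented in this post.

```python
# Sketch of the "score in context" comparison. The "context" object and its
# field names (journal, similar_age_3m, all, each with a "pct" percentile)
# are assumptions about the public API response, not taken from the post.
import requests


def altmetric_percentiles(doi: str) -> dict:
    """Return the percentile standings Altmetric reports for a DOI, if present."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    resp.raise_for_status()
    ctx = resp.json().get("context", {})
    return {
        "vs_same_journal": ctx.get("journal", {}).get("pct"),        # percentile within the journal
        "vs_similar_age": ctx.get("similar_age_3m", {}).get("pct"),  # percentile among similar-age articles
        "vs_all_articles": ctx.get("all", {}).get("pct"),            # percentile across everything tracked
    }
```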

Is it all just vanity?

Many people will look at these types of metrics because they are curious about their own articles, and many will simply dismiss the whole altmetrics exercise as one of pure vanity, of no added benefit to science or scientists. No doubt there is an element to the tool that will encourage scientists to look up their own articles, but the utility of Altmetric is quite a lot broader. It collates related news stories and blog entries, it assesses papers relative to other papers in the same journal from the same time, and it actively excludes citations from its calculations, thereby offering a novel way of assessing papers.

And importantly, we should also ask what exactly citations measure – in many cases, citations accumulate for reasons that do not reflect the quality or impact of the work: prolific authors self-citing, field popularity (e.g., stem cells, microRNAs), perceived journal importance (e.g., Nature articles get read because they are in Nature). At least Altmetric offers a way to assess articles on their individual merits and to gauge their broader importance (e.g., impact in industry news).

Bias toward new articles

Social media, especially as it pertains to discussing scientific articles, is very new, and therefore articles published before a certain time suffer disproportionately when it comes to Twitter, Google+, etc. Furthermore, new social media applications will inevitably emerge in future years, making this a persistent problem for Altmetric. We’ll have to wait and see how this affects the tool as it develops.

Until then, we will continue to update readers on novel tools for assessing biomedical researchers. For more information, check out our previous entries:

ABOUT DAVID KENT
Dr. David Kent is a principal investigator at the York Biomedical Research Institute at the University of York, York, UK. He trained at Western University and the University of British Columbia before spending 10 years at the University of Cambridge, UK where he ran his research group until 2019. His laboratory's research focuses on the fundamental biology of blood stem cells and how changes in their regulation lead to cancers. David has a long history of public engagement and outreach including the creation of The Black Hole in 2009.
COMMENTS

  1. SB / September 25, 2013 at 03:09

    I think the impact of an article in the wider world is a difficult thing to measure, and I suspect that the number of non-journal citations will tell only part of the story (and will be heavily influenced by factors arguably unrelated to the science itself, such as the authors’ social media prowess, popularity, access to a well-oiled university PR machine…). Still, Altmetric is a cool concept, especially given the paucity of similar tools for measuring “broader impacts” of one’s research.

    I am concerned, however, about the additional incentive these types of metrics create for scientists to over-hype their science while ostensibly doing public outreach. Metrics based on sheer quantity of press (or social media) coverage of an article cannot discriminate between meaningful public engagement and overselling your science to attract attention; in fact, they arguably reward the latter behaviour, which I believe has a detrimental rather than a positive impact on the scientific enterprise, and should not be encouraged (I remember hearing very similar criticisms voiced about the 3-minute thesis competitions). They’re also probably subject to the same field popularity biases as journal citations. I remain skeptical for now but hopeful that better systems for measuring “broader impacts” will be developed in the future.

    Of course, that didn’t stop me from using Altmetric to look up the score of papers I’ve been an author on. I was surprised to find out that our recent NCB paper ranked in the 77th percentile of all articles ever tracked, owing to 4 tweets (3 of which were from the journal, our lab account and me) and 1 Facebook share. I guess the take-home message is that at present, most scientists are making little to no effort to promote their science through the channels evaluated by this tool.

  2. Shari Graydon / September 25, 2013 at 12:31

    Very interesting article about a potentially very useful tool. And to the last point made in SB’s comment above — “most scientists are making little to no effort to promote their science through the channels evaluated by this tool” — that’s deeply unfortunate. Although not an early adopter of social media, I’ve become a convert to Twitter’s capacity to engage a broader audience for important information that might otherwise remain confined to journals read by a very small audience. If the research you’re doing has the capacity to contribute to positive developments in even a small corner of any world, it’s worth expanding the potential number of people who hear about it.
