The number of published papers using Altmetrics ‘attention scores’ as a data source to measure impact is rising. According to Google Scholar, there are over 28,000 papers mentioning Altmetrics and impact.
This latest analysis published in PeerJ finds a positive correlation between citation rates and the Altmetric score for papers published in ecology & conservation journals over a 10-year period (2005–2015). This implies: the more a paper gets tweeted, blogged, or talked about in online popular media, the more it will be cited.
This seems like common sense. The more exposure a paper gets online, compared to traditional exposure via journal alerts to a limited number of subscribers, the more people will be aware of it and potentially cite it. This is why we do scicomm. (Although, hopefully, people read a paper first and decide on its quality and relevance before citing.)
But the increasing excitement over Altmetrics as a data source is a little concerning. Sure, it’s easily accessible open data that can be used to answer multiple questions. But no index is perfect. Each one is limited by the way it’s measured, how it’s used and the quality of data it’s based on. This applies to all ‘open’ data sources.
The limitations of Altmetrics only came to my attention recently. It’s a cool index and it gives a little insight into a paper’s social media reach. But…
Last year I co-authored this paper on the value of blogs for ecology academics – incidentally, our key argument is about how influential blogs can be for the science community, which is also one of the key findings in the new PeerJ paper.
Now, I write a blog post for nearly every paper I publish, so of course I did so for this one, as did most of my co-authors. A few weeks later, I decided to have a look at the Altmetrics index of our paper to see how it was tracking. I was surprised to see that my own blog post wasn’t listed in the ‘Blogs’ section (or any other section), even though the blog posts of most of my co-authors were.
Intriguing. My own blog post on my own paper on blogging wasn’t registering at Altmetrics. That was kind of weird.
I had a look through their FAQs, without much insight. I had linked correctly, according to their instructions, my blog had been an established website for nearly 10 years, and it met all their criteria for indexing…so I wasn't sure why my post hadn't shown up. I went back and had a look at the Altmetrics of some of my older papers – my first published paper was in 2010 and I started blogging in 2009. None of the Altmetrics scores of my papers included the blog posts I had written about my own papers. For example, this paper from 2013 includes an article I wrote for Grist about the research, but not the separate post I published on my own blog.
So I wrote to Altmetrics to query why my blog posts were missing from my own papers’ scores. I received a reply with a link to a Google form to submit ‘missed mentions’. The form only allows for one specific mention per submission. So I submitted a few separate forms, not just for the blogging paper, but also for a few older ones.
A few weeks later, I checked again. Lo and behold, my blog post on my blogging paper was now listed. Not in the Blogs tab, where you would expect to find it, but under the ‘Misc.’ tab…along with my other blog posts for completely irrelevant papers that I had also filed ‘missing’ reports for.
I wrote back again to try and correct this. The reply from Altmetrics said: “Our blog tracking is not retroactive. We’ll only be collecting blog posts and putting them on the blogs tab if they are published after we have indexed it.” Fine. But they didn’t explain why my irrelevant blogs had been linked to this particular paper’s Altmetric score.
So now all three of the blog posts I submitted as ‘missing’ (both relevant and irrelevant) are still listed under the Miscellaneous tab of our RSOS blogging paper. (Yet most papers that analyse Altmetrics don’t include the Miscellaneous mentions in their analysis…)
Which brings me back to using Altmetrics as a data source.
I had a look at my own papers. Out of 22 papers published in peer-reviewed indexed journals, only 9 include an Altmetrics score on the webpage. For those 9, there is a weak negative correlation between Altmetrics & citations (r = -0.18, p = 0.63; I ran a mixed model to account for year published, and the effect became weaker). Disclaimer: my data is not representative of any broader effect; I'm just illustrating the point.
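For anyone who wants to try this on their own papers, the basic correlation is only a few lines of code. This is a minimal sketch with made-up numbers (not my actual Altmetric scores or citation counts), just to show the calculation:

```python
import math

# Hypothetical illustration only -- these are invented numbers,
# NOT the real Altmetric scores or citation counts from my papers.
altmetric_scores = [2, 15, 4, 30, 8, 1, 22, 5, 11]
citations = [40, 12, 25, 8, 30, 45, 10, 28, 20]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# For these made-up numbers the correlation comes out negative,
# since the higher-scoring papers were given fewer citations.
r = pearson_r(altmetric_scores, citations)
print(f"r = {r:.2f}")
```

(A proper analysis would also account for publication year, as older papers have had more time to accumulate citations – hence the mixed model above.)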
Altmetrics scores, just like most other freely available data sources online, have huge caveats. They don’t pick up all share sources, and they don’t measure quality of output…which raises questions about their representativeness and use-by date.
Also, not all journals use them. For example, PlumX seems to be the latest analytics measure replacing Altmetrics; some of the higher-impact journals are already using it, like The Lancet & PLOS. So how do published analyses of Altmetrics contribute to long-term knowledge?
Yes, we should be sharing research papers via social media, regardless of how much 'attention' they get. Let's not wait for Altmetrics scores to inspire us to share our research! But ambiguous social media scores tell us very little about impact. Only time will tell the quality and relevance of a research paper.
© Manu Saunders 2018