The number of published papers using Altmetrics ‘attention scores’ as a data source to measure impact is rising. According to Google Scholar, there are over 28,000 papers mentioning Altmetrics and impact.
This latest analysis published in PeerJ finds a positive correlation between citation rates and the Altmetric score for papers published in ecology & conservation journals over a 10-year period (2005–2015). The implication: the more a paper gets tweeted, blogged, or talked about in online popular media, the more it will be cited.
This seems like common sense. The more exposure a paper gets online, compared to traditional exposure via journal alerts to a limited number of subscribers, the more people will be aware of it and potentially cite it. This is why we do scicomm. (Although, hopefully, people read a paper first and decide on its quality and relevance before citing.)
But the increasing excitement over Altmetrics as a data source is a little concerning. Sure, it’s easily accessible open data that can be used to answer multiple questions. But no index is perfect. Each one is limited by the way it’s measured, how it’s used and the quality of data it’s based on. This applies to all ‘open’ data sources.
The limitations of Altmetrics only came to my attention recently. It’s a cool index and it gives a little insight into a paper’s social media reach. But…
Last year I co-authored this paper on the value of blogs for ecology academics – incidentally, our key argument is about how influential blogs can be for the science community, which is also one of the key findings in the new PeerJ paper.
Now, I write a blog post for nearly every paper I publish, so of course I did so for this one, as did most of my co-authors. A few weeks later, I decided to have a look at the Altmetrics index of our paper to see how it was tracking. I was surprised to see that my own blog post wasn’t listed in the ‘Blogs’ section (or any other section), even though the blog posts of most of my co-authors were.
Intriguing. My own blog post on my own paper on blogging wasn’t registering at Altmetrics. That was kind of weird.
I had a look through their FAQs, without much insight. I had linked correctly, according to their instructions, and my blog had been an established website for nearly 10 years, and fit all their criteria for indexing…so I wasn’t sure why my post hadn’t shown up. I went back and had a look at the Altmetrics of some of my older papers – my first published paper was in 2010 and I started blogging in 2009. None of the Altmetrics scores of my papers included the blog posts I had written about my own papers. For example, this paper from 2013 includes an article I wrote for Grist about the research, but not the separate post I published on my own blog.
So I wrote to Altmetrics to query why my blog posts were missing from my own papers’ scores. I received a reply with a link to a Google form to submit ‘missed mentions’. The form only allows for one specific mention per submission. So I submitted a few separate forms, not just for the blogging paper, but also for a few older ones.
A few weeks later, I checked again. Lo and behold, my blog post on my blogging paper was now listed. Not in the Blogs tab, where you would expect to find it, but under the ‘Misc.’ tab…along with my other blog posts for completely irrelevant papers that I had also filed ‘missing’ reports for.
I wrote back again to try and correct this. The reply from Altmetrics said: “Our blog tracking is not retroactive. We’ll only be collecting blog posts and putting them on the blogs tab if they are published after we have indexed it.” Fine. But they didn’t explain why my irrelevant blogs had been linked to this particular paper’s Altmetric score.
So now all three of the blog posts I submitted as ‘missing’ (both relevant and irrelevant) are still listed under the Miscellaneous tab of our RSOS blogging paper. (Yet most papers that analyse Altmetrics don’t include the Miscellaneous mentions in their analysis…)
Which brings me back to using Altmetrics as a data source.
I had a look at my own papers. Out of 22 papers published in peer-reviewed indexed journals, only 9 include an Altmetrics score on the webpage. For those 9, there is a weak (and statistically non-significant) negative correlation between Altmetrics & citations (r = -0.18, p = 0.63; I ran a mixed model to account for year published, and the effect became weaker). Disclaimer: my data is not representative of the broader effect, I'm just illustrating.
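For anyone curious about running this kind of check on their own publication list, here's a minimal sketch of the calculation. The numbers below are invented for illustration only; they are not the data behind the r = -0.18 quoted above.

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical Altmetric scores and citation counts for nine papers.
altmetric = [12, 3, 45, 7, 1, 20, 5, 2, 9]
citations = [10, 25, 4, 18, 30, 8, 15, 22, 12]

print(f"r = {pearson_r(altmetric, citations):.2f}")
```

With a sample this small, even a moderate r is rarely statistically significant, which is part of why the disclaimer above matters.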
Altmetrics scores, just like most other freely available data sources online, have huge caveats. They don’t pick up all share sources, and they don’t measure quality of output…which raises questions about their representativeness and use-by date.
Also, not all journals use them. For example, PlumX seems to be the latest analytics measure replacing Altmetrics; some of the higher-impact journals are already using it, like The Lancet & PLOS. So how do published analyses of Altmetrics contribute to long-term knowledge?
Yes we should be sharing research papers via social media, regardless of how much ‘attention’ it gets. Let’s not wait for Altmetrics scores to inspire us to share our research! But ambiguous social media scores tell us very little about impact. Only time will tell the quality and relevance of a research paper.
© Manu Saunders 2018
Agree completely that altmetrics is useful but far from perfect. I mostly like it because it lets me find most of the media and social media mentions of the article all in one place, rather than as a metric. Personally I think the main limitation to the score is that there is no open list of the sources, so we can't really know what is tracked and what isn't, beyond the big media they list on their site. The other big one is that controversial papers get high scores even if they are bad papers (but that is similar to citations in some ways).
Two other limitations you mention though I am not so sure about. To be fair, they have said previously that personal blogs are generally only listed if someone suggests them via their source form. So it's not that surprising that the blog posts weren't turning up previously. A bit of a stuff up that they added all three blog posts to the one paper (and you would think they would rush to correct that; I can see they added one of those blogs to the correct paper too, just to confuse the matter even more), but I guess it in some way validates using the "misc" category, which doesn't count towards the score, for entries that are added by a human at a later date.
The second one is that it doesn't matter if the journal itself is using altmetrics; you can still get the altmetrics score for your paper in those journals, e.g. PLOS https://www.altmetric.com/details/9522558 . That also means that most of your 22 articles could be added to the figure (unless they don't have a doi with metatags, but most journals do).
Thanks Lachlan. Yes, I agree Altmetrics is a nice indicator of social media mentions etc. Here I was talking about whether it's robust enough to warrant the increasing number of publications using it as a data source to analyse impact.
What’s the relationship between Altmetrics and the Citations/h-index/i10-index which appear on Google Scholar? Has any study been done comparing these with one another? I know those are used in academic evaluations …
I'm not aware of a study comparing them, but Altmetrics aren't used to calculate h-index etc.