Altmetrics – What’s that all about?
August 30, 2013
Users of Symplectic will probably have noticed a little Altmetrics button next to the count of their citations. It will probably look a bit like this –
If you’re lucky and your work has created some Internet and social media activity then your altmetrics button could look more like this
What this means is that a company called Altmetric (www.altmetric.com) has produced what are known as article-level metrics for the work and assigned it a weighted score. The score is based on the number of times the work has been mentioned in tweets, on Facebook, Reddit and Google+, in blogs and in the media, and on the number of times it has been saved or bookmarked in services like Mendeley and CiteULike – although not within ResearchGate, which seems to be the academic social network of choice in 2013. The same metric appears in the Scopus database, and regular readers of LOL will have observed that the creators of Scopus are themselves keen observers of altmetric scores and social media. Don't be concerned if your work didn't score, though: as Altmetric themselves acknowledge, the usual score for a scholarly work is zero, because most academic articles don't generate this sort of interest.
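Altmetric's actual weighting is proprietary, but the general idea – a score built from counts of mentions, with some sources weighted more heavily than others – can be sketched in a few lines. The source names and weights below are purely illustrative assumptions, not Altmetric's real values:

```python
# Illustrative weights only -- Altmetric's real weighting is proprietary.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "tweet": 1.0,
    "facebook": 0.25,
    "reddit": 0.25,
}

def altmetric_style_score(mentions):
    """Sum mention counts weighted by source type (unknown sources score 0)."""
    return sum(SOURCE_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# A dozen tweets, two blog posts and one news story:
print(altmetric_style_score({"tweet": 12, "blog": 2, "news": 1}))  # 30.0
```

The point of the weighting is simply that a news story or blog post is assumed to signal more engagement than a passing tweet – which is also why, as noted above, most papers sit at zero.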
One detects a certain amount of eye-rolling around this intersection of traditional scholarly publishing and social networking, particularly when well-cited articles that represent solid contributions to disciplinary knowledge receive low scores. Still, it casts an interesting light on the question of academic metrics in general, and in some cases may turn up useful new information that is not captured by traditional citation counts. Citation counting as a measure of significance is a phenomenon of the computer age, based on the relatively simple notion that for one work to cite another is a significant act, and that the number of citations can mark the degree of a work's significance. This became apparent in the 1960s, when ground-breaking work by scholars like Watson and Crick in the sciences and Chomsky in the social sciences received large numbers of citations in the Science Citation Index. The approach was extended to measuring citation rates for journals, which led to Journal Citation Reports and the famous Journal Impact Factor. As the expectation grew that academic research could and should produce measurable outputs and results, so did the interest in bibliometric measures, and the last ten years have seen new measures like the h-index (I will write another post about that soon) and a stronger interest in actions that can be taken to ensure that articles are not only published but cited as well.
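For readers who haven't met the h-index yet (a fuller post is promised above), the definition is short: a researcher has an h-index of h if h of their papers have been cited at least h times each. A minimal sketch of the calculation, using invented citation counts for illustration:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Five papers with 10, 8, 5, 4 and 3 citations: four papers
# have at least 4 citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```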
One of the problems with the standard bibliometric approach has been that it measures only a limited number of markers in a limited number of places. Before the advent of Google Scholar, the citations of an article could really only be found if it was published in a journal indexed by Web of Science or Scopus. Web of Science in particular was quite restrictive in the number of titles it covered, and although citations of work published in other journals, or of books and other types of publication, could be found, this was a cumbersome manual process. To some extent Google Scholar overcame this limitation, but at the cost of delivering inflated counts (more on this to follow as well). Another problem with straight citation counting is that every citation carries an equal value; by extending the boundaries within which citations were counted, Google Scholar could give apparent significance to being cited in a paper that may itself have been rejected for publication (but still existed on the Internet), or even in a student essay. In parallel, even within traditional journals, "gaming" of citations by canny journal editors seeking to lift their Impact Factors was becoming common.
Another complaint about citation-based bibliometrics is that it looks at only one specific measure in any case: citation of a paper by an article in a journal that is then indexed by one of a handful of databases. In some disciplines this happens less often than in others. It is standard practice in many of the sciences to "cite everything that moves", or at least every article of possible relevance, whereas in literary studies, for example, relatively few journal articles are written, and they tend to consist of direct examination of literary texts, with little requirement for citing the works of other authors. There's nothing "wrong" with that paradigm; it's just different. In many fields, distinguished discipline-leading scholars have Scopus or Web of Science h-indices that are, or appear to be, shaded into insignificance by those of youngish scientists. Citation moves at a relatively slower pace in the humanities, and in the social sciences as well, and it can sometimes take several years before an article is "discovered" and begins to attract interest. And, of course, in these disciplines the publishing of books and book chapters is much more common, and these do not generally feed readily into the journal-centric h-index. In the creative arts, scholarly publishing per se is relatively unimportant, but there is a robust tradition of "serious" magazines and, more recently, blogging.
Over recent years other information about the impact of published work has become available, and these days publishers often let authors know how often their articles have been downloaded. On top of this, social networking has given readers a chance to enter the game themselves by tagging articles in Reddit or CiteULike, saving them in Mendeley, tweeting them or posting them on Facebook, or mentioning them in blog posts. Downloads are a tricky issue that I won't go into here, but social activities are all suggestive of engagement or influence, and could be seen as indicators that a particular work has made contact with an audience – perhaps even as early indicators of likely citation activity. In an article in the Journal of Medical Internet Research called "Can Tweets Predict Citations?", Gunther Eysenbach found that "highly tweeted articles were 11 times more likely to be highly cited than less-tweeted articles", but he also concluded that "tweetations" (some of you might want to take a walk in the fresh air for a minute) were not a substitute for citations as a proxy for scholarly merit, but instead "should be primarily seen as a metric for social impact and knowledge translation". There is a real difficulty here in disentangling the abstract idea of scholarly merit from the more easily observed phenomenon of popularity, but this problem predates the twitterverse, and the recent case of academic fraud by social psychologist Diederik Stapel is evidence that the race for citations has probably been distorting both research direction and research practice itself for many years.
So, should Massey researchers be concerned that their altmetric score in Symplectic is zero? The short answer is no: a score of zero is totally normal, and does not mean that the work will not be cited and well regarded within the appropriate community of scholars. On the other hand, a higher score should be cause for satisfaction, and it's always pleasing to know that work is being read, or at least noticed – not all the tweeters will have read much beyond the title, saving articles to Mendeley may represent either good intentions or display behaviour, and Facebooking academic articles will not always be done for a serious purpose. Whatever the reason, though, a positive number is inevitably pleasing. The real problem arises when the citation count remains at zero while the altmetric score is relatively high – can anything be made of this in terms of describing the impact of the work? Here I'm going to stick my neck out and say maybe. A lot of tweets may in itself represent only a sort of Bieberian accident, exciting but transient and difficult to explain. If a case is to be made for the use of altmetrics as evidence of social engagement, then just throwing out a number probably isn't going to do very much; instead you need to look at the blogs and media articles that mention your work, to see who wrote them and what they say. Finding that information could shed interesting new light back on your work and on what it means beyond the confines of academia.
The next question is how to get your altmetric figure above zero, but this posting has gone on long enough, so that can wait for another time. In the meantime, please retweet.
Bruce White
eResearch Librarian
eResearch on Library Out Loud