The Death of Bibliometrics? Truth, Numbers and Things

September 4, 2013

The sad business of the American Academic and Scholarly Research Journal has come to a rather unexpected end. The journal is no more, having disappeared from the web along with its associated Research Center, and every record of it has vanished from Scopus as well. Now only a few blog posts and a single citation in the Malaya Journal of Matematik remain as evidence that it ever existed, or that it was once the site of a short, exciting battle in the War Against Tosh.

If you have no idea what I’m talking about, it’s all here, but I will summarise. Three weeks ago today it was noticed that the Scopus database was indexing a very low-quality journal with the above title. A post (see the link above) appeared on LOL drawing attention to the journal’s defects as a means of provoking a response from Scopus, but instead it drew a threat of legal action from the publishers of AASRJ, followed by some panicky and rather immature behaviour. This resulted in a lengthy update to the original post laying out some core principles of scholarly publishing that were being traduced by both the journal and, by association, the database. This led in turn to a response from Scopus, first defending their quality control and vetting processes and then, a day or two later, admitting that there was room for improvement and that indexing of the title in question had been suspended. So that was it: the AASRJ Affair had come to a satisfactory conclusion, and the only real victims were those genuine scholars who had wasted their time and money, plus a few hopefuls who saw their dreams of being real published researchers (names in Scopus, no less!) disappear like chaff on the wind.

And yet, and yet, there’s something that’s still bugging me about all this. In his first reply Wim Meester indicated that Scopus are working on “metrics and a mechanism to re-evaluate titles covered in Scopus with the possible result to stop covering poor performing titles.” This was, and is, good news, and I am confident that they will take a good look at Jeffrey Beall’s list of predatory titles, at least eight of which they are still indexing. However, what has been itching away at the back of my mind is that I don’t think you can do this with metrics, and unless the other mechanism referred to is something much less metrical and much more like the rat-droppings (sorry, kitchen hygiene) test, then we’re still not out of the woods. Of the eight Scopus journals on the Beall list, the two largest by far, contributing many hundreds of articles to the bibliometric mix already this year, are pharmaceutical, so this is a matter of some concern to us all. What’s in that pill you’re taking, and who said it worked? Just sayin’.

Library Out Loud has no special power to look into the future, but I’m going to take a stab and suggest that coming generations will see our era as the Age of Numbers. This is a great thing, of course, and the science and engineering that underlie our prosperity are constructed on a firm bedrock of mathematics. Similarly, our concerns about climate change and environmental decline are largely derived from, or supported by, numbers. The social sciences derive much of their insight into the way we live from statistics, and in the age of big data business has worked the numbers to figure out that the way to sell us more burgers is to advertise salad. Even the humanities are busy number-crunching, counting the words to tell us what Shakespeare really meant to say. And it’s all good. Really interesting and full of insight. But so dependent have we become on knowing the numbers that we expect to see them everywhere, and rely on them to do everything. Works of art are known to us because of their monetary value, books by their appearance on bestseller lists, bands by their ability to fill stadia or sell downloads. We look straight away at the number of hits a YouTube video has attracted, and people boast about the number of their (Facebook) friends. Rugby players are measured not only by the number of games they have played and the number of tries they have scored, but by the number of “engagements” they have made and the number of times they have tackled or passed the ball per game. And now we have altmetrics, and even LOL posts come with metrics attached. (Retweet! Retweet!) Which brings us right back to academia and the h-index, but we’re still not quite ready for that yet.

Numbers are a really useful way of finding information about the world, but only after we have attached them to things. And we need to do this carefully, because number statements about things only work if the things they count and average are more or less the same thing. Let’s say we have five oranges and five orange tennis balls. We can aggregate the five oranges and make useful statements about them (juicy), and do the same with the five tennis balls (bouncy), but we can’t add them all together to make ten somethings and still extract information. If we call them ten oranges then the result we get is that oranges are less juicy and more bouncy than we thought, but this is clearly nonsense. About all we can say is that oranges and orange tennis balls are both subject to the laws of gravity and reflect orange light, but you’d have to be stupid to confuse the one with the other – the only thing you can do with an orange on a tennis court is eat it between sets.

So what does this have to do with the Scopus database and the late great American Academic and Scholarly Research Journal (and the eight titles on the Beall list still in Scopus)? Well, just this. If you are using the database to aggregate and count journal articles and make statements about them, then anything in the database that is not a journal article causes an error, and the more of them there are the greater the error will be. But, you might say, weren’t those items from the AASRJ really journal articles, even if one of them did begin with the killer line that “it is imperative to exist without innovation”? Hey, parts of that journal article were really really well written, although it was odd that the reference list stopped at the letter P. Or the four page journal article on the importance for businesses of having a data backup plan, that said that, well, it’s really important to have a backup plan. Or the journal article written by four undergraduate students about how important it is for the sake of the environment to recycle. Well, it is. Each of these contributions certainly looked like a journal article, each one had a more or less coherent title, a list of authors, an abstract, an introduction, a lot of text that was quite hard to read and a list of references at the end. Just like any journal article. If it walks like a duck and quacks like a duck it must be a duck, right?

No. To be a duck, it must not only walk like a duck and quack like a duck, it must also swim like a duck, fly like a duck, peck like a duck, eat like a duck, defecate like a duck and, erm, get together with another duck to produce a line of cute little fluffy ducklings paddling frantically along behind. And this can’t be determined by standing off at a great distance, holding your nose and trying to identify the ducks by the use of metrics, you have to get down and dirty in the duck pond. Sorry about that.

So, to be a journal article, a journal article has to do this: it has to identify a gap, however tiny, in our collective knowledge and then use sound and appropriate methods to arrive at reliable new knowledge. In doing this it will cite other documents, both to ascertain the existing state of knowledge (and therefore the gap) and to support its methods and conclusions. There is no other valid research reason for one document to cite another, and similarly there is no valid reason for a journal article to exist except in support of new knowledge. This is not an easy thing to achieve, and nor should it be. There are plenty of other things to do in life, but if you’re writing or publishing journal articles then this is what you have to do; anything else is just pretence and time-wasting. Tough, eh? And while metrics might be a help in determining what is a real journal article and sorting the wheat from the chaff, there can be no substitute for having an actual look at what is going on. It is imperative to exist without (too much) innovation, and this includes innovation that inappropriately substitutes mere numbers for things.

The title of this post, The Death of Bibliometrics, is intentionally melodramatic and tweetable. Hey, I need the numbers, please retweet. But there is a serious point, which is that every piece of junk publishing that is included in our databases, or on Google Scholar, is a parasite, feeding off the good stuff and destroying confidence and trust. In the same way that fake charities not only take money away from good ones but also raise our levels of cynicism about charitable giving, so fake publishing is an attack on scholarship that needs to be vigorously resisted. That was the real problem with the Australian Journal of Basic and Applied Sciences that got me going on this track back in February – a journal that is still indexed by CAB, incidentally, and that therefore still contributes to the Web of Knowledge. By borrowing the research reputation of Australia to add appeal to their title the publishers may have thought they were simply engaging in a harmless piece of commercial spin, but the dilemma here is that the product they are selling is Truth, and one of the properties of Truth is that it is indivisible; it goes right the way through, beginning with the branding, and touches everything. If any of the authors believed they were publishing in an Australian journal they were mistaken, but if they did not believe it was an Australian journal then they went along with the game, a little act of cynicism, and in the end what suffers is Truth.

The whole citational edifice of scholarship is built on millions of tiny acts of trust, each time one article cites another, and although there are probably some very fine articles published in AJBAS the difficulty is that we don’t know for sure which ones to rely on, and by publishing thousands of them on every conceivable subject the editors have shown themselves to be in the business of Numbers, rather than of Truth.
So on second thoughts, maybe I wasn’t being melodramatic enough, and what we are looking at is an early stage in the Death of Scholarship.

Bruce White
eResearch Librarian
eResearch on Library Out Loud
