22 March 2013

Altmetrics and the need for speed

I’ve been playing around with ImpactStory, which I learned about at Science Online and was also featured at the recent AAAS meeting. I like it.

In playing with it, it occurred to me that there is only one reason we need this and tools like it: we need to make decisions relatively fast. Tenure, promotion, and funding decisions are typically trying to evaluate research done in the last couple of years.

If it weren’t for that, I’d wager we’d just use citations to evaluate research contribution. Citations have always been considered a very good measure of how a scientific article has affected its research field. The problem is that citations accumulate too slowly. Altmetrics are starting to tease apart other kinds of impact (such as interest from the general public). For routine academic decisions, though, would you need anything besides citations if it weren’t for the speed issue?

Additional, 25 March 2013: There are a lot of interesting points in this article, but relevant to this post is this graphic about the probability of being cited in physics:

[Graphic from the linked article: probability of a paper being cited as a function of time since publication]
If I understand this right, the X axis shows the time to first citation and the Y axis shows the number of citations. As author Kristina Lerman puts it:

A newly published paper is very quickly forgotten. After a paper is a year old, its chances of getting discovered drop like a rock!

This suggests that traditional citations aren’t as slow and pokey as I might have thought. That first citation seems to be a decent predictor, and a year isn’t that long to wait. I would be interested in knowing whether there are similar analyses for other scientific disciplines.

External links

Stop publishing so much already!
