A surprising report was released last week by Toronto-based Higher Education Strategy Associates. Measuring Academic Research in Canada: Field-Normalized Academic Rankings 2012 uses self-described “alternative metrics” for measuring the research strength of Canadian universities, combining several measures (h-index, research funding, etc.) with what the authors term field normalization in order to remove the biases created by high-citation, money-consuming fields (medicine, physics, etc.). It is well worth a read, and while much further work clearly needs to be done to parse out exactly how to quantify research strength, the introduction of new metrics in science is something we have banged on about over the last several years, and it is very nice to see some of the surprises that ensue from such novel assessment tools.
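For readers unfamiliar with the idea, here is a minimal sketch of how field normalization is commonly done, assuming the simplest approach: divide each department’s raw score by its field’s average, so that 1.0 means “exactly at the field norm.” The institutions, fields, and numbers below are purely illustrative, not taken from the HESA report.

```python
# Hypothetical raw h-index scores per (institution, field).
raw_h_index = {
    ("UnivA", "physics"): 45,
    ("UnivB", "physics"): 38,
    ("UnivA", "philosophy"): 12,
    ("UnivB", "philosophy"): 15,
}

# Hypothetical national averages per field: high-citation fields like
# physics carry much larger raw h-indices than low-citation ones.
field_average = {"physics": 40.0, "philosophy": 13.0}

# Normalize: a score of 1.0 means "exactly at the field norm".
normalized = {
    (univ, field): score / field_average[field]
    for (univ, field), score in raw_h_index.items()
}

for key, value in sorted(normalized.items()):
    print(key, round(value, 2))

# After normalization, a philosophy department scoring 15 (1.15) outranks
# a physics department scoring 38 (0.95), even though its raw number is
# far smaller -- which is exactly the bias the method is meant to remove.
```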
The headline results: UBC comes out number one in both sciences/engineering and social sciences/humanities. Perhaps more surprisingly, Université du Québec à Rimouski ranks 7th overall in sciences/engineering. The study does not include the medical sciences (the authors cite university-affiliated institutions as the source of data-crunching consternation), so there are clearly some missing elements in the overall interpretation of this report. What is abundantly clear, however, is that the current metrics for assessing a university’s research strength deserve to be challenged.
Why do I say that? Firstly, a lot of interesting material falls out of this method of assessing research strength. One example highlighted by the authors is the comparison of UQTR and Laval in sciences/engineering:
Trois-Rivières’ field-normalized H-index scores in NSERC disciplines are slightly better than Laval’s (1.05 vs. 0.99), yet Laval receives field-normalized granting council funding at well over twice UQTR’s rate (1.27 vs. 0.59). It is not at all clear why this happens.
In other words, UQTR appears to generate slightly more field-normalized impact per unit of funding (roughly 1.05/0.59 ≈ 1.8) than Laval (0.99/1.27 ≈ 0.8). If this trend continued over multiple years, it would seem reasonable that UQTR deserves to be funded at a much higher rate in future competitions, if research impact is what the granting councils wish to support.
Finally, while I think the authors lose a little credibility with the sarcasm on which they conclude the report, their words are worth sharing, since the intention of the report is to flag alternative metrics as useful for identifying institutions that are punching above their weight:
… if one can be bothered to look behind the big, ugly institution-wide aggregates that have become the norm in Canadian research metrics, one can find some little clusters of excellence across the country that are deserving of greater recognition.
Perhaps some schools are coasting on an undeserved reputation while others struggle for recognition in the shadow of the U-15? In any case, we will certainly be curious to see the follow-up data in future years.