Metrics can’t replace expert judgment in science assessments, says new report
Findings will be used by NSERC to devise a new allocation system for its Discovery Grants Program.
Quantitative indicators such as publication numbers and citation counts can inform decisions about allocating research funding, but they can’t replace expert judgment, says a new report (PDF) by the Council of Canadian Academies released on July 5.
The report, entitled Informing Research Choices: Indicators and Judgment, was produced by the Expert Panel on Science Performance and Research Funding, which was convened following a 2010 request to the CCA by the federal minister of industry on behalf of the Natural Sciences and Engineering Research Council. The 16-member panel was asked to consider the question, “What do the scientific evidence and the approaches used by other funding agencies globally have to offer, in terms of performance indicators and related best practice in the context of research in the natural sciences and engineering, carried out at universities, colleges and polytechnics?”
NSERC spends approximately $1 billion a year on scientific research, with about one-third of that going directly to support basic research through its Discovery Grants Program. According to a statement on the NSERC website, the information in the report will be used to compare “overall levels of excellence across disciplines according to international best practices,” with the goal of devising a new budget allocation methodology for the Discovery Grants Program.
“How funding is allocated is always contentious, that goes without saying,” said Max Blouw, a member of the panel and president of Wilfrid Laurier University. “There’s a real interest by governments and government agencies – and I don’t think it’s just Canada, it’s worldwide – to have some sense of when they make an investment in discovery research, is there a way of evaluating a probability of a good return on that investment?”
There are many quantitative indicators to assess science performance, the report notes. These include not just bibliometric indicators like publications and citation counts, but also patterns and trends in grant applications and research funding, measures of “esteem” such as academic honours and awards, and a host of new internet-based metrics such as the number of article downloads.
These indicators and assessment approaches “are sufficiently robust” to be used to assess science performance in the aggregate at the national level, the report concludes. However, these indicators “should be used to inform rather than replace expert judgment,” which the expert panel says remains “invaluable.” Examples of expert judgment include peer review and other “deliberative” methods, says the report.
“To use a simple metric and make your funding decisions on that would be really far too simplistic,” said Dr. Blouw. “It’s just not a realistic strategy to say, ‘This is an area in which Canadians are publishing heavily and extensively, ergo we must be great at it, therefore we must fund it some more.’”
The research community, he said, “needs to understand that indicators are just that. They have to be used in a very deliberate, thoughtful and informed fashion, and always with expert judgment.”
The key here is transparency, said Rita Colwell, Distinguished University Professor at the University of Maryland and chair of the panel. With many countries facing economic troubles, “the question arises, do you make investments in discovery research, which is sometimes not understood by some and therefore interpreted as frivolous? Then to make the investment, the process has to be clear and the context in which this investment is made also needs to be transparent to the public.”
Dr. Colwell said it’s “to NSERC’s credit that it wishes to ensure that the process it follows – which is very likely contentious when you’re establishing priorities – is understood and accepted by scientists, the public, as well as the minister of science.”