Australia’s national health research funder has moved to restrict assessment of each applicant to its marquee research competitions to only 10 publications from the last decade.
At most, just 10. Let that sink in.
No need to maintain or submit a Common CV to detail every sliver of every publication, engagement, or talk you were ever even tangentially part of. No journal impacts or H-index. Just 10 publications over 10 years.
What led to this bold leadership move from the National Health and Medical Research Council (NHMRC)? The council clarified: it’s about valuing research quality over quantity. Truly incentivizing research that is rigorous, transparent, and reproducible. Contribution is crucial: applicants must not only explain how each publication contributed to knowledge and quality, but also convey the publication’s impact and its contribution to their leadership.
That movement you now feel is the heavyweights of Canada’s research community shifting uneasily in their (endowed) chairs. What do you lean your external credibility and inner confidence on if not your 220 publications? It may also be the indignation of departments and institutions that for years have predicated their reward structures, celebrations, and identities on doing lots of research.
Yet, this change heralds no doomsday — it’s change the world urgently needs.
NHMRC’s stance reconnects us with a deep truth. For 3,000 years humanity has known that contributions to knowledge are only ever determined by the nature and influence of the knowledge produced. Yet, we are at a near crisis point in handling the volume of new research being produced each day — with chronic shortages of the manuscript and grant reviewers, editors, and mentors able and willing to maintain the very quality on which new and useful knowledge depends.
While we all play a role in the publishing game, some gain more from it than others. Research dissemination has become less about public and epistemological interests than about corporate and private wealth: powerful interests drive researchers and their institutions to publish ever more, ever faster. Profit margins in academic publishing are notoriously lucrative, and while worldwide sales in academic publishing now rival those of the global music and film industries, academic publishing is not only more stable and predictable but arguably more promising in the face of global crises in need of knowledge-based solutions.
Some publishers want even more. An international academic publisher recently introduced a new model of “accelerated peer-review” requiring researchers to pay the publisher $7,000 USD for a five-week rapid review timeline to publication. This doesn’t include open access — that’s $4,800 USD extra.
You bet academic publishers want you to publish as many articles as possible.
What then for Canada? NHMRC’s prioritization of research quality, impact, and leadership stands in stark contrast to what prevails here. Observe submissions for doctoral scholarships in which only applicants with publication numbers in double digits are deemed competitive. See the furtive reliance on publication counts at tired review tables to readily discern whether the team “is productive” or the individual merits the institution’s award. Better still, next time you read a grant or hear a conference keynote introduction kicking off with the bold announcement that Dr. Superlative is a star researcher because they have 250 publications—please, send us a dollar.
Canada’s preoccupation with research quantity must end because:
- It damages researchers. With so many vested interests encouraging academics to publish more, academia overflows with work-related burnout, anxiety, and depression — from graduate students to senior academics. Manifold structural inequities, established before birth in Canada, further curtail the opportunities and contributions of so many diverse researchers and so much diverse research.
- It damages knowledge and society. Working too quickly means research does not receive the time and attention it needs to be the best it can be. Pressures to publish quickly also contribute to questionable research practices — ethically dubious techniques which artificially inflate the merit of work, such as salami slicing, selective reporting, or analytical hacks. Despite the cataclysmic harm these practices do to public trust in research and researchers, startlingly, up to half of researchers in some national studies confess to using such practices. To not prioritize research quality first is incalculably risky to Canadians’ trust in their research and researchers.
- It damages our infrastructure. The Canadian research infrastructure depends on sufficient numbers of skilled researchers having the willingness and capacity to participate in peer review, governance, mentorship, administration, and community engagement. Esteeming quantity of research works against the time and effort needed to make these vital contributions.
Thus, we now call on Canada’s research funders and universities to follow the NHMRC in prioritizing research quality, contribution, and leadership in research practices, and to reduce Canada’s systematic and damaging preoccupation with research quantity.
To foster change researchers can also be cultural influencers:
- Go deep. Openly question the values and practices of administrators and peers when they prioritize or esteem research quantity over quality. Open spaces for reflection and conversation about how prioritizing quantity can damage knowledge, trust, and equity, and unwittingly serves the interests of publishers most.
- Get specific. Stop framing your and others’ contributions or biographies using publication numbers. Being a great dresser is never based on how many clothes you own. Assess others’ performance for funding, hiring, tenure, and promotion more deliberately based on quality, contribution, and leadership. And encourage others to do so too.
- Get skilled. Develop your skills not just in doing research, but in sharing data, community engagement, research impact, and leadership practices. For research to have an impact on and involve our societies and communities in diverse ways, our skill sets must grow accordingly.
- Give yourself permission. The tie between our sense of self and how much research we produce is incredibly hard to break. Systems, practices, and the past steer us incessantly to ground our self-esteem, confidence, and identity in how much research we produce and to chase the next output. On the day you die, know you will care far less about how much you did than about what you did that made the biggest difference.
Having served on three NSERC grant selection committees, I know that while research quantity is considered, quality has prevailed as the top priority. Reflecting this, an NSERC Discovery Grant application includes a description of the five most significant contributions, and only four research contributions from the prior six years are submitted. Throughout the application, it is critical for the applicant to explain how their research is original, important, and influential, thus emphasizing quality.
This Canadian approach already overlaps with the revised Australian system, where the applicant explains up to ten top publications from the prior decade. That list is provided to the external reviewers, but it is unclear whether a more complete publication list is provided to the assessment committee.
For both the Australian committee and reviewers, additional publications can be referred to in the Research Impact section. Further contributions might also be integrated into the research proposal, and possibly within an HQP achievement section.
The Australian revision is intended to encourage external reviewers to focus on research quality and contribution, rather than quantity. That said, I suspect that the assessment committees had already adopted this appropriate emphasis.
The “damaging madness” has long been with us. In the 1990s, members of Canadians for Responsible Research Funding (CARRF) held conferences, addressed government committees, set up web pages, and wrote papers and at least one book. Recently, one of them came out of the woodwork with a piece in FASEB Journal (36, e22158: “When ‘doping’ is OK: the importance not only of basic research, but how it is funded”; doi:10.1096/fj.202101955). This describes some progress in implementing the decades-old proposal of “bicameral review” – a carefully thought-out replacement for conventional peer review. Progress, yes, but oh so slow. Clark and Sousa have given us a nice piece, but they should not believe that the madness will end any time soon.
I found time and again that when I would raise the issue of the “quality” of a publication at Promotion and Tenure meetings, some of the other committee members would be very resistant. It is easy to say: “Joe published six papers in two years”, but very hard to establish that: “Mary’s one paper in a top-notch refereed journal is worth four publications that repeat half-baked ideas.” One argument made against judgements concerning quality has to do with extremely specialized disciplinary topics in the social sciences or the natural sciences. As a sociologist I was often put down for trying to make an argument about the quality (good or bad) of a paper in psychology, economics, geography, or even anthropology. Yet the committees were structured at the “college” level (as opposed to the departmental level) as inter-disciplinary, or at least multi-disciplinary, committees. A colleague told me when I first started at a Canadian research-intensive university NOT to publish, since what I published might embarrass me years later. But that then meant that the decision about tenure was not as straightforward as it might have been had I published the same ideas and research in only slightly different forms in several journals. It is very hard on new faculty without tenure to advise them to strive only for quality in top-notch publications, especially if those journals are clearly based on networks at elite US universities (and not as easy for Canadians to be accepted by).