I failed to get a Social Sciences and Humanities Research Council insight grant this year. Disappointing, but not surprising since nearly 80 percent of applicants came up short. Academics often experience failure in publishing and grantsmanship. It’s part of the job. In this case, though, it’s how I failed, and what this says about accountability and peer review at SSHRC, that warrants consideration.
Like others, I found the SSHRC application process to be especially protracted. The application requires long and short versions of the research proposal, plus pages of forms and many other components. Final page count for the entire application: 40. Total hours invested: 70 to 80.
Early on, I often asked myself, why do this? Since my last SSHRC research grant, I had supported my work with funds from visiting fellowships and internal research grants. None of these required 70 hours to apply. My graduate students are funded from faculty sources, not supervisor research grants. A rational move would be to skip the whole exercise.
I didn’t, for a couple of reasons. My university, like others, has moved to link internal research funding to SSHRC/Tri-Council applications. Don’t apply for an SSHRC grant and you are not eligible for some pots of intra-university funding.
More important, my proposed research was labour intensive: a historical account of cigarette smoking in youth media after 1945. This would mean examining films, TV shows, comic books and teen magazines. It would require research assistants and funding for archival visits. Tobacco companies, involved in litigation, had for years hired university historians to do similar work. It was time, I thought, for a non-aligned scholar to do so too.
SSHRC seemed the best option, so I spent much of August and September dealing with training plans, budget justifications, and a capricious online application system.
The following April I learned that my proposal was unsuccessful. Six weeks later, SSHRC sent me a package with two reports from anonymous external reviewers, most likely experts in tobacco history. Both offered unqualified praise and support for the project (25 of 26 scores were in the highest possible category). I couldn’t recall getting a more positive set of peer reviews in my 15 years in academe.
On another page was my score from the SSHRC selection committee, which was in the bottom 35 percent of applicant scores. The committee as a whole did not produce this score. Instead, SSHRC uses a “triage” method to adjudicate files. Initially, each application goes to two committee members. If at this stage one’s combined tally falls within the bottom 35 percent of scores, then it’s effectively game over; the application does not go to the entire committee.
Incredibly, no comments are provided to explain this score, neither from the two relevant committee members nor from anyone else at SSHRC. Weak publishing record? Budget inflated? Project too ambitious? I have no idea. The policy is for no written feedback to be sent to the “Group of 35 Percent.”
I contacted a SSHRC program officer to ask about this, to see whether I could at least be sent notes from the two committee members who had read my application. I received this response: “In terms of committee members’ notes, we ask them to destroy them as soon as the meeting is over. The conflict of interest sheet that they sign does tell them not to discuss any applications or keep any notes. We do not keep preliminary notes.”
What about the discrepancy between the external reviews and my low score? It happens, I was told. Might an unreasonably low score from a single committee member who is not required to provide any written rationale effectively scupper an application? This was unlikely since “committee members take their responsibilities seriously.”
My application had identified historians at four Canadian universities who worked for tobacco companies, some serving as expert witnesses in court. Two members of the SSHRC committee were departmental colleagues of these historians. I advised SSHRC of this before the evaluation, asking if conflict-of-interest provisions prevented them from assessing my file. I was told after the competition that conflict of interest did not factor in the assessment of my application.
That may be true. But how can I really know? SSHRC policy prevents the identification of committee members who read particular files at the triage stage. They, in turn, are not required to provide written justifications for low-score assessments. Whether they consider the external reviews at all is optional.
SSHRC trumpets its commitment to the peer-review process, but it’s a bizarre form of peer review. Imagine a peer-reviewed journal whose editor solicited reviews from experts which came back highly positive. The editor then rejected the paper outright without explanation to the author. If done routinely, how long would this journal remain credible?
What about the SSHRC appeal process? For that, one must show that procedural errors occurred during adjudication. But SSHRC’s procedures permit the very things presented here as most problematic: triage evaluation, take-it-or-leave-it use of external reviews, and no comments for nearly four in 10 applicants.
What’s the solution? SSHRC sees its main problems as insufficient research funds and too many applicants. The latter necessitates triage assessment and the forgoing of written feedback to many. Surely better options exist. The number of applicants could be reduced, for example, by having those with A-to-M surnames apply in even-numbered years. Or a “first cut” of grant applicants could occur at the university or college level, as exists with the SSHRC doctoral award program.
More substantively, Ottawa could adopt a variant of what happens with the Ontario Graduate Scholarship program: SSHRC could provide block funding to universities and colleges and have them administer competitions for certain grants. SSHRC would still set research funding priorities. But instead of adjudicating grant competitions, it would, in three- or five-year intervals, evaluate the research output of grant holders and their institutions. This would require fewer bureaucratic resources than the current system, freeing up money for under-funded research programs.
On its website, SSHRC promotes its adherence to “merit review”, a “transparent, in-depth and effective way to allocate public research funds.” If my experience with this “merit review” process is commonly shared, then it may be time to reassess SSHRC’s mandate and operating practices.
Daniel Robinson is a historian and associate professor in the media studies program, faculty of information and media studies, at Western University.