Peering behind the curtain of peer review

Academic Michèle Lamont shares her insights on the peer-review process after gaining rare access to the behind-the-scenes deliberations of several multidisciplinary review panels

BY PEGGY BERKOWITZ | AUG 04 2009

Peer review is a world that is not fully understood – certainly not by people outside the process and not entirely by those who deliberate on panels to decide what research and which researchers deserve funding or promotion. That is why Michèle Lamont, a Canadian sociologist who holds an endowed chair at Harvard University, turned her attention to peer review in a new book, How Professors Think: Inside the Curious World of Academic Judgment (Harvard, 2009).

“It’s a somewhat secretive world,” says Dr. Lamont in an interview with University Affairs. “Only those who deliberate really know how it works. And the question of standards – how people think of standards, how they maintain them, whether they think the system is meritocratic – this is a really interesting question sociologically.”

Although Dr. Lamont’s recent research has focused on racial and class boundaries in France and the United States, this was not her first foray into the sociology of knowledge. After earning her doctorate in sociology at the Sorbonne (she did her undergraduate and master’s degrees in political studies at the University of Ottawa), Dr. Lamont made waves with her very first paper, entitled “How to Become a Dominant French Philosopher: the Case of Jacques Derrida,” published in the American Journal of Sociology in 1987. The sociology of knowledge “has always remained an interest,” says the native of Gatineau, Quebec.

For her new book, Dr. Lamont studied 12 multidisciplinary panels in five national funding competitions in the United States over a two-year period. She was given rare access to observe three of the panels at work. She also interviewed the panelists, panel chairs and program officers from many different fields in the humanities and social sciences. Six disciplines competing for grant and fellowship funds – history, English, economics, anthropology, political science and philosophy – became the focus of her in-depth study.

History and economics tend to do better in multidisciplinary panels than the other disciplines, says Dr. Lamont: history because it is a huge discipline, historians write well, and they write about topics that people in other fields can understand. There’s also a great deal of consensus among historians about which projects should be funded. Economists, too, have very clear views among themselves about “who is above the line and who is below the line.” Philosophy she calls “a problem discipline” because philosophers don’t believe non-philosophers can evaluate their work. Meanwhile, English, anthropology and political science fare less well in competitions because of disagreements within each discipline about what constitutes excellence.

But the book is about much more than the relative standings of English and economics. Fundamentally, says Dr. Lamont, the book is about fairness and how to achieve it.

“Most panelists believe the [peer review] system works – in part because they themselves do things to make it work,” she explains. “They spend a lot of time reading proposals instead of just following their instinct when it’s time to make arguments. They put a lot of care into making the system work.”

Part of the reason panelists take so much care in evaluating proposals is that they’re highly conscious that their own performance is being assessed by their fellow panelists. In fact, one reason people agree to be a peer reviewer is that it gives them a context in which their status as academics can be confirmed. “They hope to be able to walk out of the room after two days with their status as an academic enhanced rather than their expertise being questioned,” says Dr. Lamont.

She concludes that the act of evaluating isn’t simply a cognitive experience; interaction and emotion also play important roles. During their deliberations, panelists often talk about what’s exciting in the proposals – that’s emotion, she points out.

“A lot of the previous literature on peer review treated the emotion as either irrelevant or corrupting. And my argument is to say, ‘No, actually it’s not irrelevant or corrupting. It’s essential to evaluation.’”

Diversity was another value that Dr. Lamont examined closely. She wanted to know which dimensions of diversity panelists pay attention to when they use diversity as a criterion.

“Overall I found that institutional diversity and also disciplinary diversity are much more important to them than racial or gender diversity. I think something like 15 percent of them emphasize gender and ethnic or racial diversity, while approximately 35 percent use criteria that have to do with disciplinary or institutional diversity.” In judging finalists for important fellowships, for example, reviewers tried hard to include top candidates from universities other than Harvard. This was an important finding, she says, especially since so much debate in American higher education revolves around gender and race.

While the peer-review panels she observed were all in the U.S., Dr. Lamont believes the findings are relevant to Canada, which has a similar system.

“I think these are cultural templates about how to evaluate that are highly institutionalized in North American higher education – in part because the two higher education systems are so highly integrated,” she says. Dr. Lamont, who is also a fellow and program director with the Canadian Institute for Advanced Research, recently chaired a blue-ribbon panel that looked at peer-review practices for the Social Sciences and Humanities Research Council of Canada.

In Canada, SSHRC has in recent years placed more emphasis on asking researchers to share their findings with the public, and even on seeking community input into topics for some research. These are values that Dr. Lamont supports. But, asked whether people from communities outside academe, such as business or health care, should be included on peer-review panels, she drew a line.

“I think that’s a very bad idea,” she says. “The reason why academics are asked to evaluate is because we spend our lives developing a fairly detailed classification system that allows us to know what’s been done already and to determine what is original and what is significant. These are the two main criteria [for funding]. … To the extent that peer review is concerned with evaluating the quality of research, I think experts should do this.”

Dr. Lamont mentions in her book that not all scholars are sold on peer review. Asked whether How Professors Think might go some way toward appeasing those scholars who have grown disillusioned with the peer-review process, she says she hopes it might.

“The argument of the book is not that the system is perfect or it is greatly damaged. It’s more that there are pulls and pushes within the system – some of them more universalistic, some particularistic. … My contribution probably is to describe how complex the system is and how it functions, rather than to come up with a verdict. And I know that many of the people I interviewed think that whatever the faults of the deliberations are, it’s much more desirable than using bibliometric methods that have so many more faults.” It is also better, she adds, than tossing out peer review entirely in favour of some top-down decision-making system.

A blue ribbon for SSHRC

Last fall, the Social Sciences and Humanities Research Council commissioned a “Blue Ribbon Panel” of international experts, chaired by Michèle Lamont, to assess the quality of the council’s peer-review practices.

The panel’s final report, submitted in December, concludes that SSHRC’s peer-review system is “healthy in its fundamentals” and “up to the best practices and highest international standards.”

The panel also offered a series of recommendations to ensure that the system remains sustainable and efficient. Among them: broaden and enrich the pool of expert external assessors, and ensure that all committees are able to assess proposals that extend beyond strict disciplinary boundaries.

The report’s conclusions were drawn from an extensive documentation review, an online survey of members of the humanities and social sciences community, and more than 50 interviews carried out with reviewers, program officers and SSHRC management.

SSHRC President Chad Gaffield said the council’s management team “is giving careful consideration to the recommendations” and will develop an action plan to further strengthen its peer-review system, to be implemented over the next two years.

PUBLISHED BY
Peggy Berkowitz
Peggy Berkowitz is the editor of University Affairs.