Earlier this month I read an article by Julia Belluz that ripped into the scientific publishing system. The saddest, and truest, sentiment of the article can be summed up in the following quotation:
“Taxpayers fund a lot of the science that gets done, academics peer review it for free, and then journals charge users ludicrous sums of money to view the finished product.”
This is certainly not the first attack against the publishing process nor the first to encourage open-access publishing. In the remainder of her article, Ms. Belluz focuses on the role that governments can play in getting more scientific research freely and instantly available. In sum, she suggests that government funding agencies (e.g. the United States National Institutes of Health or the Canadian Institutes of Health Research) could refuse to give grants to those scientists who did not publish in open-access journals.
This is a laudable approach, and indeed it is the one being taken bit by bit by funding agencies – the Wellcome Trust in the U.K., for example, has a very robust open-access policy that includes providing grant funding to cover open-access charges. While this will certainly get more research out sooner and without charge, I believe it misses an important aspect of the power dynamic that plagues the scientific publishing process.
The fact is that journals with high impact factors wield enormous power because they hold the key to scientists’ careers – the field has become so obsessed with metrics that it is insufficient to be a good scientist with good ideas and the ability to perform good research. As things stand now, if you want research grants (and in most cases, this means if you want a job), then you need to publish a paper (or several!) with a big-name journal.
So what can scientists do? Well, it turns out scientists are involved in just about every aspect of the publishing power dynamic. First, one needs to understand what’s at stake. Scientists want big-name papers for three main reasons:

1. Grants
2. Jobs
3. Recognition as a good scientist
However, papers in big-name journals do not directly give you grants or jobs, nor are they the only way to be recognized as a good scientist. Other scientists make these decisions, but far too often their judgment is impacted by the glitz and glam of the big-name journals.
Jobs are often won by those doing research that has good institutional fit – they bring a novel technology, a new way of looking at things, or a broad network of excellent former colleagues – but jobs are often lost because the candidate is “not fundable.” The latter is more often than not decided based on where they have published and how a grants panel will view them. So it basically comes down to who can get grants. And who generally decides funding outcomes? Scientists.
I wonder how many grant panels have heard the phrase “the project looks good, but the candidate has only ever published in mid-range journals.” Indeed, I know several scientists who rank applications based on a candidate’s publication record irrespective of how good or bad the project is or how well-resourced the working environment is.
One suggestion: Ban the CV from the grant review process. Rank projects based on the ideas and the ability to carry out the research rather than on whether someone has published in Nature, Cell or Science. This could in turn remove the pressure to publish in big journals. I’ve often wondered how much of this could actually be boiled down to sheer laziness on the part of scientists perusing the literature and reviewing grants – “Which journals should I scan for recent papers? Just the big ones surely…” or “This candidate has published in Nature already, they’ll probably do it again, no need to read the proposal too closely.”
Of course I generalize, and there are many crusaders out there (Michael Eisen, Randy Schekman, Fiona Watt, etc.) pushing to change things, and I mean them no offence. I just wish that more people could feel safe enough to follow their lead. In my own journey to start up a lab, I am under enormous pressure to publish in a big journal (i.e., my open-access PLoS Biology paper doesn’t make the grade, and open-access juggernaut eLife has yet to achieve high-level status despite its many philosophical backers).
So, in sum, scientists in positions of power (peer reviewers, institute directors, funding panel chairs) are the real targets for change. Assess based on research merit, not journal label. Let’s make journals tools of communication, not power brokers of scientific careers.