It should come as no surprise that I am a strong advocate of knowledge translation. While this has customarily meant making science accessible to people who are not experts in one's field of study but are otherwise important supporters of one's work, translating research across language barriers even within a field is an equally important pursuit. Indeed, while most high-impact scientific journals today are published in English, I shudder to think how much excellent science is being published in other languages to which I have absolutely no access. How often has a cursory internet search in your primary discipline pulled up relevant manuscripts in Spanish, French, German or Italian, which you have all but ignored?
While the humanities have done a far better job of translating their most important work across many different languages, a great deal of relevant research inevitably slips through the cracks. This is why I was so enthralled the other day while reading through this manuscript from Kaufman et al. (1965), when I came across the paper summary, an excerpt of which I have included here:
What struck me was the inclusion of a summary in Interlingua, which I had no trouble comprehending despite its not being written in any of the languages in which I have even cursory fluency. Interlingua, which I later discovered was developed as an international auxiliary language (IAL) between 1937 and 1951 by the International Auxiliary Language Association (IALA), is the most widely used naturalistic IAL that I had never heard of, and the second most widely used IAL after Esperanto. Developed to combine a simple, mostly regular grammar with vocabulary common to the widest possible range of languages, written Interlingua is comprehensible to the hundreds of millions of people who speak a Romance language.
While including an Interlingua summary at the end of every published work is no longer common practice (if it is done at all today), society would be well served by investing in an application that can translate published work into Interlingua, and journals should host Interlingua translations of their manuscripts online to better disseminate research worldwide. The better understanding of the natural world is, above all else, a collaborative venture, and I for one would certainly appreciate more access to my colleagues' work.
Earlier this month, I was gobsmacked when a colleague told me of their paper's afternoon journey from submission to acceptance in a peer-reviewed journal. Not only was this a lightning-fast acceptance, but it was the paper's first submission, i.e., it had never been through peer review. It was received by the editor, read by the editor and accepted by the editor, all within a four-hour time frame, and now it's online as a Letter to the Editor.
Just a letter to the editor
People make the argument that this report is merely a letter to the editor and therefore is simply a commentary or "useful tidbit" for the scientific community. However, this particular paper has two data figures, one of which has extensive cellular characterization that certainly represents data which could (in theory at least) be unfavourably received or criticized in peer review. So, in effect, data now sits in black and white on the website of an important "peer-reviewed" journal without having been peer reviewed by anyone other than the editor.
Still counts for the impact factor
The really alarming thing about this practice is that it will contribute to the artificial inflation of this journal’s impact factor. Letters to the Editor do not contribute to the denominator of the impact factor, but citations they receive can contribute to the numerator – it’s basically like giving away free impact factor points. One wonders if the speedy acceptance had anything to do with a world expert in a field at the University of Cambridge reporting a useful tool for others in the field – d’ya think? To me, this represents complete and utter professional misconduct – the editor should be ashamed to put his/her name to the journal’s editorial board.
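To see why this is "free impact factor points," it helps to walk through the arithmetic. The following is a toy illustration with entirely invented numbers (the function and figures are mine, not the journal's): citations to letters count in the impact factor's numerator, but letters are excluded from the denominator of citable items.

```python
# Toy illustration (all numbers invented) of how front matter can
# inflate a journal impact factor (IF).
# IF = citations this year to items from the previous two years,
# divided by the number of "citable" items (articles, reviews)
# published in those two years.

def impact_factor(article_citations, letter_citations, citable_items):
    # Letters to the Editor are excluded from the denominator,
    # but citations to them still count in the numerator.
    return (article_citations + letter_citations) / citable_items

base = impact_factor(article_citations=500, letter_citations=0,
                     citable_items=100)
with_letters = impact_factor(article_citations=500, letter_citations=50,
                             citable_items=100)

print(base)          # 5.0 -- from peer-reviewed articles alone
print(with_letters)  # 5.5 -- same denominator, free extra points
```

In this invented scenario, fifty citations to un-reviewed letters raise the journal's impact factor from 5.0 to 5.5 at no cost to the denominator.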
So, what can be done?
Aside from the massive overhaul of the peer review and academic publishing system that Jonathan and I regularly bang on about, some simple steps can be taken: 1) Thomson could reform its impact factor calculation so that any item that can accrue citations also counts in the denominator; 2) professional editors in academic publishing could establish a professional ethics standard that discourages anything in a peer-reviewed journal from escaping proper peer review.
Perhaps this is a one-off exception and this is the only journal that permits such practices. Perhaps, but considering I’m still at a relatively junior career stage and I’ve heard about this, I’m willing to bet our readers can share a story or two about similar practices in their own field.
At the end of the day, scientists need to step up and demand better from academic journals. If the industry is going to be run by professionals, then the least we can do is demand better standards.
Biomedical research at academic institutions is mostly funded by federal agencies such as the National Institutes of Health (NIH) and National Science Foundation (NSF) in the United States, and the Canadian Institutes of Health Research (CIHR) and National Research Council (NRC) in Canada that are themselves supported entirely by taxpayer dollars. While scientists are required to justify their research programs to a committee of peers in order to secure grants, few efforts are made to communicate this research to the lay public that funds them. The result has been an increasing disconnect between basic science and its benefit to society.
Most people have a very limited grasp of the state of the art in any given research discipline and the advancements in that field that they help support (see, for example, this editorial in Nature Chemical Biology and a similar opinion piece on science and democracy in The Scientist). Not surprisingly, this has had long-lasting implications for how society approaches both short- and long-term policy decisions, and affects the state of science funding in our country.
“There are good reasons to regain the respect of the public. At the end of the day it is the taxpayer who elects the government that controls the purse strings and, consequently, the direction that science takes.”
- Who is Directing Science? Sarah Tilley, 2002.
One solution I propose is for publicly funded research groups, as a major stipulation of their grant, to produce two- to three-minute videos every five years that summarize their research programs. These should be published in open-access journals or on the websites of academic departments, universities, public health organizations, or granting agencies, where they can be accessed freely by the general public. Indeed, a major limitation scientists have faced in communicating basic research to the lay public has been a tendency to ramble, provide too much detail, use technical jargon that is meaningless to non-specialists (and oftentimes other scientists), and lead or follow their descriptions with multiple, often unnecessary, caveats that undermine the credibility of the work and diminish the listener's interest.
While efforts such as the American Society for Cell Biology’s (ASCB) Elevator Speech Contest at the society’s 2012 annual meeting are a step in the right direction (Communication: Two minutes to impress), more needs to be done to disseminate this understanding to the larger public. Short descriptors of publicly funded research projects are useful when fielding questions about one’s job, asking for money, talking to politicians, wooing potential collaborators, and even during casual conversations with friends. Scientists have a social responsibility to educate their communities about the importance of the work that community is helping support, and the current state of the art in the field in which this work is being done.
Part of communicating that message is making sure it bridges the divide between basic science and public interest. Animating complex biological processes contextualizes them within their underlying physiology, identifies gaps in our mechanistic understanding, affirms the importance of continued research, provides a bridge between academic scientists and the lay public, and can help promote individual research programs, departments, and institutions. Most importantly, animated videos can justify to taxpayers the considerable costs and time-to-completion inherent to basic research by explaining both the process and ultimately the value (both social and economic) resulting from their investment.
This idea is by no means novel, and was the topic of a NatureJobs column titled “Animating Science” which emphasizes that with the right execution animation promotes science in an engaging, memorable, and concise format. My foray into animating science has been through the support of a project led by Alice Chen (a professional animator) to produce a children’s book titled Lucea Lights Land and Sea that teaches children the basic concepts of bioluminescence, adaptation, and biodiversity in an engaging and entertaining manner.
While I am not proposing that we all write books (although who better suited to engage children in science than scientists themselves?), academic researchers could traverse the present disconnect between their own work and the lay public’s understanding of it by producing short videos describing their scientific projects and affirming the importance of continued biomedical research.
To show that (and how) it can be done, I have produced a video series on the topic of platelet production (my personal research interest). To demonstrate that video summaries of basic research projects are of value not just to the general public but to scientists at large (e.g. as a teaching tool), and to confirm that there are concrete academic/career benefits to the researcher in producing similar animations (e.g. self-promotion, recognition amongst peers, publication record), I have published this work in the Science and Society section of the journal Trends in Molecular Medicine. (You can follow this link to view the manuscript, and I encourage you to read it alongside this article.)
Specifically, this manuscript highlights the value of scientists interacting more closely with animators for the purpose of better communicating their research to a wider audience, using platelet production as an example. In it I raise emerging concepts and ideas regarding how scientists can best communicate and translate their basic research to the lay public, other scientists and clinicians, and it can serve as a guideline for generating videos of your own.
As always, I am happy to take your comments, and look forward to continuing the discussion of how scientists can take meaningful steps to address the increasing divide between basic science and public interest below.
Calling all North American funding agencies!
Researcher mobility appears to be a high priority for funding agencies and universities, and it has many advantages for the science community – most importantly the sharing of new ideas and the formation of new networks. Recently, there has been a backlash against the “need to move,” with many scientists doing perfectly well in their current city/institute and relocations being costly in terms of time and money.
Loss of (or reduced return on) pension contributions is one of the many annoyances contributing to the latter. But finally there is a step in the right direction coming from Europe: the RESAVER Pan-European Pension Fund. It won’t launch until next year so I cannot comment on its mechanics nor its financial benefits/drawbacks. But, if it does what it says on the box, it will be an incredible boon for young researchers.
Beyond those for whom shifting pensions would be inconvenient, the larger group that stands to benefit from RESAVER is international postdoctoral fellows. There are a huge number of scientists on temporary contracts in foreign countries for whom it does not make financial sense to start a pension. Indeed, many pensions require multiple years of contributions before any benefit is received. This means that those on short-term contracts would be foolish to sign up for such a plan, yet many contracts in the laboratory sciences are just that: fixed terms of 1-3 years. A plan like RESAVER would permit researchers to leave their home country for an unspecified term and contribute to a pension plan whose benefits would transfer back home without losing out.
Consider this: the median age at PhD completion in the United States is 31.8 years (2012, NSF statistics). Students are unlikely to have sufficient funds to contribute significantly to pension plans, meaning that they are already behind their peers who elect to become teachers, lawyers, accountants, etc. Now, after their training, should one of these recently minted PhD holders decide to move abroad for 3-5 years (or more!) of postdoctoral research, they suddenly find themselves, through no fault of their own, moving back home as a highly trained researcher without a dime in pension savings at the tender age of 37.
An internationally transferable pension plan would alleviate a significant portion of this problem, but pension plans are incredibly difficult to set up from scratch. This is what makes the RESAVER programme even more incredible, since the European Commission has provided funds under its Horizon 2020 programme to help establish the pension fund.
As it currently stands, this style of plan would give European researchers a solid financial reason to keep their academic relocations within Europe so as to retain the benefits of their plans, potentially hampering recruitment efforts by other countries.
For these reasons, I urge Canadian funding agencies and universities to consider such a progressive programme – at the very least within North America, but ideally worldwide. International fellows represent more than 50% of Canada’s young scientific talent and this would be a massive step forward in supporting this cadre of researchers.
Our summer posts had a theme it seems – something we didn’t plan, but which has resulted in a small series of posts on misplaced priorities in academic research. From my post on academic bullying to Jonathan’s on the difficulties resulting from indirect costs levied by universities to our guest blogger Damien on hiring strategies in laboratories. The comments were plentiful and gave us a good indication that these problems (and proposed solutions) need to feature more frequently on the blog.
One of the comments (thanks David!) directed me to something I’d not heard of before: The Polymath Project, where problems in mathematics were crowd-sourced by Tim Gowers. This is a wonderful example of collective problem-solving ability. I wonder if it would have legs in the life sciences… But, as I mentioned in one of my posts this summer, life sciences seems to be focused on individual success/reward structures.
Jonathan and I will of course write much more about these topics in the future. But for now, let’s recap the summer’s posts:
- The math of academic research grant support doesn’t add up
- The honour society: value in social exclusion
- How to build Canada’s science and technology infrastructure
- Scientists should strive to win the world cup, not the golden boot
- Is the academy worse than the fashion industry for “following the leader”?
In last week’s blog post (“How lab managers hire for science“), Damien raised an interesting point regarding best hiring practices for new academic faculty that I felt should be highlighted here. Damien recommends that when screening research-scientist candidates for the lab, principal investigators should “identify individuals who lack skills that a new investigator can provide. A candidate applying to medical school or to grad school will be seeking research skills and recommendations. Eliminate all other candidates who don’t fall into these categories. It’ll save time by eliminating applicants who are seeking alternative limiting resources, like higher pay” (my emphasis).
This is an interesting perspective for our site because Dave and I spend so much time advocating in favor of higher salaries for trainees, not lower. And still, Damien’s advice is not wrong. The reason is as follows, and I am using this year’s R00 National Institutes of Health (NIH) grant as an example:
On a 2014 R00 independent research grant, awarded by the NIH to qualifying recipients upon securing their first independent academic faculty position, total costs are limited to $249,000/year for three years. From this total, the institution takes its indirect costs, at a rate that varies between institutions but at Brigham and Women's Hospital is currently set at ~77% of direct costs. Briefly, yearly costs are calculated as follows:
[total cost] = [direct cost] + [indirect cost] = $249,000.00
[direct cost] = [total cost]/(1 + [indirect cost rate]) = $249,000/1.77 = $140,677.97
[indirect cost] = 0.77*[direct cost] = 0.77*$140,677.97 = $108,322.03
Research budgets for academic scientists are limited to their direct costs. A budget is calculated per year, and includes the following major line items to which I have attached conservative costs based on my own career experiences:
[salary] = $75,000
[fringe benefits] = [fringe benefits rate]*[salary] = 0.35*$75,000.00 = $26,250.00
[publication cost] = [cost of publication]*[# of publications per year] = $5,000.00*2 = $10,000.00
[research cost] = [direct cost] – [salary] – [fringe] – [publication cost] = $140,677.97 – $75,000.00 – $26,250.00 – $10,000.00 = $29,427.97
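The budget arithmetic above can be sketched in a few lines of code. This is purely illustrative: the total-cost cap and indirect rate are the figures quoted in this post, and the salary, fringe, and publication numbers are my own conservative estimates from above, not NIH-mandated values.

```python
# Sketch of the R00 yearly budget arithmetic from this post.
# The cap and indirect rate are as quoted above; salary, fringe,
# and publication costs are the author's illustrative estimates.

total_cost = 249_000.00     # NIH R00 yearly total-cost limit
indirect_rate = 0.77        # institutional indirect cost rate

# The institution's indirect costs come out of the total award.
direct_cost = total_cost / (1 + indirect_rate)
indirect_cost = direct_cost * indirect_rate

salary = 75_000.00
fringe = 0.35 * salary          # 35% fringe benefits rate
publications = 5_000.00 * 2     # ~$5,000 per paper, two papers/year

# What is actually left to do research with:
research_cost = direct_cost - salary - fringe - publications

print(f"direct:   ${direct_cost:,.2f}")    # $140,677.97
print(f"indirect: ${indirect_cost:,.2f}")  # $108,322.03
print(f"research: ${research_cost:,.2f}")  # $29,427.97
```

Running the numbers this way makes the punchline hard to miss: of a $249,000 award, less than $30,000 per year remains for actual research.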
In many American institutions such as mine, salary for academic research faculty is not supported by their departments or institutions, and must therefore be derived entirely from their research grants. To sustain research funding, academic faculty must spend the greater part of their time writing grants. Paylines for academic research grants in the U.S. (the percentile rank up to which applications receive funding) currently average around 12%. As a result, performing actual research becomes nearly impossible without the help of a junior research scientist (trainee).
The NIH has set minimum yearly salaries for trainees for fiscal year 2014 at the following:
[predoctoral] = $22,476.00
[postdoctoral, 0 years of experience] = $42,000.00
Remember that this is base salary and does not include fringe benefits. Despite the fact that salaries for academic research scientists are already low and do not increase with inflation, new faculty cannot afford to hire help without further institutional support. Nevertheless, academic output is measured by the number of peer-reviewed scientific publications, and the expectation for academic faculty at top-tier American research institutes is to publish at least two papers per year.
Institutional support (when it is available) typically takes the form of a one-time start-up package that supplements grant funding for 1-3 years. More often than not, start-up funds are used to purchase necessary equipment, and at best provide new academic faculty a short-term solution to funding their academic research program. New academic faculty are therefore expected to apply for (and receive) continued grant funding while maintaining a successful independent research program without help on less than $30,000/year.
Increasing research salaries, while desirable, is therefore not possible without an accompanying cap on institutional indirect cost rates. Efforts to do this have been met with strong resistance by top-tier American research institutions. The alternative is to increase academic research funding, which is not likely in the near term.
At the end of the day the numbers need to add up, and despite the strongest possible track record of past funding and publications, research plan and drive, the current math of research funding does not support a successful academic career.
We are very pleased this week to introduce a guest post from Damien Wilpitz, an experienced laboratory research manager at Brigham and Women’s Hospital in Boston. Damien is also the founder and manager of Experimental Designs Consulting, a management consulting firm specifically tailored to new academic science faculty. His article this week (hopefully the first of many) focuses on a critical area where young investigators typically drop the ball – new hires.
Most academic life science investigators struggle in the early years of lab start-ups, not because of their science, but because of poor lab management. Much of this mismanagement invariably comes from poorly managed teams. For example, we all know a postdoc, tech or grad student who doesn't seem to be pulling their weight. We have no idea why the principal investigator keeps them around. They haven't produced any data for months and they hardly come in.
Some investigators find it very challenging to manage difficult personalities or to fire people. These lackluster teams are usually put together by investigators who are under stress to produce. Therefore they try to hire quickly and cut corners in the process. This is where mis-hires begin. The costs associated with mis-hires can be staggering.
The age-old saying goes: hire slowly and fire quickly. However, time isn't necessarily on the side of a young lab leader.
I've been a lab manager for the better part of two decades and consult on a regular basis with young investigators. I have found strategies that help ensure a great hire and save precious time.
1. Categorize candidates according to the lab’s strengths. Eliminate the rest.
To quickly filter through a lot of applicants, there should be key metrics or categories for screening applicants that play to the strengths of a new investigator. Identify individuals who lack skills that a new investigator can provide. A candidate applying to medical school or to grad school will be seeking research skills and recommendations. Eliminate all other candidates who don’t fall into these categories. It’ll save time by eliminating applicants who are seeking alternative limiting resources, like higher pay.
2. Utilize technology to save time that would otherwise be wasted on scheduling.
Scheduling those in-person, face-to-face interviews can become a big time suck. Conduct preliminary interviews over the phone or via some form of video conferencing, i.e. Skype or Facetime. It’s more convenient for both the investigator and the candidate. Once you feel comfortable with candidates over a phone/Skype interview, you can then bring in the top two to three for a face-to-face interview.
3. Have an interview/dialogue with all references.
The third and most important strategy for hiring, and the one where spending time actually matters, is the reference check. I find principal investigators aren't spending enough time on this part of the process. Reading a reference letter or email is not enough. It's important to have a dialogue with a prospective candidate's references. This can create a clearer picture of the candidate. Letters and emails cannot convey the same level of intonation and emotion as human dialogue, and a lot of the most important information about a candidate can be inferred by reading between the lines.
I routinely use these strategies to help new and established investigators build strong solid teams. Great scientific teams start from hiring. By cultivating great teams through a purposeful recruitment process, your odds of continued success improve dramatically.
The last four decades have seen a steady increase in the number of authors on scientific publications. Since 1975, when there were on average 1.9 authors per paper, we have seen increases each decade to 3.12, 3.76, 4.61 and finally 5.12 authors per paper in the period 2010-2013. It is clear that science has become a team sport, yet our reward mechanisms are still almost completely based on individual achievements.
Individuals get papers. Individuals get jobs. Individuals even get Nobel prizes. But… individuals don’t do science by themselves.
Many of my colleagues will know that I have a mountain/valley analogy for the authorship list of a life-science paper. The first and last author peaks are given the highest importance and the valley exists in between where the “others” live – higher up on either side is even marginally better.
But what are the consequences of this misaligned reward system?
Ambition trumps science
Everyone who is in a medical science lab (and probably others!) will be familiar with authorship discussions and battles. Horror stories emerge from research groups about "stealing the credit" or choices being made for "political/career reasons." These incidents are not uncommon, but they are also colored by each person's perspective. I often joke with my colleagues that every research paper is actually 200% work, because if you total each author's perceived percentage contribution it would always exceed 100%. But the important first/last places are always claimed by the "leaders," real or perceived, and this raises a whole range of sociological issues for how that is determined.
No reward for helping others
For me, this is by far the biggest problem. Imagine you are in research to reach a particular goal (e.g., cure breast cancer) and you work toward that goal by getting really good at a particular job that is essential to advancing cancer biology. Let’s imagine you develop a technique that lots of people use but one that rarely forms the main thrust of the “publishable story.” You have helped advance cancer research, but your academic career could be at risk (e.g., no first-author papers). If you do this at the postdoctoral stage of your academic career – you’re toast.
The logical career path therefore is to only work on projects that have the potential of being first-author papers. In other words, we actively discourage working together on a project. One possible solution is listing people as equal contributors (also known as joint first authorship), but this does not always work out so well for the second name listed. Indeed, this makes me wonder how things play out with joint first authorship between the sexes – my guess is that men more often lay claim to the pole position of a “joint” first author paper…
Academics could take some lessons from companies
Industry works almost completely differently when it comes to this: a defined goal is set (e.g., develop product X that can do job Y) and a team of people is assigned to achieve it. If a particular person does an outstanding job, they get rewarded (maybe a promotion or a bonus), but you rarely see "wonder drug, designed by Jon." While I can certainly appreciate the unpredictability of scientific research, there are still broad goals. Notably, industry also manages to figure out how to recruit and retain people, which makes me wonder what metrics they use; they probably do not rely on first/last author publications.
So, yes, there is a problem, but how do we solve it? Journals have started to request individual author contribution statements, which is a good thing, but the vast majority of these simply say "did research" or "performed experiments" – hardly informative (or noticeable!) should you be on a hiring committee judging a scientist's merit. A very interesting article in 2007 from PLoS Biology pitched the idea of splitting the impact factor of a paper across the authors by percentage contribution. This would have the added benefit of policing the number of, and credit given to, "ghost" contributors (those who gift a reagent or patient sample that is important to the paper, but really had no involvement in the paper itself).
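A contribution-split scheme like the one pitched in that PLoS Biology article is easy to sketch. The numbers below are entirely invented, and the function is my own toy rendering of the idea, not anything proposed in the article itself: each author's credit is the paper's impact factor scaled by their declared fractional contribution.

```python
# Toy sketch (invented numbers) of splitting a paper's impact-factor
# credit across authors by declared percentage contribution.

def split_credit(impact_factor, contributions):
    """contributions: author -> fractional contribution; must sum to 1."""
    assert abs(sum(contributions.values()) - 1.0) < 1e-9, \
        "contributions must total 100%"
    return {author: impact_factor * share
            for author, share in contributions.items()}

# A hypothetical paper in an impact-factor-10 journal:
credit = split_credit(10.0, {"first": 0.5, "middle": 0.15,
                             "ghost": 0.05, "last": 0.3})
print(credit)  # {'first': 5.0, 'middle': 1.5, 'ghost': 0.5, 'last': 3.0}
```

Under such a scheme, a "ghost" contributor who merely gifted a reagent would walk away with 0.5 impact points rather than full co-author credit, and the requirement that contributions total 100% would put a number on the 200% joke above.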
At the end of the day, the public, via government and charitable funding, provides academics with a huge pot of money to help advance society through ideas, healthcare solutions, and technological breakthroughs. Our current reward system is designed to get a single person working on each problem, and if it's a big problem, many people working separately, with the first to get there taking the reward. Am I saying that an individual cannot have an excellent idea and pursue it through to completion? Absolutely not. But the reality of the situation is that society ends up better off if we work together to win the world cup, not individually to get the golden boot.
Dave published an excellent post last week where he compared the academy to the fashion industry for its general lack of innovation and conformist social exclusion. Today I thought I’d play devil’s advocate to Dave’s very well-received piece, which almost always lands me in trouble. In the interest of staving off the expected torrent of personal attacks on my character, let me begin by stating clearly that the views and opinions expressed in this article are not those of the author and do not necessarily reflect the official policy or position of this or any other organization. Let’s begin.
Earlier this week I received the following email. One of many that typically goes unread, it had one interesting detail that caught my attention and ironically saved this communication from the trash bin.
Dear Dr. Jonathan N Thon,
We have been through your articles and we are enthralled to know about your reputation and commitment in the field of Psychology. We strongly believe that this potential research in the field of psychology would be beneficial to the people working in this field.
International Journal of School and Cognitive Psychology (IJSCP) invite you to submit a mini/full review manuscript on any topic of your choice related to our journal.
You might be interested in our latest issue.
Papers may be submitted from any discipline related to but not limited to Cognitive psychology, Mental process, Memory, learning, problem solving, Neuroscience, Attention, perception etc.
Please let me know if you require any more information. Awaiting for your reply,
Assistant Managing Editor
International Journal of School and Cognitive Psychology
It’s not typically my habit to so blatantly call out individuals or organizations for holding positions with which I disagree, but this correspondence needs to be publicized. Why? For starters I am not, nor have I ever been, a psychologist. I earned my PhD as a biochemist. I have drifted professionally to become a cell (more specifically, platelet) biologist. I maintain an active online presence and even a cursory Google search for my name would reveal that I have never published a paper in a psychology journal and am not qualified to academically review this field.
What's worse is that I'm tempted to believe that were I to submit a mini/full review manuscript to this journal, there is a high probability it would be published. This is a single (albeit telling) example, but it prefaces a larger problem in higher education.
Academic specialties are by definition niche environments where active participants have more than a passing awareness of the other players in their field. This is why, in part, most of us who have ever submitted a manuscript for peer review have a fair idea of who reviewed it, despite having been blinded to our reviewers, and why the community assembles itself into cliques. The best protection against poor or bogus science in an age where just about anyone can publish in what (at least at first glance) may appear as a semi-reputable journal, is to excessively filter our content.
We do this in many ways, and some are more accessible than others. For lay audiences and newcomers to a field, the most reliable way to separate reputable science from the rest is by relying on the journal's reputation. Some journals are better than others. Although plenty of papers are retracted from the top-tier journals we like to cite and love to hate (e.g. Cell, Science, Nature, etc.), the likelihood that the data published in these manuscripts have been properly vetted and are reputable, accurate and impactful is significantly higher than it would be were the work coming from the International Journal of School and Cognitive Psychology (for example).
The other way is to rely on the reputation of the scientist or group. Oftentimes, newcomers will not know who the heavy hitters in their field are and must therefore fall back on the reputation of the institution or (and yes, it happens) the country. Some institutions and some countries have better reputations for scientific integrity and quality research than others. These reputations also vary dramatically by field, but there are a select few institutions in every country that have established strong reputations across the board. They are typically also the largest, best-funded institutions, and the two facts are not unrelated.
Among the active players in a field there is a tendency to view more favourably work coming out of top groups than that of unfamiliar research programs. This is not a bad thing. Top groups have established their reputation for quality research and innovation over many years, often having made paradigm-changing discoveries that have stood the test of time, and there is a reason why all eyes turn to them when a new finding is made.
Is peer review without its faults? No. But allowing anyone to publish anything because they believe it to have merit is not the answer either. The cost of peer review is that a lot of good ideas will not get published or funded because they don’t rise to an ever-greater standard of proof. The alternative is bad science, and that is arguably worse.
I hate to admit this, but I find an incredible number of scientific papers really boring. It seems that more and more, research papers use the same sets of sexy and expensive tools without actually answering the question they set out to explore, and overload their readers with “big data”. It further appears that this is the primary formula for getting published in big journals, and the nasty part of that whole business is that publishing big is controlled by an ever-diminishing fraction of the world’s scientists.
Remember when you were in high school, and there were popular ways to dress and popular places to be? It was difficult for some kids to afford to keep up, while other non-conformists simply opted out of “being popular”. Years later, we look back fondly at the people who didn’t follow along; many of them had a much better sense of self and their own preferences. No matter how hard the popular groups or trends pushed, some people just didn’t buckle, and they emerged many years later as cool people with novel ideas.
My fear is that the academy is subject to the same primitive bullying techniques, resulting in social exclusion as a consequence of breaking rank. The system (unknowingly?) props up the careers of a cadre of researchers who are just really good at following along. The really sad corollary in this age of tight funding is that we lose the non-conformist kids who have the creative ideas of today and tomorrow. Surely universities should be the places that foster new and alternative ideas and approaches, and be immune to such behaviour. Academic bullying is a problem, and it’s squeezing the creativity and lifeblood out of science.
Let me explain how I see this operating. The three things that matter most to a scientist’s career progression are publications, grants, and personal reputation (e.g., the ability to attract the best PhDs and postdocs). All three are determined by a frighteningly small number of people who have the power to socially exclude for their own benefit (e.g., keep an idea out of the mainstream, promote the careers of the people they like, and so on). While they don’t necessarily do this, the power is theirs to wield.
How might this manifest itself? One example is that the experiments requested by reviewers are often expensive and technology-laden, really performable only at the top-flight institutions in the world (kinda like that new watch that everyone “must have”). Dan Tenen, a professor at the Harvard Stem Cell Institute, jokingly refers to these as “Figure 5 – the experiments that the reviewer requested and never mean anything, but had to be done to get published”. While Dan’s lab is in a position to do the experiments and poke fun at the process, this is sadly not the case for the vast majority of research labs. Not only does this process slow down science, but it also forces less-privileged scientists to collaborate with the top dogs, thus reinforcing the circle. If the requested experiment addresses a fundamental flaw in the paper, fine – but I worry that this is not often the case.
Moreover, granting and funding agencies have “go to” people for peer review, and one of the worst things they’ve done recently is to make these panel memberships public before applications are submitted. The “followers” will study these panels, look at what the panelists have published and how they think, and write their applications to meet those criteria. Some people call this good strategic planning; I call it an unfortunate side effect of the need to survive. Again, we risk squeezing out the good novel ideas.
The challenge going forward must therefore be to create a scientific research environment where the pressure to publish falls a distant second to new idea generation and the development of human capital. At this juncture, though, careers depend on papers, so scientists will do what it takes to get published… sadly, this all too often means toeing the party line and not really exploring new ideas.
There are some interesting models out there for how to tackle this, and I’ll be exploring them in future articles. For now, though, ask yourself how representative our current system really is when we so often rely on the judgment of two or three experts chosen by a single journal editor or funding agency…