Over the coming weeks, I’ll be breaking down the fantastic information found in the Canadian Association of Postdoctoral Scholars 2013 Survey of Canadian Postdocs. To start, I thought I would focus on the finding I found most surprising: 53.1% of the 1,830 respondents were either landed immigrants or work permit holders. This is an incredibly high fraction that represents a huge opportunity for Canada, but only if policies and programs are designed to maximize the influx of such talent.
Plenty of non-American talent
Many of Canada’s postdoctoral fellows travel abroad and many find themselves in the United States, but the converse is not as frequent as many people think. Indeed, just 8% of international postdocs are from the U.S. whereas both France (13%) and China (12%) supply higher numbers of international researchers to the Canadian workforce.
When asked why they moved to Canada for research, facilities and resources were chief among the reasons given, showing that Canada has clearly created an excellent research environment. However, without the correct numbers and types of jobs available following this temporary period of research, it is not surprising that many leave the country. Funnily enough, the major challenges cited by international postdocs are not remarkably academic or specialized, but rather “transitioning to life in a new country” and “visa/permit issues” – surely Canada can do a better job of making its talented young people feel more welcome.
You may ask why Canada should invest in these young researchers if they will simply return to their home countries. Again, the CAPS survey sheds light on this issue, showing that only 25% of researchers on work permits and just 3% of immigrant researchers have definite plans to leave Canada. There is a huge opportunity to capture this bright class of motivated young people to drive economic benefit for Canada, but we again do very little to support this permanent relocation.
Where does this leave Canadian researchers?
Jonathan just posted last week about attracting and retaining talented researchers, pointing out both the importance of international experience and the need, in Canada especially, to create jobs for researchers. Those jobs do not have to be academic jobs, but they do have to make the case for staying in – or coming back to – Canada for long-term employment.
As a Canadian-funded postdoctoral fellow working outside the country, I have lamented the lack of connectivity between Canadian funding bodies and institutions. My PhD and postdoctoral training cost CIHR $210,000 in salary alone, and they have done virtually nothing to encourage my return. Indeed, funding agencies, institutions and companies do very little to attract their early career scientists back to Canada (both Jonathan and I can attest to this), for academic and non-academic jobs alike. I think that two main problems exist: 1) a lack of networks, and 2) poor programming for fellows.
When looking for a non-academic post (industry, science writing, consulting, law, etc.), you are much more likely to do so locally. In my own case, a move to industry in one of the Cambridge biotech science parks would be much easier than trying to figure out the lay of the land in Toronto, Montreal or Vancouver. This is mostly because I regularly meet and interact with scientists who are employed by these companies and collaborate with academics at our university.
EMBO, and countries like the UK and Australia, have come up with ideas on how to do this. EMBO created a “Fellows Network” that meets regularly and interacts with academics and non-academics; the U.K. encourages international applicants to its independent funding programs (Career development awards) and Australia ties the latter portion of grant funding to a fellow’s “return to Australia.” As far as I can see, Canada lags in this area and desperately needs to rethink its policies if attracting Canadians to return to work in Canada is a goal.
Overall, Canada needs to support both cohorts of talented researchers in order to capture the best and brightest minds to drive critical and inventive thinking that forms the baseline for discovery and innovation. Creating programs to bring back internationally trained researchers and encouraging Canadian trained international researchers to put down roots are not trivial tasks especially when the people making these decisions are (as described in the CAPS survey) adults “in the middle of their lives, but at the beginning of their careers.”
For the past year, I have been sitting on the publications committee for a society-run journal, and in the journal’s quest to improve its impact factor (IF), it became clear to me that one of the system’s dark secrets is the “window of IF eligibility.” It single-handedly disadvantages journals whose science stands the test of time and favours journals with speedy public relations campaigns.
For those not aware of it, a journal’s IF is based on two numbers for year X:
- The number of times articles published in the two years prior to year X are cited during year X
- The number of citable articles published in the two years prior to year X
The IF is simply the first number divided by the second.
This means that all articles published more than two years before the year being evaluated do not count for anything in the IF world. The IF metric significantly impacts the careers and fundability of scientists, but the practice of only counting two years post-publication is damaging for several reasons.
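To make the mechanics concrete, the calculation above can be sketched in a few lines of Python (the citation and article counts below are hypothetical, purely for illustration):

```python
def impact_factor(citations_in_year_x, citable_articles):
    """Impact factor for year X: citations received during year X to articles
    published in the two prior years, divided by the number of citable
    articles published in those two years."""
    return citations_in_year_x / citable_articles

# Hypothetical journal: 600 citations in 2013 to its 2011-2012 articles,
# of which there were 200 citable ones.
print(impact_factor(600, 200))  # 3.0
```

Note that citations to anything the journal published more than two years before year X never enter either number, which is exactly the problem discussed below.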
No benefit to standing the test of time
Not many people regularly scroll through the table of contents in journals with an IF of 3, so when very good papers are published in low-ranking journals, it sometimes takes a little time for them to get on people’s radar. Most scientists would agree that a paper needs to stand the test of time, and indeed many papers in low IF journals do exactly that, but they do not end up helping the IF of that journal.
A case in point: a 2003 paper in my field published in Experimental Hematology (IF < 3) was cited 6.5 times per year during the eligibility window and an average of 10 times per year since. By contrast, a paper on the same topic in the same year in Nature Immunology (IF > 20) was cited 17 times per year during the eligibility window and an average of 11 times per year subsequently. The papers end up in the same place (i.e., cited about 10 times/yr) but differ dramatically in their contribution to their journal’s IF (17 vs. 6.5).
Bias against poorly promoted journals
Big journals have big public relations teams – they do big press releases, and they carry an already big name. Obviously, articles that get quick publicity will be cited earlier than papers that aren’t read immediately.
Artificial bias toward being trendy
Surely, in the age of reddit, Twitter and things “going viral,” we can appreciate that trendiness does not equate to importance. However, our metrics for evaluating scientists rely almost entirely on how trendy their research is (or at least on how trendy the journal they’ve published in is), and not on how important or high quality it is. This translates into a culture that rewards buzzwords and style over substance.
So, what can we do to improve the situation? Scientists could start picking up the tables of contents of low IF journals (unlikely), or we could read beyond the abstract to see whether the paper we’re citing actually proves a point and hasn’t been published somewhere else (also unlikely). I think the easiest fix is to measure articles (and journals) by average citations per year since publication. I’d love to see what proportion of “high impact” papers crash and burn after the first few years post publication. It would not surprise me if the number was very large.
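The proposed alternative metric is simple enough to sketch; the citation counts below are hypothetical, purely for illustration:

```python
def avg_citations_per_year(citations_by_year):
    """Average citations per year since publication, computed over the
    paper's whole life rather than a fixed two-year window."""
    return sum(citations_by_year) / len(citations_by_year)

# Hypothetical slow-burn paper: cited 2, 5, 12, 11, 10 times in its first
# five years. The two-year IF window would only ever see the first two
# values (2 and 5); the lifetime average captures its later impact.
print(avg_citations_per_year([2, 5, 12, 11, 10]))  # 8.0
```

Averaged over a journal’s articles, this would reward exactly the slow-burn papers that the current two-year window ignores.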
People often ask me what I would do if I were in charge of fellowships for Canadian trainees. In response, I will often slip into my usual refrain of making investment in people the basic tenet of any fellowship program. As it currently stands, the career track for academics artificially selects for those who can tolerate little stability and an incredible amount of career uncertainty. Many believe that a fellowship program that fights against this is dreaming a dream too big, but there is one organization working its tail off to combat the tide, and Canadian funding agencies should be taking notes.
The European Molecular Biology Organization (EMBO) is at the forefront in fellowship program design. Its long-term fellowship program for postdoctoral fellows has the following components:
- useful eligibility rules
- child care allowance
- dependent allowance
- travel allowance
- parental leave
- part-time work policy (such that fellows can do 3 or 4 day weeks and remain funded)
- private pension plan
- Fellow’s Network
And EMBO is not a boutique funding organization (the 2013 Spring competition awarded 100 fellowships), so they’ve clearly figured out a way to invest in a broad set of useful programs for the diverse life situations that young scientists find themselves in. For this post, I want to highlight three of these components that every major trainee funding organization should consider implementing straight away:
Useful eligibility rules:
EMBO is realistic about who it wants to fund and makes no bones about telling those that do not fit the criteria that their application will be thrown into the bin straight away. Statements like: “Applicants must have at least one first author publication accepted in press or published in an international peer reviewed journal at the time of application” make it very obvious that if you don’t have a paper, you will not be getting a fellowship. This cuts down on wasted application time and reduces peer review burden.
Private Pension Plan:
One of the most drilled-home messages of financial advisers is to start pension savings early. This is advisable not only for the Canada Pension Plan (where each year of contributions helps you earn more later), but also for private pension plans, where compound interest relies on an early start. Early career researchers, often located abroad, may not get enrolled in any sort of pension plan until their mid-to-late 30s due to the transience of academic training. EMBO therefore created its own internationally transferable pension plan for its fellows – genius.
Fellow’s Network:
One of the greatest travesties of the Canadian trainee funding system is the lack of connectivity that the funding organizations have with the recipients of their money. While former trainees sometimes remain on mailing lists, that’s about as good as it gets. After funding expires, fellows often drop off the face of the earth. EMBO, it seems, has figured this one out too – their FellowsNet is highly interactive and also appears to be for life (can someone confirm this?). It is not simply about record keeping, but rather about creating a community of like-minded individuals who build lifelong friendships and collaborations.
From my own experience, my hat needs to be tipped in the direction of CIHR for their progressive research allowance policy. This funding allowed me to have decision making power over which conferences I attended and avoided many awkward discussions about where funding would come from. I can see the movement in the right direction at CIHR, and thankfully there are groups like EMBO doing the trail-blazing – all we need to do is follow along.
Today we are very excited to have a guest post from one of Canada’s new Banting Fellows, who has asked to remain anonymous. You may be surprised to read this person’s assessment of Canada’s “Cadillac” award for postdocs. The most challenging question, from our perspective, that our blogger raises is: Are universities buying the fellowships?
One year ago, the Government of Canada launched the Banting Postdoctoral Fellowship to much fanfare. The program aims to “attract and retain” top-tier international talent and position award holders “for success as research leaders of tomorrow.” Despite some initial reviews, there has been little evaluation of how the scheme is faring. My aim here is to provide my own perspective as a life scientist holding a Banting at one of the largest universities in Canada.
The goals of the Banting fellowship are certainly laudable. Foremost, it aims to provide early career scientists with the flexibility and support to establish an independent research career. The trouble with the awards, however, is that they only last for two years. This prevents award holders from establishing a presence as leaders: they cannot apply for tri-council research grants from a two-year, non-faculty position, nor supervise graduate students, since they’ll be out of a job before the students finish! Ultimately, I think the lofty ambitions of the program will go unrealized because of this limited tenure.
Other countries, such as Britain and Germany, have similar mechanisms to recruit the world’s top postdoctoral talent. The difference is that they recognize that becoming a research “leader” means just that, the ability to lead a group of researchers in developing a comprehensive body of work. Top programs in these countries that Canada should be emulating include the Royal Society University Research Fellowships, Advanced Fellowships offered by the UK research councils and charities, and Germany’s Alexander von Humboldt Foundation.
Obviously, offering longer fellowships comes at a cost. One option would be for the tri-councils to halve the number of Bantings and extend the funding to four years, perhaps paired with a reduction in the value of the award. Personally, I care very little about my salary, and I imagine that other researchers who are passionate about their work place monetary gain well beneath it. I would happily be paid half of my current salary if my Banting lasted four years, allowed me to apply for a CIHR Operating Grant or NSERC Discovery Grant, and let me supervise graduate students. A second option would be to create new, longer-term funding schemes, for example by re-allocating funds from other budgets. Such actions would put serious support behind new investigators in Canada and parallel many of the international funding programs mentioned above, which offer both short- and long-term fellowships for candidates of differing experience and achievement.
My second gripe with the Banting fellowships is their definition of “institutional support.” This is very vague on the program website, so what exactly does (or could) it entail? No doubt anecdotally, I am compelled to recount the tale of a friend of mine who is exceptionally successful in his field (physical sciences) and has worked at several of the top institutions in the world. You would expect him to be a prime candidate for a Banting, and indeed he applied for one at one of the best universities in Canada. But because this university has an excellent reputation, it offered him no additional financial support; it felt that its reputation, along with the collaborators there, was sufficient reason for him to come. In the end, he did not receive a Banting, despite being highly qualified with a strong research proposal.
By contrast, my own university has been exceptionally generous with its financial commitment to my research, demonstrating strong support to the Banting committee. Ironically, despite my host institution’s support, there are few staff members I can engage with, especially compared to my colleague’s choice of research environment. While this is a sample size of two, I cannot help but feel suspicious that some universities may be using the offer of “institutional support” to, in effect, “buy” fellowships to raise their profile. My host university has provided no benefits aside from research money, yet I would happily trade some of that cash for the ability to supervise graduate students.
To summarize, while I’m certainly better off for having held a Banting, I can’t see how it is any different from a standard PDF. At my university, it makes no difference whatsoever that I hold the award – all postdocs are treated equally as “non-employees”! It seems to me that all the Government of Canada has done by creating this program is generate two salary tiers for PDFs, without additional benefits. To me, this seems like a huge misdirection of very limited resources by a government so preoccupied with fiscal accountability. The government needs to extend the fellowship duration and work with universities to deliver tangible research benefits if the program is to achieve its purpose and contribute positively to Canada’s growth.
This past weekend, I attended the inaugural meeting of the Canadian Association of Postdoctoral Administrators in Ottawa. As with most inaugural meetings, there was a combination of excitement and confusion but it appeared that the overall theme was one of identifying common ground and working together in the most productive way possible.
The stated aims of CAPA are to share best practices and to promote the environment for successful postdoctoral scholarship and training. The organization is made up of senior administrators and staff from universities and research organizations across Canada that focus on postdoctoral fellow issues. The steering committee currently comprises David Burns (UNB), Graham Carr (Concordia), Richard Fedorak (U of Alberta), Mihaela Harmos (Western), Sue Horton (Waterloo), Martin Kreiswirth (McGill), and Marilyn Mooibroek (Calgary). While not formally involved in the steering committee, postdoctoral fellows are consulted through the Canadian Association of Postdoctoral Scholars via guest status at teleconferences.
Many interesting items arose in the meeting and it would be hard to properly include them all, so I will restrict myself to some of the items that I found most interesting (all topics are found here, please write me if you would like more information):
Survey of stakeholders
Mihaela Harmos presented results from the stakeholder survey run in 2011, to which 34 of 50 institutions responded. There are apparently 8,900 postdoctoral fellows in Canada, 45% of whom are not originally from Canada. Only half of these postdocs have minimum stipends, and just two-thirds have some sort of benefits package available to them. Of those with benefits, approximately 25% pay for 100% of them. Does such inconsistency exist for other professionals in training (e.g., accountants, lawyers, medical doctors)? Readers will know our opinion on this already.
In any event, such surveys will be interesting to monitor in the future to track changes in the quantity and quality of postdoctoral research support in Canada.
Legal status of postdoctoral fellows
We had an informative presentation by Lisa Newton, a lawyer based at Queen’s University, who shared some important points about the legal status of postdoctoral fellows. A major case came out of U of T this year that said postdoctoral fellows were employees of their universities. According to Ms. Newton, provinces look to the Ontario Labour Relations Board for precedent, so this will likely impact future rulings as they crop up.
As Queen’s postdoctoral fellows have recently unionized, Ms. Newton had particularly good insight and listed off some of the key challenges specific to collective bargaining for postdoctoral fellows:
- Job postings (timelines, impact on international recruits)
- Seniority (specializations of postdoctoral fellows are very different)
- Hours of work / overtime
- Postdocs are rarely discussed in university IP policies, whereas faculty members typically are. Generally it is thought that “he who creates, owns” – but what about postdocs?
- Mix of PI-funded and independently funded postdocs complicates collective bargaining
In discussions later on that day, it came up that there are union representatives pressuring postdoctoral fellows at several universities to unionize – have any of our readers experienced this?
NSERC CREATE numbers
As fast as my little pen would move, I scrambled to copy down NSERC’s numbers for its CREATE program. I’ve not seen these presented on their website in such a breakdown, so I thought it would be useful to share.
The vast majority of CREATE grants are for $1.65 million over 6 years and are meant to fund trainees under themed programs of research. CREATE does not fund actual research costs: 80% of the funds go to trainee stipends, with the other 20% for coordination and travel. So, who do they support?
This may well be the topic of another blog post about the CREATE model, which has its benefits and drawbacks. For now, it is interesting to note how these awards stack up against the US National Institutes of Health recommendation from earlier this year, which was to shift the balance away from grant-funded postdoctoral positions in favour of fellowship and training awards. The NIH shows that postdoctoral fellows who obtain merit-based awards (e.g., fellowships) are more likely to gain independence sooner. It would be very interesting to see what comes out of CREATE in terms of time to graduation, publication record, and age at independence for these trainees vs. NSERC’s fellowship/scholarship funded trainees.
On a side note, in another session the topic of transition awards in Canada (e.g., NIH K99/R00 awards) was brought up and it seems that the biggest challenge for these from granting councils is to figure out where the money could come from. It seems they’ve made these awards a priority at the NIH – perhaps our leadership will see them as valuable as well.
Canadian Association of Postdoctoral Scholars (CAPS)
Luckily, the CAPA meeting also meant that many of the CAPS Executive Committee were in town and we took the chance to meet the day before the conference to carve out the key components of that organization. Members were very active in the CAPA meeting drilling home the three primary concerns of Canadian postdocs that the member university representatives agreed on:
- The need for clarity on the status, timeline, and treatment of PDFs at universities and partner institutes.
- More extensive professional development for PDFs (both academic and non-academic).
- Communication and collaboration between CAPS and CAPA and the national granting agencies.
There were several pleas made for more involvement of postdoctoral fellows in establishing policy that affects them (e.g., the NSERC decision to restrict fellowship applications to once per lifetime) and it seemed that the message was well-received, but the proof will be in the pudding as we move forward. Stay tuned for updates on the CAPS website and we’ll continue to give regular updates of advocacy efforts on this site.
I always knew that bad news was released on Fridays in the summer… but last Friday was pretty ridiculous. NSERC has announced that, in order to improve its success rate (just clocked at 7.8% in the most recent competition), it will reduce the number of times an individual can apply for a postdoctoral award from two to one.
… now that your jaw is back in place, let’s look at what really matters. The absolute number of fellowships awarded by NSERC represents how many scholars it supports each year through its program, and no matter how many people are applying, this is the most important number.
Sadly, the last five years have seen NSERC’s funded fellowships drop dramatically (awards / applicants):
- 250 / 1169 (2008)
- 254 / 1220 (2009)
- 286 / 1341 (2010)
- 133 / 1431 (2011)
- 98 / 1254 (2012)
This is unbelievable and it cannot be sugar coated with a letter about streamlining or complaints about increased applicants (just a 7% increase in applicants from 2008 to 2012).
The sad fact is that NSERC is awarding 66% fewer fellowships than at its 2010 peak. As you can imagine, this has had an effect on success rates, but NSERC’s solution is to try to reduce the number of applicants in an effort to bring the rate up and rid itself of sub-10% success rates.
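These claims are easy to verify from the awards/applicants figures listed above; a quick, purely illustrative sketch:

```python
# NSERC PDF awards and applicants by year, as listed above.
data = {2008: (250, 1169), 2009: (254, 1220), 2010: (286, 1341),
        2011: (133, 1431), 2012: (98, 1254)}

# Success rate per year: falls from roughly 21% in 2008 to 7.8% in 2012.
for year, (awards, applicants) in sorted(data.items()):
    print(year, f"{100 * awards / applicants:.1f}%")

drop = 100 * (286 - 98) / 286        # decline in awards from the 2010 peak
growth = 100 * (1254 - 1169) / 1169  # applicant growth, 2008 to 2012
print(f"{drop:.0f}% fewer awards, {growth:.0f}% more applicants")
```

The collapse in the success rate is driven almost entirely by the shrinking numerator (awards), not the modest growth in the denominator (applicants).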
I can’t even begin to explain everything that has run through my mind while writing this post other than repeating that NSERC has completely missed the point.
If you would like your comments to be raised at the next CAPS executive meeting, I strongly encourage you to write them below.
For those who have not yet heard, the CIHR plans to make major changes to their funding mechanism for health scientists. Last week, at the height of summer vacation, the CIHR released a “What CIHR heard” document that summarizes the feedback they received on the proposed changes.
At first, I was simply going to pick out what I thought were the items of highest relevance to early career researchers, but as I perused the document, I noticed something quite interesting. At the end of every survey question, the CIHR lists the groups of researchers (organized by field and seniority) who agree or disagree most strongly with each of the proposed changes, and in every instance of disagreement one category of researchers (out of 12) is always listed: Senior Biomedical (Basic) Researchers. It might be unfair to suggest, but as this represents the most entrenched set of researchers (those who have been in the system the longest and sit within the biggest CIHR research tier), it stands to reason that maybe, just maybe, they are trying to protect the system that keeps them at the top with the most resources.
Below I have pulled out this group’s levels of disagreement with each proposed change:
- Integrated Knowledge Translation (46%)
- Multi-phased competition process (57%)
- Application Focused Review (25%)
- Use of Structured Review Criteria (35%)
- Remote Screening Process (49%)
- College of Reviewers (37%)
Overall, the picture is quite different, with many groups (especially in younger researchers and those in applied research areas) showing much stronger support for the proposed reforms. It seems to me that those who want things to change the most are those who do not have access to as many resources and hope that CIHR’s reforms will help redistribute the wealth.
In the very comprehensive report, numerous ideas, complaints, and comments were provided, some of which I would like to highlight:
Too many eggs in one basket - the proposed structure states that if an investigator has a program grant already, they cannot apply for a second grant (program or project). This prompted many concerns about what happens in the case of poor renewal rates (especially in the transition period from multiple grants to single grants) and whether or not it would stymie collaborative efforts between labs that each already have a program grant (i.e., they would be unable to apply for a joint grant). This is one of the main points we raised in our analysis as well and I think CIHR needs to think hard on the best solution to this concern.
Incentives - some respondents thought that providing incentives for reviewers would encourage more qualified reviewers to partake. These included honoraria, deadline extensions on one’s own applications, and increased value/duration of one’s own grant. Personally, I do not think that incentives are the way to go – it seems to me that the privileges associated with reviewing the country’s best research and the interactions with other reviewers are worth it. However, I could support the penalization of investigators who do not participate in peer review (i.e., ineligibility to apply for future grants, limited access to funding, etc).
Fear of the multi-phased process – one of the items that we most strongly supported was the blinded first stage of project grants, which aims to cull poor proposals even when they come from excellent scientists. If you write a poor proposal, you don’t really deserve to get funded, no matter who you are. It seems that support was not as strong amongst respondents to the CIHR, who registered concerns about the inability to assess proposals anonymously and about increased re-submission of proposals that didn’t make the cut. I wonder where our readers stand?
Many hands make light work - another item that provoked some concern was reviewer fatigue from having more reviewers (5-8) on all applications. These assessments are intended to be quicker, but some respondents believed that spending less time per application would lead to poor decisions. I disagree. To emphasize this point, I will draw an analogy that I think scientists should understand. If presented with the following two sets of data, which one do you have the most confidence in?
- A well-respected, well resourced lab publishes a very detailed study that identifies Protein X as a marker of advanced colon cancer.
- 10 distinct groups from different institutions across the world all identify Protein Y as a marker of advanced colon cancer. The data in each individual study are not as detailed or exhaustive as the first study.
If you have to choose one to add into your diagnostic tool for colon cancer, which do you choose? Does repeatability outweigh the strength of the first study?
This seems to be the reasoning behind increasing the number of reviewers per application. The CIHR is asking for shorter, less exhaustive reviews of a less comprehensive proposal. In the event that one reviewer misses the point or over-hypes an application, their impact is curbed by the additional reviewers. If a grant gets through everybody, then chances are it has some merit. The subsequent in-depth review is meant to clean up the remaining issues and fund (or not) based on merit.
Overall, the CIHR has a lot to think about. The report they produced does not suggest much about what the outcomes will be, but does say that final decisions will be released in the autumn. We’ll stay tuned and be sure to inform our readers of its release.
Earlier this summer, two major reports were released from the U.S. National Institutes of Health and the National Academy of Sciences. On the Science Careers site, Beryl Lieff Benderly offers an excellent, though slightly pessimistic, summary of the reports and their potential implications; it is well worth a read if you’re not willing to wade through their more than 400 combined pages. If you do read the reports outside of your day job, it will likely take you as long as it took me to form opinions on their contents and whether or not they can work in practice. The reports cover much more than what I discuss below, but I’ve tried to pull out some ideas that I think Canadian universities and policy makers would do well to pay attention to.
On a side note, the Canadian Association of Postdoctoral Scholars is looking to collate the opinions of its postdocs (all of those working in Canada and Canadians abroad) to help focus its advocacy efforts on the key issues for early career researchers in Canada. Please visit their website and Facebook page if you would like to share your thoughts (or leave them below).
The three items arising from these reports that I was particularly impressed with are:
- Reduce the time to complete a PhD
- More fellowships, fewer grant-funded PDFs
- Two streams: Scientist and Academic
1. Reduce the time to complete a PhD
My PhD took 5 1/2 years to complete and, as someone who is at least curious about the prospect of running my own lab, I see enormous benefit from the extra time in training over a 3-4 year PhD. My final two years were easily my most productive, and I was able to build networks of scientists through a lengthy stay at a single institution. Was it necessary for this time to be spent entirely as a graduate student, though? The ability to assess information critically and design good experiments can surely be taught in the first three to four years; if people need more time to finish PhD research projects for publication, let them take on a year or two of postdoctoral work in their PhD lab. It is simply unfair to expect someone who is getting a PhD for a purpose other than becoming an academic to spend five to seven years (the 2006 median in the U.S. was actually 7.9 years) of their twenties in graduate school. If we beat the drum about the need for PhD-quality scientists in law, journalism and public policy, then we must come up with ways to train them more efficiently.
Graduate programs at Canadian universities should be substantially shorter and broadly inclusive of all types of graduate students — from those driven to become professors to those looking to acquire the skills of a doctorate for another profession.
2. More fellowships, fewer grant-funded PDFs
An interesting table in the NIH report shows the relative future success of NIH fellowship-funded vs. grant-funded postdoctoral fellows. Those on fellowships obtain their first operating grant (R01) sooner (5.3 years on average vs. 6.5) and at a substantially higher success rate (48.3% vs. 32.5%) than those funded from grants. The cynic would say that these numbers simply represent being on the gravy train, where each award breeds the next, just as papers from well-known labs are purported to get an easier ride in big journals. However, I would argue that this makes an even stronger case for making more fellowships available in lieu of grant-funded posts, where award committee members can take more "chances" on applicants. Furthermore, the NIH report makes an excellent point that few, if any, mechanisms exist to judge the quality of training given to a grant-funded researcher. More fellowships would allow better tracking and quality control of training environments.
Will Canada’s granting agencies do the same? It sure as hell makes for better press than NSERC’s 9% success rate for its PDF fellowships…
3. Two streams: Scientist and Academic
Though the timing varies widely, many PhD holders realize that they prefer bench work to running their own group. Professors can recognize who the most valuable members of their research team are, yet the careers that get the best support are the ones aimed at becoming an independent investigator. If a postdoctoral fellow is highly skilled and does not want to run their own group, wouldn’t it make sense to offer them a permanent position with a good salary and benefits? We’ve written about this before, in a previous entry: The solution: Hire scientists to do scientific research… On this note I have to share the pessimistic view of Ms. Lieff Benderly on the kitten-strength recommendation from the NIH:
“The working group encourages NIH study sections to be receptive to grant applications that include staff scientists and urges institutions to create position categories that reflect the value and stature of these researchers.”
Will universities and research institutes step up to the plate and hire departmental research scientists, or will research scientists be forced to depend on their supervisors’ grant wrangling skills? My bet is on the latter if there is no obvious benefit for research institutions.
In conclusion, I hope that all of those who work for a granting agency, university or research institute will read these reports. Understand that there has been a dramatic change in the biomedical research workforce over the last decade and try to address the changes. Shorten the PhD, reward researchers on merit, and let scientists do scientific research for a career.
It falls to scientists to speak up in support of federally funded research, and in this third installment of a four-part series, I explore the economic cost of doing research in a cash-strapped system and the burden this places on young investigators.
To bring yourself up to speed, installments 1 and 2 are referenced below:
- Biomedical Research and Broken Clocks: All the Parts, but No Instructions
- A Difficult Pill To Swallow: The Harsh Realities of a 15% Funding Rate
As has been discussed here on and off for quite some time, 80% of PhDs in the US will not become professors. For the majority of these scientific investigators, the inability to secure a faculty position has meant that they must languish in a series of post-doctoral positions supported by grant-funded professors who are increasingly finding themselves with limited resources. The average age of independence in research is now in the mid-40s, a testament to the bleak prospects facing young scientists (PDF).
Given this highly unstable state of academic funding, it is not surprising that many investigators have chosen to transition into more secure professions like teaching, medicine or law. For an in-depth review of the career prospects of a post-doctoral research scientist please see Careers and Rewards in the Bio Sciences: The Disconnect Between Scientific Progress and Career Progression (PDF). The loss hurts our competitiveness in biomedical research and forces industry abroad.
Given our current economy, it is imperative that efforts to improve the nation’s fiscal stability be grounded in the long-term competitiveness of industries in which we currently lead, and that we leverage our expertise in medical science and our capacity to do high-tech research. This does not need to come from increased government spending alone. Whereas academic medicine cannot build R&D into the pricing of its services, universities profit directly from tuition fees, patents and personal endowments.
Since these revenues are derived from faculty teaching loads, the scientific success of their investigators, and the established reputation of their research programs, faculty support must be factored into departmental operating budgets, freeing up tax dollars to directly support research innovation. Another idea would be to create tax breaks for private donations to federal funding agencies in an effort to reduce their dependence on public dollars and incentivize industry investment in national research programs. In the United States (the same nation that passed the Bayh-Dole Act to spur commercialization of university research), government funding of university research exceeds business funding by an order of magnitude, and business investment in university research is nearly half that of Canada (PDF).
Finally, limiting the number of federal awards issued per investigator, most of which are held by senior faculty (PDF), would open up more funding opportunities to support young investigators and significantly lower the age of independence. While the debate over whether to preferentially support established labs with proven track records or younger faculty with new ideas is ongoing, junior researchers cannot succeed without early-career support.
If we are unwilling to prioritize young faculty and share what wealth there is, perhaps the better question is “Should we continue training so many of them?”