Posts by David Kent
As our readers know, we ran a panel discussion last month in Toronto at the 5th Annual Canadian Science Policy Conference. It was a packed room and the panel featured heated exchanges at some points (even between panellists!). Many diverse opinions were shared, pointing to a clear need for academic-training reform.
Chris Corkery began by representing the Canadian Association of Postdoctoral Scholars and its recent survey, which drew responses from roughly 20% of Canadian postdoctoral scholars. This launched the panel discussion by showing that early career researchers at the postdoctoral stage were a growing cohort of young, bright minds in their mid-30s who aspired to become career researchers, but found themselves in a temporary holding zone when it came to career options.
A major policy issue for Canadian universities, governments and industry is how to avoid wasting such talented individuals, who represent a major national investment. We set up three perspectives to engage this issue: Rob Annan from Mitacs delivered a talk on Mitacs’ suite of programs to facilitate the transition of trainees into industrial placements; Mawana Pongo spoke of the need for governments to invest in young trainees and to create 2,000 fellow positions and 1,000 professor positions; and Ben Neel spoke about the view from research-focused academic institutes and the need for “tough love” for trainees.
Dr. Neel’s presentation was perhaps the most provocative, arguing that graduate students and postdoctoral fellows needed to hear a “tough love” message more often than they currently do. He argued that trainees needed to become masters of their own destiny and that things “aren’t that different” from when he trained (despite admitting that the number of scientists being trained has far outstripped the number of new faculty positions). Dr. Neel made some solid arguments: PhD students and postdoctoral fellows have never been particularly well compensated for their educational standing, academic jobs have never been what the majority of trainees end up doing, and the real disaster is the reduction in science funding.
Where does this leave us in the policy world? Three main questions need answers:
- Should there be multiple career streams for publicly funded researchers?
- Should the time to complete a PhD be reduced? (And if so, how can this be achieved?)
- Should PhD researchers be treated like medical residents and junior accountants despite unclear career outcomes?
After our panel, CSPC President Mehrdad Hariri asked if we would consider continuing the conversation through CSPC. If this takes place, we’ll certainly let readers know. In the interim, start the conversation below and we’ll collate the responses in our next quarterly summary. As I’ve said many times before, there is little obvious incentive for anyone else to fix academic training, so it’s up to researchers to fix their own system.
This Thursday, I’ll be attending the Canadian Science Policy Conference in Toronto to run a session on training the next generation of scientists. The session promises to be discussion-based and I hope that some practical ideas and solutions will be proposed by audience members and panellists to help address what I consider to be one of the greatest wastes of human capital in our country.
The results of the Canadian Association of Postdoctoral Scholars (CAPS) survey will launch the panel highlighting the real need for new policy solutions to address the ever-increasing numbers of science-based trainees being spun out of Canadian universities and research institutes. This human resource crisis stimulated the formation of CAPS and numerous other international groups of early career researchers (e.g., the NPA in the United States, and ICoRSA for international research staff). The panel brings together stakeholders in industry, government and academia to discuss the needs of each sector and strategies for Canada to adopt in order to come out ahead in its training and utilization of young scientists.
I’ve compiled a list of relevant posts by Jonathan and me that tackle some of these issues and propose solutions; I hope it will act as fodder for conference-goers to get the discussion rolling. Post-conference, I’ll relay to readers who could not attend the key ideas that emerged, with the intention of building consensus on the best ideas that granting agencies, universities and employers could adopt:
- Sick of studenthood, early career researchers want employee status
- Half of Canada’s early career researchers are not Canadian
- Attracting and retaining talented researchers
- Reversing the brain drain
- Too much Talent? SSHRC’s “solution” to the postdoc boom
- The PhD Placement Project
- Incredible promotion tool: student and postdoc outcomes
- Tri-Councils should learn from EMBO fellowships
- Postdoctoral mentors and a regular reality check
- Shorter PhDs and more active thesis committees
- Fewer postdocs with higher salaries? Hold your horses!
- Planning Ahead: How many of you are there and who will pay you?
- A paradigm shift in academic advancement
- Creating scientists, not science, is the key to productive universities
Science training will not magically fix itself – it’s up to young scientists to identify the challenges and help to address them. The most important product of a university is people and these people will go out into every sector of society to help improve our collective future.
I’ll also take part in a panel on the value of science blogging in Canada on Friday. Hopefully this session will highlight the utility and meaningfulness of scientists picking up the proverbial pen and paper to get their thoughts and opinions out into the world.
Last week, the International Consortium of Research Staff Associations (ICoRSA) was launched in connection with the VITAE Research Staff Conference. Forged in the fire that burns in the bellies of early career researchers with low salaries, little stability and poor career prospects, this organization aims to better the researcher profession by linking the individual (often national) organizations to each other.
ICoRSA has been busy in its first year of activity – they have successfully engaged postdoctoral and researcher organizations from across the world (including the Canadian Association of Postdoctoral Scholars) and their current board members reflect a good mix of these organizations. It will be interesting to compare the situation of researchers and research across the world, something that is desperately needed in the policy world.
One of the main advantages that ICoRSA has is its international breadth. Who better to assess the relative state of research than the researchers themselves, who have spent years in multiple countries? As a community we have the unfortunate tendency to decide what the problems are without actually collecting and comparing the data. Indeed, much of what we discuss on our website is drawn from single reports in single fields or single institutions, with only a handful of reports appropriately contextualized. This leads to decision making based on incomplete information and can result in major policy changes (e.g., the Banting scholarships in Canada) that impact the entire research community.
To highlight this, I look inside my own field and the tacit assumption in stem cell research that a permissive vs. restrictive national policy on stem cells is a major factor in researcher relocation (e.g., that researchers would choose to work in locations that allowed them to do human embryonic stem cell research). When a UBC research group actually asked stem cell researchers why they chose to relocate (preliminary findings were reported at this year’s Till and McCulloch meeting in Banff), hardly any listed the regulatory environment as a major factor in moving. Rather, it seems researchers move for career, supervisor, research environment, and the like.
ICoRSA, therefore, could grow into a research hub for all the data being collected in different countries. This sort of resource would allow governments and universities to adapt their research funding and administrative policies to the actual data across multiple countries. It would also inform researchers debating an international relocation about the new research system they would be entering. Indeed, very little is done to facilitate the transition to a new country.
Hopefully this will be a major priority for the ICoRSA group in its first years of existence: only engagement and data collection will pave the way for sound, evidence-based policy recommendations.
One of the most popular topics on our site over the years has been the taxation and administrative status of postdoctoral fellows. The budgetary changes and the resultant discrepancy between postdoctoral and graduate student take-home pay galvanized Canadian postdoctoral fellows across the country and was a primary driver of the enthusiasm that formed the Canadian Association of Postdoctoral Scholars (CAPS). Five years on and a lot of settled dust later, it appears that post-PhD researchers want to be treated like grown-ups.
The original 2009 CAPS survey was completed when it was potentially beneficial (depending on your university, province and tax forms) to be classified as a trainee rather than an employee. The newest survey was completed after Budget 2010 when it became obvious that postdoctoral fellowships would be fully taxable. The result – over 75% of postdoctoral fellows would prefer to be classified as employees, with 70% of those currently classed as trainees or students preferring to have employee status.
How will Canada’s universities, funding bodies and research institutes respond to this preference?
One interesting wrinkle will be the handling of external fellowships. Contributing to pension and employment insurance is not free and if all postdoctoral fellows were to be employees, somebody would need to pick up the tab (would granting agencies allot monies for such contributions? Would universities be responsible, or would it come off the stipend of the fellowship itself?). Perhaps postdoctoral fellowships themselves would be viewed as training awards that exist outside of a standard employee/employer relationship. This is what happened to me as an externally funded fellow in the U.K. – no pension, no employment insurance, etc. Now I’m paid from my supervisor’s grant and am an employee at the university.
Overall I would note that, for the most part, the U.K. and Europe administer post-PhD researchers as employees. In my limited experience, much less venom is spouted concerning salaries and compensation. Things are still bad when it comes to career prospects and stability, but it seems that most early career researchers have access to a decent salary and benefits. The story is much more mixed in the United States, with some centres offering incredibly lucrative packages and others offering virtually nothing.
It is worth reminding people that the current federal government already categorizes postdoctoral fellows as employees: “Unlike post-secondary students enrolled in courses and pursuing a degree or diploma, post-doctoral fellows can be compared to a number of other professionals such as lawyers, medical residents, and accountants, where there is a period of paid training at the beginning of their careers” (Jim Flaherty, Minister of Finance, in a letter to CAPS). (N.B. the comparison doesn’t stack up well when remuneration and benefits are considered.)
So, my advice to universities, granting agencies, and research institutes – listen to everyone else. As said in the CAPS survey, “In short, postdocs are adults: in the middle of their lives, but at the beginning of their careers.” These people are professional scientists and deserve to be treated as such.
Over the coming weeks, I’ll be breaking down the fantastic information found in the Canadian Association of Postdoctoral Scholars 2013 Survey of Canadian Postdocs. To start this, I thought I would focus on the most surprising finding in my mind: 53.1% of the 1,830 respondents were either landed immigrants or holding a work permit. This is an incredibly high fraction that represents a huge opportunity for Canada, but only if policies and programs are designed to maximize the influx of such talent.
Plenty of non-American talent
Many of Canada’s postdoctoral fellows travel abroad and many find themselves in the United States, but the converse is not as frequent as many people think. Indeed, just 8% of international postdocs are from the U.S. whereas both France (13%) and China (12%) supply higher numbers of international researchers to the Canadian workforce.
When asked why they moved to Canada for research, international postdocs cited facilities and resources chief among their reasons, showing that Canada has clearly created an excellent research environment. However, without the correct numbers and types of jobs available following this temporary period of research, it is not surprising that many leave the country. Funnily enough, the major challenges cited by international postdocs are not remarkably academic or specialized, but rather “transitioning to life in a new country” and “visa/permit issues”. Surely Canada can do a better job of making its talented young people feel more welcome.
You may ask why Canada should invest in these young researchers if they will all simply run back to their home countries. Again, the CAPS survey sheds light on this issue, showing that only 25% of researchers on work permits and just 3% of immigrant researchers have definite plans to leave Canada. There is a huge opportunity to capture this bright class of motivated young people to drive economic benefit for Canada, but we again do very little to support this permanent relocation.
Where does this leave Canadian researchers?
Jonathan just posted last week about attracting and retaining talented researchers, pointing out both the importance of international experience and the need, in Canada especially, to create jobs for researchers. Those jobs do not have to be academic jobs, but they do have to make the case for staying in – or coming back to – Canada for long-term employment.
As a Canadian-funded postdoctoral fellow working outside the country, I have lamented the lack of connectivity between Canadian funding bodies and institutions. My PhD and postdoctoral training cost CIHR $210,000 in salary alone and they have done virtually nothing to encourage my return. Indeed, funding agencies, institutions and companies do very little to attract their early career scientists back to Canada (both Jonathan and I can attest to this), for academic and non-academic jobs alike. I think two main problems exist: 1) a lack of networks and 2) poor programming for fellows.
When looking for a non-academic post (industry, science writing, consulting, law, etc.), you are much more likely to find one locally. In my own case, a move to industry in one of the Cambridge biotech science parks would be much easier than trying to figure out the lay of the land in Toronto, Montreal or Vancouver. This is mostly because I regularly meet and interact with scientists who are employed by these companies and collaborate with academics at our university.
EMBO, and countries like the UK and Australia, have come up with ideas on how to do this. EMBO created a “Fellows Network” that meets regularly and interacts with academics and non-academics; the U.K. encourages international applicants to its independent funding programs (Career development awards) and Australia ties the latter portion of grant funding to a fellow’s “return to Australia.” As far as I can see, Canada lags in this area and desperately needs to rethink its policies if attracting Canadians to return to work in Canada is a goal.
Overall, Canada needs to support both cohorts of talented researchers in order to capture the best and brightest minds to drive critical and inventive thinking that forms the baseline for discovery and innovation. Creating programs to bring back internationally trained researchers and encouraging Canadian trained international researchers to put down roots are not trivial tasks especially when the people making these decisions are (as described in the CAPS survey) adults “in the middle of their lives, but at the beginning of their careers.”
Last week was the culmination of an incredible amount of volunteer labour through the CAPS-ACSP group who produced their 2013 Survey of Canadian Postdoctoral Scholars. Done in collaboration with Mitacs, a not-for-profit group aimed at facilitating the transition from academia to industry, the survey emphasizes the need for urgent action at universities and research institutes in order for Canada to remain competitive on the world stage.
Many articles have been written already about the survey, including great pieces from University Affairs’ Leo Charbonneau and Beryl Lieff Benderly at Science, highlighting administrative ambiguity, poor remuneration and benefits, and low access to career development training.
Over the coming months, we will be using this survey data to take a more in-depth look at some of the key issues and to compare the situation of early career researchers in Canada to other countries – we hope you will enjoy this series and our proposed solutions. At this year’s Canadian Science Policy Conference, I have organized a panel discussion that will feature a presentation of this survey data alongside comments from leading researchers, non-profits and government members. It is entitled Training the Next Generation of Scientists and should produce some excellent discussion on this topic and these new data.
Guest Post: Dr. Eddy Kent
- Funding repercussions of U.S. debt showdown – 2013 edition
- Reversing the brain drain
- An open letter to Stephen Harper on the status of science funding in Canada
- The PhD Placement Project
- Measuring the non-academic impact of your science
- Impact factor ‘eligibility window’ skews the system
- Open access is “a journey not an event”
- Incredible promotion tool: student and postdoc outcomes
Dave continued to write for the Signals blog.
Our guest blogger, Dr. Eddy Kent, attracted some stories of frustration with SSHRC’s new policy on fellowships. The article on student and postdoc outcomes made the rounds on reddit, with a long list of comments on whether such statistics would provide a valuable resource for prospective trainees.
It’s an incredibly busy autumn, but as always I encourage people to consider writing on the issues they feel most passionately about. The Black Hole is a forum for discussion and the proposal of solutions to the problems in training early career researchers. Email us at email@example.com if you are keen.
Citations are the standard benchmark for scientists to assess the impact of their work. Highly cited papers have clearly influenced the field and few would dispute their importance. What citations do not measure, though, is the wider impact of a paper – do industrial projects build from these discoveries, are school children interested in them, will it inspire governments and funders to direct more resources into research efforts? Tools for measuring such impact are rare, but recently I was introduced to a quick and easy web-based program for getting a bird’s eye view of the non-academic impact of papers.
Altmetric is very simple and can quickly help determine the wider impact of a paper. It has developed a free toolbar app that everyone can use in addition to several advanced programs that can be purchased for those interested in doing higher-level metrics. Out of curiosity, I ran the tool on some papers to see if anything interesting popped up and was impressed with the breadth of data captured as well as the easy to use/navigate web interface.
What does it measure?
From what I’ve seen so far, Altmetric collects data from Twitter, blogs, news agencies, Faculty of 1000, Google+, etc., and uses it to assess a paper’s relative impact to other articles. For some articles, it has sufficient data to compare it to articles from the same journal or all articles in their database. It then produces a score that represents the non-citation activity around an article.
Sometimes things agree quite well: I have a paper from 2008 and one from 2009 in the journal Blood. Citation tracking tells me that the latter has had a larger impact (86 vs. 24 citations), and the Altmetric scores are 7 and 1, respectively. However, when I compared the 86-citation paper to a 2007 Cell Stem Cell paper I was part of, which also scored an Altmetric 7, it comes in substantially lower than the CSC paper’s 178 citations. Could this be down to journal source alone, or does it actually reflect a similar impact outside of peer-reviewed academic articles? Significantly more analysis of other articles would be needed to assess such possibilities.
What else do you get?
An excellent component of the Altmetric page is the “score” function which shows how an article compares to other articles in the same journal and from the same time. For example, the paper we published in PLoS Biology this past June scores an Altmetric 16 and was measured to be in the 79th percentile of all PLoS Bio articles, 92nd percentile for articles of a similar age, and 95th percentile of all articles measured in the database. Currently there are no papers citing this article as it is only three months old, but when one is going on the job hunt, these additional metrics might be useful for hiring committees to determine the relative standing of a paper.
Is it all just vanity?
Many people will look at these types of metrics because they are curious about their own articles and many will simply dismiss the whole altmetric exercise as one of pure vanity and therefore of no added benefit to science or scientists. No doubt, there is an element to the tool that will encourage scientists to look up their own article, but the utility of Altmetric is quite a lot broader. It collates related news stories and blog entries, it assesses papers relative to other papers in the same journal from the same time, and it actively excludes citations from its calculations, thereby offering a novel way of assessing papers.
And importantly, we should also ask what exactly citations measure. In many cases, article citations accumulate for reasons that do not reflect the quality or impact of the work: prolific authors self-citing, field popularity (e.g., stem cells, microRNAs), and perceived journal importance (e.g., Nature articles get read because they are in Nature). At least Altmetric offers a way to assess articles on their individual merits and gauge their broader importance (e.g., impact in industry news).
Bias toward new articles
Social media, especially as it pertains to discussing scientific articles, is very new and therefore all articles written before a certain time suffer disproportionately when it comes to Twitter, Google+, etc. Furthermore, there are inevitably new social media applications that will emerge in future years, making this a persistent problem for Altmetric. We’ll have to wait and see how this will affect the tool as it develops in the future.
Until then, we will continue to update readers on novel tools for assessing biomedical researchers. For more information, check out our previous entries:
- Google Scholar “My citations”
- Let’s Accessorize
- Collaboration Networks
- New Metrics for measuring research
For the past year, I have been sitting on the publications committee for a society-run journal, and in the journal’s quest to improve its impact factor (IF), it became clear to me that one of the system’s dark secrets is the “window of IF eligibility.” It single-handedly disadvantages journals whose science stands the test of time and favours journals with speedy public relations campaigns.
For those not aware of it, a journal’s IF is based on two numbers for year X:
- The number of times articles published in the two years prior to year X are cited during year X
- The number of citable articles published in the two years prior to year X
The IF is simply the first number divided by the second.
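As a quick sketch, the calculation for a hypothetical 2013 IF looks like this (all numbers are invented for illustration):

```python
def impact_factor(citations_received, citable_items):
    """IF for year X: citations received during year X to articles
    published in the two prior years, divided by the number of
    citable articles published in those two years."""
    return citations_received / citable_items

# Invented numbers: articles from 2011-2012 were cited 600 times
# during 2013, and the journal published 200 citable items in 2011-2012.
print(impact_factor(600, 200))  # → 3.0
```

Note that any citation to an article older than the two-year window simply vanishes from this calculation, which is the crux of the problem discussed below.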
This means that all articles published more than two years before the year being evaluated do not count for anything in the IF world. The IF metric significantly impacts the careers and fundability of scientists, but the practice of only counting two years post-publication is damaging for several reasons.
No benefit to standing the test of time
Not many people regularly scroll through the table of contents in journals with an IF of 3, so when very good papers are published in low-ranking journals, it sometimes takes a little time for them to get on people’s radar. Most scientists would agree that a paper needs to stand the test of time, and indeed many papers in low IF journals do exactly that, but they do not end up helping the IF of that journal.
A case in point, there was a 2003 paper in my field published in Experimental Hematology (IF < 3) that was cited 6.5 times per year in the eligibility window and an average of 10 times per year since. By contrast, a paper on the same topic in the same year in Nature Immunology (IF > 20) was cited 17 times per year in the eligibility window and an average of 11 times per year subsequently. The papers end up in the same place (i.e., cited about 10 times/yr) but they differ dramatically in their contribution to the journal’s IF (17 vs. 6.5).
Bias against poorly promoted journals
Big journals have big public relations teams: they do big press releases, and they carry an already big name. Obviously, articles that get quick publicity will be cited earlier than papers that aren’t read immediately.
Artificial bias toward being trendy
Surely, in the age of reddit, Twitter and things “going viral,” we can appreciate that trendiness does not equate to importance. However, our metrics for evaluating scientists rely almost entirely on how trendy their research is (or at least on how trendy the journal they’ve published in is), not on how important or well executed it is. This translates into a culture that rewards buzzwords and style over substance.
So, what can we do to improve the situation? Scientists could start picking up tables of contents from low IF journals (unlikely) or we could actually read beyond the abstract to see if the paper we’re citing actually proves a point and hasn’t been published somewhere else (also unlikely). I think the easiest way is to measure articles (and journals) by average citations per year since publication. I’d love to see what proportion of “high impact” papers crash and burn after the first few years post publication. It would not surprise me if the number was very large.
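To make the contrast concrete, here is a minimal sketch of the two metrics, using hypothetical per-year citation trajectories loosely modelled on the two 2003 papers above (the yearly counts are invented; only the window and lifetime averages echo the figures in the text):

```python
def window_rate(cites_by_year, window=2):
    """Citations per year during the two-year IF eligibility window."""
    return sum(cites_by_year[:window]) / window

def lifetime_rate(cites_by_year):
    """Average citations per year since publication."""
    return sum(cites_by_year) / len(cites_by_year)

# Hypothetical yearly citation counts for two 2003 papers:
slow_burner = [5, 8, 10, 10, 11, 10, 10, 11, 10, 10]    # low-IF journal
fast_fader  = [20, 14, 12, 11, 10, 10, 11, 10, 11, 10]  # high-IF journal

print(window_rate(slow_burner), lifetime_rate(slow_burner))  # 6.5 9.5
print(window_rate(fast_fader), lifetime_rate(fast_fader))    # 17.0 11.9
```

The window rate makes the two papers look wildly different (17 vs. 6.5), while the lifetime rate shows them settling to roughly the same influence.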
Two weeks ago I attended the Flow Cytometry UK Meeting, where the keynote speaker was Douglas Kell, the current chief executive of the Biotechnology and Biological Sciences Research Council (BBSRC). He gave an inspiring talk about the benefits and utility of open access publication and what the BBSRC (and other funding agencies) are doing to promote it. He summarized the open access battle as “a journey, not an event” and described the process as akin to shifting a cultural norm: that is, it will take some time before open access becomes the standard and we all look back on the time of restricted access as the dark ages.
Professor Kell first listed the major players in the open access game. Universities are obsessed with research performance and researchers themselves crave publication in high-impact-factor journals. Funders are looking for the maximum research impact while publishers are generally out to create revenue. Finally, libraries wish to provide the best possible service for readers, and the current system is a complicated quagmire of licensing fees and journal package subscriptions.
Gold vs. green open access
In the spirit of being realistic about the transition process for journals, there are two modes of open access currently being pursued. The first is gold, where, upon payment of a single article processing charge, the article is released instantly under a Creative Commons (CC-BY) licence, which means that anyone can use it so long as attribution is given. The green route instead permits self-archiving, sometimes after an embargo period during which the journal holds exclusive rights (typically 6-24 months).
Many journals have already adopted gold (e.g., PLoS journals) or green (e.g., Nature) open access and, according to Professor Kell, very few offer no open access option at all. The BBSRC, in collaboration with Sherpa-Romeo, the Joint Information Systems Committee and the Wellcome Trust, is preparing a directory of open access journals to list journal compliance levels, which should help promote the transition of more journals to gold, or at least green, open access.
The future beckons
One of the biggest advantages of open access, in Kell’s mind, is the ability to improve text mining. The literature is too vast for any individual to comprehend, and Kell cited a study estimating that only 8% of research findings are included in abstracts. Open access would give text-mining algorithms access to more knowledge, and in turn this will create new knowledge and new connections not currently possible. A useful tool to begin exploring what this level of interaction and data scouring can do is Utopia Documents (getutopia.com), a “free PDF reader that connects the static content of scientific articles to the dynamic world of online content”.
Overall, the future for open access is very bright, but it requires the stakeholders to figure out a way of establishing the new cultural norm. For now, I encourage our readers to always consider open access when submitting their work for peer review – changes of such magnitude require buy-in from all players.
Today, we are very pleased to have a guest blog post from the Humanities and only a touch of familial guilt was used in its procurement. Dr. Eddy Kent, an assistant professor in the University of Alberta’s English and Film Studies department, puts in his two cents on SSHRC’s announcement earlier this month about its postdoctoral fellowships….
The Black Hole, says the masthead, is devoted to “issues of interest to early-career scientists in Canada.” Hopefully, you’ll put up with a contribution from someone who teaches English Literature about changes afoot at the Social Sciences and Humanities Research Council (SSHRC). Since they form part of a larger set of changes coordinated across the Tri-Councils, the recently announced revisions to SSHRC’s postdoctoral fellowship program are significant to early-career scholars across the academy.
In July 2012, SSHRC announced the renewal of the Talent program. For those unfamiliar with SSHRC lingo, Talent is responsible for funding students and postdoctoral fellows. They set out a three-year timetable, including specific plans for postdocs. In July 2013, right on schedule, they announced the following changes for the 2014-15 competition year.
- Postdoctoral Fellowship awards will increase from $38,000 per year to $40,500 per year for up to two years, while a separate research allowance will no longer be offered.
- Applicants will be eligible to apply up to two years after completion of a PhD (instead of the previous three-year window).
- Applicants who have previously held a postdoctoral award from SSHRC, CIHR, or NSERC, including a Banting postdoctoral fellowship, will no longer be eligible to apply.
- Postdoctoral Fellowship evaluation criteria will align with other SSHRC grants, which assess proposals based on three main criteria: challenge, feasibility and capability.
It is the first two points that warrant serious attention, since they signal the declining status of the postdoctoral fellow in Canada’s academic research culture.
The increase in the value of the PDF looks positive. However, since the research allowance was worth $5,000, the change is revenue neutral. When I held my SSHRC award in 2008-09, for example, I received $38,000/yr plus the research allowance, for a total value of $81,000 over two years. Next year’s winners will receive $40,500 for two years and no research allowance, for the same total value of $81,000. This shell game provides no net benefit to an award holder. Factor in inflation—the Bank of Canada’s handy CPI calculator reports a 6.6% increase from 2008 to 2013—and you can see a substantial erosion in the value of a SSHRC PDF. The erosion is magnified further when you consider that a fellowship is taxable income, whereas a research allowance could be partially deducted against research expenses.
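For readers who want to check the arithmetic, the comparison takes only a few lines of Python. All figures come from the paragraph above; the 6.6% figure is the Bank of Canada’s reported CPI increase from 2008 to 2013:

```python
# Two-year value of the old and new SSHRC PDF packages
old_total = 2 * 38_000 + 5_000   # 2008-09: stipend plus $5,000 research allowance
new_total = 2 * 40_500           # 2014-15: higher stipend, no allowance
print(old_total, new_total)      # 81000 81000 -- a wash in nominal terms

# What the 2008 package would need to be worth in 2013 dollars
# to keep pace with the 6.6% CPI increase:
print(round(old_total * 1.066))  # 86346
```

In other words, holding the nominal value flat at $81,000 amounts to a cut of roughly $5,300 in real terms.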
This may surprise readers of this blog, who are likely more familiar with the CIHR and NSERC components of the Tri-Councils, but until now SSHRC accepted applications from candidates for up to three years post-PhD. That window has now been reduced to two years, with a limit of three lifetime applications per candidate. This probably sounds like a good deal to those readers wrestling with NSERC’s rule that allows you to apply only once – ever. Still, the basic principle holds: the Tri-Councils are reducing the number of eligible candidates in any given year.
What benefit this might bring to individuals is unclear. Some of my colleagues have suggested it would level the playing field, since currently the freshly-minted PhD competes with someone three years out. I haven’t seen any data to confirm or refute this, but there are structural reasons to think such worries are overblown. Unlike in the sciences, postdocs in the humanities and social sciences are not common; if you don’t hold a SSHRC, then you’re most likely working as a contract or adjunct lecturer. So the very fact that someone is applying for a SSHRC postdoc two or three years out means they are presently unlikely to be in a position that allows them to accelerate their research projects. A 3/3 or 4/4 teaching load is not especially conducive to one’s research.
If it doesn’t help individuals, the question cui bono? must be asked. In this vein, some have suggested, cynically, that the change is really an effort to massage the all-important success rates. Ian Milligan, a historian at the University of Waterloo, has studied data from SSHRC and assembled a revealing graph.
A steady increase since 2001 in the number of applications coupled with a relatively flat number of awards has led to a plummeting success rate. The simple solution — the one Milligan characterizes as “hamhanded” — is to reduce the denominator in this ratio.
I really hope that this is not the case, since that represents hundreds of real individuals. Putting aside the humanitarian concerns for a moment, it’s important to acknowledge this change makes little economic sense for Canadian society, given that the vast majority of these applicants are PhDs from publicly-funded Canadian universities and a smaller majority will have also received doctoral funding from SSHRC. The increasing applicant pool is a consequence of the major (and laudable!) Canada-wide investment in graduate training over the last 15 years. Every unsuccessful applicant therefore represents a lost investment. This is why success ratios are so important to SSHRC, which needs to justify itself to senior federal bureaucrats and politicians whose vocabulary is increasingly that of investment and tax dollars.
If the proposed revisions to the program succeed, then we might expect the number of applications next year to drop by a third to around 650. Assuming a similar number of PDFs are awarded, the 2013 success rate could thus rise to 23%, back in line with the ratios for 2000-2009.
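The success-rate arithmetic behind that estimate can be sketched as follows. The application count of roughly 975 and the award count of roughly 150 are my inferences from the figures in this post, not official SSHRC numbers:

```python
applications_now = 975                  # implied by "drop by a third to around 650"
applications_after = round(applications_now * 2 / 3)
awards = 150                            # assumed to stay roughly flat
print(applications_after)                        # 650
print(round(100 * awards / applications_after))  # 23 (%)
```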
But what’s good for SSHRC’s advocates to the federal government does not serve the interests of would-be postdoctoral fellows. These revisions mean that less money will be made available to fewer people overall. Dispiriting stuff, and hard to reconcile with Talent’s stated goal “to develop the next generation of researchers and leaders across society, both within academia and across the public, private and not-for-profit sectors.”
I’m sympathetic to SSHRC’s predicament. It works within budgets allocated to them by the federal government. Stephen Harper’s Conservatives have pled austerity and refused to invest significantly in postsecondary research. The Canadian Association of University Teachers reports that “between 2007–2008 and 2011–2012, funding for SSHRC will have declined by over 10 per cent in real terms. NSERC’s funding is down a more modest 1.2 per cent, while core support for CIHR will have declined by 4.1 per cent.”
So, barring a major change in federal policy, what is to be done? There is of course another and, I would argue, more equitable way to bump the success ratio back toward 20%: change the numerator, the number of awards granted.
Let me suggest how this might work. SSHRC’s 2011-12 budget for grants and scholarships was $332.3 million. The precise numbers for the PDF program are not readily available, but a rough guess (150 awards * $81k) would put it at around $12 million. It might be less than that, since not everyone holds the PDF for the full two years, but let’s not quibble. Twelve million is less than 4% of the larger budget. To increase the number of awards by 50% (75 new awards), you’re looking at an extra $6-7 million, or about 2% of the overall grants and scholarships budget.
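Spelling out that budget arithmetic explicitly, using this post’s own rough figures (including the guessed count of 150 current awards):

```python
budget = 332_300_000        # SSHRC grants and scholarships budget, 2011-12
award_value = 81_000        # full two-year value of one PDF
pdf_cost = 150 * award_value             # rough current cost of the program
extra_cost = 75 * award_value            # cost of 50% more awards
print(pdf_cost)                              # 12150000
print(round(100 * pdf_cost / budget, 1))     # 3.7 (% of budget)
print(extra_cost)                            # 6075000
print(round(100 * extra_cost / budget, 1))   # 1.8 (%)
```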
Redirecting that 2% away from other programs (Insight Grants, doctoral or master’s fellowships, etc.) would be difficult. It also wouldn’t address the eroding value of a SSHRC PDF due to inflationary pressures. But it would be fairer than doing nothing. The public has spent significantly to train more researchers, but it now appears that, having invested so much in these highly-trained, highly-qualified, and dare I say Talented people, we are refusing to support them at this most critical juncture. SSHRC must act more responsibly, supporting more of this vulnerable group at the most vital transitional stage of their research careers.