I’ve written a couple of blog posts in the past about the lack of available data on Canadian postsecondary education, and the cuts to research that have reduced what information is available (and reliable). In some cases, the information does exist but it takes a lot of time-consuming fiddling to pull out the required numbers and make sense of them. Recently, the task of untangling the data on PhD employment was taken on by Dan Munro at the Conference Board of Canada, as part of the CBoC’s larger project on postsecondary education and skills.* While the results of this analysis have been interesting, it’s also become apparent that the numbers we do have are flawed in ways that prevent us from fully describing the situation in relevant terms – which in turn has implications for policy.
The following examples illustrate the kind of problems I’m talking about. In the first case, Dan Munro wrote a blog post after corralling the numbers on PhD employment outcomes. This kind of information could help answer what is a pressing question for many PhD students: how many PhDs are ending up with tenure-track faculty jobs? The post cites 18.6 percent as the proportion of PhDs who have full-time faculty jobs (data were from the 2011 National Household Survey). However, this number lumps together tenured, tenure-track, and full-time contract faculty jobs.
There’s a real need to make a distinction here because PhDs tend to want a long-term (tenure-track) academic position, not a temporary one. What proportion of that 18.6 percent is made up of faculty on short-term contracts? In the United States this group appears to be growing, but what about in Canada? The numbers don’t tell us what we want to know: while the occupational category may be the same, in terms of practice and lived experience – i.e. what’s relevant to people actually working in this profession – there are significant differences. So why is the distinction not being made?
Another glaring problem is with the Canadian Occupational Projection System (COPS), produced by Employment and Social Development Canada, which in theory is where we’d look if we wanted to know how the job market for academic positions is shaping up over the next five years or so. But here’s the weird part: COPS “projects a shortage of university professors (NOCS 4121) to 2020. It estimates that there will be 44,328 new job openings in this category between 2013 and 2020, but only 39,030 projected job seekers (comprised of new graduates and immigrants with PhDs)” (Dan Munro, personal email). This seems to fly in the face of everything we know about the academic job market at present, which is unlikely to change much in the next five years. How can the official projections be so counter-intuitive? We know the competition for academic jobs is keen, and even the abovementioned numbers on full-time faculty seem to confirm this. Why don’t the COPS numbers reflect it?
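To make the projected “shortage” concrete, here is a minimal sketch of the arithmetic behind it, using only the COPS figures quoted above (the variable names are mine, for illustration):

```python
# COPS figures quoted above for "university professors" (NOCS 4121), 2013-2020.
projected_openings = 44_328  # projected new job openings
projected_seekers = 39_030   # projected job seekers: new graduates plus immigrants with PhDs

# A positive difference is what COPS reads as a "shortage" of professors.
projected_shortage = projected_openings - projected_seekers
print(projected_shortage)  # 5298
```

That gap of roughly 5,300 positions is the entire basis of the projected shortage – which is why the model’s assumptions about who counts as a “job seeker” matter so much.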
One issue is that these statistics don’t take into account graduates’ intentions. The COPS documentation asserts that “an appreciable number of workers are expected to seek opportunities in other occupations […] that are related to their studies and that offer better job opportunities.” But if we look at what PhDs want to be doing, what proportion of this group is seeking an academic job? Take for example a 2012 study in which Louise Desjardins found that “About two-thirds (65 percent) of Ontario graduates pursued a doctoral degree with the intention of becoming university professors”; in the humanities specifically, this number was 86 percent. Some PhDs stay on the job market for multiple years after graduation, holding down non-academic and/or part-time positions while they await a tenure-track opening. How does this square with the proportion – necessarily less than 18.6 percent – who are working in tenured or tenure-track positions?
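The mismatch between intentions and outcomes can be sketched per 100 graduates. This is illustrative only – the two figures come from different samples (Desjardins’s Ontario cohort versus the NHS-derived national share), so the numbers should be read as an order-of-magnitude comparison, not a precise estimate:

```python
# Figures quoted above. The per-100 framing and variable names are mine.
share_intending_professoriate = 0.65  # Ontario PhD graduates (0.86 in the humanities)
share_full_time_faculty = 0.186       # an upper bound: includes full-time contract faculty

per_100_graduates = 100
intending = share_intending_professoriate * per_100_graduates  # 65 graduates
at_most_faculty = share_full_time_faculty * per_100_graduates  # at most ~19 graduates

# Even using the generous 18.6 percent figure, the gap is large.
print(round(intending - at_most_faculty, 1))  # 46.4
```

In other words, even on the most generous reading of the data, nearly half of each graduating cohort wants an academic career but does not hold a full-time faculty job of any kind.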
In a blog post about the COPS numbers, Alex Usher explains that “COPS isn’t very good at understanding how politics and public sector finances affect hiring in monopsonistic fields like education and health care.” Apparently so. But does this mean we should discard the model altogether? Dan Munro explains that “as far as the model goes […] it’s not a mistake. Certain assumptions have to be made in order to make projections and, in most cases, these assumptions are reasonable. But in the specific case of ‘university professors’ one of the model’s assumptions just doesn’t make sense.” He argues that what the results need is context, which in turn requires further (and qualitative) research: “it’s less a question of ‘fixing’ the model and instead using contextual awareness and qualitative analysis to understand specific sectors – i.e., supplement the model.”
As an example of this, reliable statistics on contract faculty would be very helpful in painting a picture of the larger context of academic work in Canada. Unfortunately, not only has the National Household Survey been altered; the University and College Academic Staff System (UCASS), which covered “full-time teaching staff in degree-granting institutions that are under contract for twelve months or more”, got the chop. We lack details about the proportion of faculty teaching on contract (whether part-time or full-time), how that number breaks down across types of contract positions, institutions, and academic fields, and what reasons or motivations lead those profs to take on such jobs. There’s been comparatively little systematic investigation into the conditions of employment for contract faculty in Canada, even though their numbers have increased over the past 20 to 30 years.
Statistics are important: they can help us piece together a picture of the situation facing PhDs entering the job market, and of the competition graduates face as hiring trends unfold. Considering the importance of these data, it’s remarkable that no one else seems to have done this work until now. Flawed as they may be, these statistics can provide a basis for further investigation. If nothing else, the COPS numbers alert us to a discrepancy between how such things are being measured and how they’re being experienced “on the ground” – pointing to a need to investigate this disconnect. Critically analyzing the available information also matters because decontextualized statistics can be taken up politically and rhetorically, for example in arguments about the market for academic work in Canada.
Possibly revealing my “bias” as a qualitative researcher, I have to agree with Dan’s comment that “no matter how much ‘big data’ you collect, if you’re not asking the right questions in the right way, and drawing on expert interpretation and inference while analyzing, you’re probably going to miss some important features of the world.” This is just one reason why I’ll continue to argue that we need more numbers – but also that we always need more than numbers.
*This post has been informed by Dan’s generous feedback in an ongoing discussion of some of the issues involved, via e-mail and Twitter.