When Twitter posted a job ad recently for computer science graduates at its proposed “global centre of excellence” in Vancouver, it had an unusually specific requirement for bachelor’s degree-holders: the degree had to come from one of the world’s top 100 universities as defined by the Times Higher Education (THE), Quacquarelli Symonds (QS), or a similar ranking. This, according to Phil Baty, rankings editor with the London-based THE, is the latest example of how “obscenely powerful” university rankings have become.
Ever since Maclean’s magazine launched its ranking of Canadian universities in 1991 – following in the footsteps of U.S. News and World Report, which began rating U.S. institutions in 1983 – no other topic has caused as much hand-wringing and outright irritation among higher-ed leaders, even those at universities that lead the pack. David Naylor, outgoing president of the University of Toronto and initially an outspoken critic of rankings, put it this way at a conference a few years ago: “I have made peace with rankings, just as I’ve made peace with the fact that every five years I have to have a colonoscopy.”
Since then, dozens more ranking systems have sprung up, mainly global rankings like those by THE, QS, and the Shanghai Ranking Consultancy. These three were launched a decade ago and remain the most established and well known worldwide. They have broadened their offerings over the years while new agencies have entered the field, providing everything from regional rankings to discipline-specific ratings and reputational league tables (see the descriptive list at the end of the article).
Criticism has dogged all of them to a greater or lesser extent because of the data and methodologies they employ. Even THE’s Mr. Baty, at the Worldviews Conference on Global Trends in Media and Higher Education in Toronto this past June, acknowledged that all rankings have “serious limitations.”
Chief among these is that rankings largely measure research output and reputation and don’t take into account differing institutional missions, said Philip Altbach, director of Boston College’s Center for International Higher Education, who also spoke at the Worldviews Conference. Teaching and learning are ignored for the most part because they aren’t easily measured on an international scale, he said.
Another drawback is that ranking systems cover just a small fraction of the world’s institutions – between one and three percent of some 17,500 universities, according to a report by the Brussels-based European University Association. The humanities, fine arts and social sciences are under-represented in the rankings because most research in these fields is published in books rather than in the journals from which bibliometric indicators are drawn. Journals published in languages other than English are also excluded because of their lower citation counts, according to the report.
University of Alberta President Indira Samarasekera has argued that too many measures, especially those used for reputational rankings, rely on subjective opinions collected through surveys of students and faculty members. Rankings also fail to consider government and business investments in university R&D and technology transfer, to the detriment of economic and technological powerhouses like Israel and Germany, she wrote.
But arguments like these haven’t dampened the popularity of rankings and, many observers agree, their influence over institutional behaviour and public policy is growing. “Rankings have become an industry, partly because they sell newspapers and magazines, but also because many universities devote considerable attention to trying to increase their position in the rankings,” said Glen Jones, Ontario Research Chair on Postsecondary Education Policy and Measurement at U of T’s Ontario Institute for Studies in Education.
The European University Association report cites research showing that immigration policies of the Netherlands favour applicants with graduate degrees from the global top 200 schools. Brazil selects students for scholarships to study abroad at institutions that rank well in the THE and QS rankings. India recently took this to a new level by announcing that only foreign institutions that have placed in one of the top three ranking systems will be permitted to open branch campuses in that country.
Governments in China and Japan have undertaken initiatives to create their own top-ranked universities by channeling research funding to a few institutions, said Dr. Jones in an interview. In Canada, though, there is little evidence that rankings have had much impact on public policy. It’s not that universities haven’t tried: the U15 group of Canadian research-intensive universities banded together in part to advocate for a greater share of government research dollars, he noted.
Alex Usher, president of the Toronto-based Higher Education Strategy Associates, noted that “academia generally runs on prestige. What rankings have allowed us to do is to quantify our prejudices about prestige in higher education.” But he went on to say that international rankings are more likely to reflect institutional behaviour than to drive it. Universities would be chasing research dollars with or without rankings, although the annual score cards may have added more urgency to their efforts, said Mr. Usher, who is a member of the advisory board of the Shanghai rankings and whose firm has produced its own research ranking of Canadian institutions. On the other hand, he added, rankings do influence students’ decisions, particularly international students choosing where they will study.
Rankings also tend to confer certain benefits on top-ranking institutions, which can use their status to attract international students and donors, said OISE’s Dr. Jones. These schools are more likely to benefit from research partnerships and collaborations with their international counterparts. Rankings can also give universities and students some comparative data, which is otherwise in very short supply.
In Dr. Jones’s view, Canada’s universities generally perform well in global rankings, given how small the country’s higher education system is compared with those of other countries. One thing he’s noted is how Canada’s top performers – the universities of Toronto, British Columbia, McGill and Alberta – differ from top U.S. and British schools. The Canadian schools are research-intensive but also have large, accessible undergraduate programs. He once calculated that U of T, which usually ranks in or near the top 25 in the Shanghai index, enrols more undergraduate students than the top five institutions in that index combined. “They have a very different profile than Oxford or Harvard,” he said.
The main global ranking systems
Shanghai Academic Ranking of World Universities: The first global ranking of universities, launched in 2003 by Shanghai Jiao Tong University to assess China’s standing and chart its progress. It ranks the top 500 universities in the world.
Times Higher Education World University Rankings: Launched in 2004 by Times Higher Education and Quacquarelli Symonds, but produced by THE alone since 2010. It ranks the top 400 and compiles rankings by subject, region, and reputation.
QS World University Rankings: Quacquarelli Symonds, a U.K. consultancy, ranks 700 institutions and also has rankings by subject, region and for younger schools. It offers an audit that schools can pay for; QS then issues a rating of one to five stars.
U21 Ranking of National Higher Education Systems: Launched in 2012 by Universitas 21, a global network of research-intensive universities, it ranks higher education systems in 50 countries, rather than individual institutions.
U-Multirank: An initiative financed by the European Commission that will report in 2014. The online tool lets users specify the type of institution to compare and select the measures to include. More than 650 institutions are participating so far.