
International rankings don’t measure what matters

International rankings of universities are big business and big news. These systems order universities on the basis of a variety of criteria, such as student-to-staff ratio, income from industry, and reputation as captured through public surveys.

Universities around the world use their rankings as marketing material, and parents and prospective students make life choices on the basis of them.

But the methodology underpinning the Quacquarelli Symonds and Times Higher Education ranking systems, and others like them, would be unlikely to pass as a third-year student’s research project. And yet high-status universities around the world spend time and money competing in this extravaganza rather than pointing out that the emperor is wearing no clothes.

Why would they when the rankings reinforce their position as institutions of choice for those who can afford their fees?

As a researcher of higher education, I find it worrying that we’re held captive by these glitzy spectacles.

Imagine if a student indicated that their research project would be to develop a ranking of all universities. They would allocate 20% to whether current students and the general public thought the university was prestigious, 5% for the number of Nobel Prize winners on the institution’s staff, 30% for the number of research publications, and so on. Any academic advisor would throw the proposal out.

Some of these criteria are subjective. The weightings are arbitrary, important aspects of many universities are missing, and the averaging of unrelated aspects into a single final number is simply poor science that tells us very little about the institution at all.

And yet this is exactly how rankings are determined.

The ranking method doesn’t add up

The methods behind the international university ranking systems vary but the underpinning methodology is identical. Convert proxy measures of a few academic activities into numeric metrics, add these together and come up with a ranking of institutions.

The criteria may be entirely unrelated to each other or may be poor proxies of the academic activity being measured. Reputation surveys and student throughput, for example, probably tell you more about how wealthy, and therefore selective, a university is than anything about the quality of its teaching.

Furthermore, the weighting of each criterion is almost entirely arbitrary. If publications and citations are worth 20% and web visibility is worth 10%, you will get one order of institutions. Change this to 25% and 5% and the entire list rearranges itself.
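The sensitivity to arbitrary weightings is easy to demonstrate. The following sketch uses three invented institutions with made-up, normalised scores (nothing here reflects any real ranking system’s data); it simply shows that shifting the weights between two criteria, with no change to the underlying scores, reorders the list.

```python
# Hypothetical illustration: a weighted-sum "ranking" of three invented
# universities. All names and scores are fabricated for demonstration only.

def rank(scores, weights):
    """Return institution names ordered by weighted-sum total, highest first."""
    totals = {
        name: sum(weights[criterion] * value for criterion, value in s.items())
        for name, s in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Normalised scores (0-1) on two criteria for three fictional institutions.
scores = {
    "Alpha U": {"publications": 0.9, "web_visibility": 0.2},
    "Beta U":  {"publications": 0.5, "web_visibility": 0.9},
    "Gamma U": {"publications": 0.7, "web_visibility": 0.7},
}

# Weighting A: publications worth 20%, web visibility worth 10%.
order_a = rank(scores, {"publications": 0.20, "web_visibility": 0.10})

# Weighting B: publications worth 25%, web visibility worth 5%.
order_b = rank(scores, {"publications": 0.25, "web_visibility": 0.05})

print(order_a)  # ['Gamma U', 'Alpha U', 'Beta U']
print(order_b)  # ['Alpha U', 'Gamma U', 'Beta U']
```

Nothing about the institutions changed between the two lists; only the weights did. There is no principled way to say which weighting, and therefore which ordering, is the “right” one.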

Many of the criteria could be seen as descriptive rather than evaluative. High pass rates and student-to-staff ratios are separate criteria in many systems, yet most academics would argue that they measure related issues.

Privilege begets privilege too, so universities that are the most prestigious and charge the highest fees can be the most selective in student enrolment and staff recruitment. Because higher education is not a meritocracy but largely reinforces social stratifications, these universities will then celebrate their inevitable success in getting students to complete their studies and graduate.

Many rankings focus on reputational surveys, which ask employers, graduates, and the general public to indicate which institution is the best. But this becomes circular: a strong reputation leads to a strong reputation. This benefits well-known universities and neither reflects nor benefits teaching and learning, research, community engagement, or any other academic activity.

When less wealthy universities attempt to compete, the cost can be a drain on resources that could be spent on activities more attuned to their context.

Many universities in South Africa are now chasing these rankings, even while admitting that they’re problematic. But participation is neither innocent nor harmless.

It’s not innocent because universities know the average citizen believes that these rankings say something about quality. This then influences choices as to where to study and work and whom to employ. Instead of using their research skills and academic integrity to pull these games apart, they expend a great deal of energy trying to improve their position.

Increasing research output in the global South is essential. If the rankings drive this process, well and good. But it’s harmful if the focus on publications and postgraduates comes at the expense of important factors that aren’t used in rankings.

What counts

Ranking systems don’t concern themselves with whether the university takes its community engagement responsibilities seriously. They don’t consider who students are and where they want to go. Ranking systems care only for the market – and higher education is a very large market.

Academics can and should speak out against these neocolonial processes that position all universities as striving to be identical and competing for market share.

But it’s difficult for universities to opt out of these games. The university where I work refuses to engage with ranking organisations. And yet it’s still included in these systems, as they draw on publicly available data.

Despite the university having among the highest undergraduate success rates and publication rates in South Africa, its lack of medicine and engineering programmes works against it. So too does its strong focus on community engagement and its small size, though these might be exactly why the university is a good fit for many.

The dodgy methodology is stacked against the institution. But, far more problematically, it is stacked against most of the purposes set for higher education in South Africa’s White Paper of 1997.

Nowhere do these systems concern themselves with transformation, social justice or the public good.

Sioux McKenna, Director of Centre for Postgraduate Studies, Rhodes University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


