The University of Johannesburg advertises on billboards that it is in the top 4% of the world’s universities, while the University of KwaZulu-Natal boasts on its homepage that it is “ranked in the top 3%”. In the south, the University of Cape Town time and again ranks top in South Africa, and in Africa.
There are at least five different international ranking bodies that evaluate South African institutions, each with their own criteria and formulas.
Ranking systems have been part of the US academic system for nearly a century, but global rankings took off in 2003 when Shanghai Jiao Tong University started the Academic Ranking of World Universities (ARWU), also referred to as the Shanghai Rankings. The initial aim was to benchmark the top-performing universities in China, but the list also included other world universities.
The global interest that followed led to a collaboration in 2004 between the Times Higher Education magazine and the UK consultancy firm Quacquarelli Symonds (QS) to compile the Times Higher Education Supplement (THES) rankings. They split in 2009 to become the Times Higher Education World University Rankings (THE World University Ranking) and the QS World University Rankings.
In academic circles, THE and QS are considered the most influential, Dr Magnus Machale-Gunnarsson, who monitors ranking systems for the University of Gothenburg in Sweden, told Africa Check.
Locally, the Center for World University Rankings (CWUR) based in Saudi Arabia has received a lot of media attention, with the Best Global Universities (BGU), owned by American publisher U.S. News & World Report, being a newcomer.
How are these rankings compiled?
Each ranking body has a different set of indicators. The Shanghai Rankings use six indicators largely focused on research output, such as the number of articles published in the journals Nature and Science, or the number of Nobel Prizes and Fields Medals (in mathematics) won by alumni and staff.
In comparison, THE World University Rankings has 13 indicators grouped into 5 categories focusing on:
- the teaching environment: a reputation survey, staff-to-student ratio, doctorate-to-bachelor ratio, doctorates awarded to academic staff, and the institution’s income;
- research: volume, income and reputation of the outputs;
- international outlook: assessing the ratio of international staff and students, as well as international partnerships with other institutions;
- citations: assessing the number of times published work is cited; and
- industry income: looking at the income generated from commercialising research.
Each of these indicators is then given a certain weighting to create a composite index and the total score is used to rank universities.
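The weighted composite index described above can be sketched as follows. This is an illustrative example only: the indicator names, weights and scores below are hypothetical, not those of any actual ranking body.

```python
# Sketch of a composite ranking index: each university receives a score of
# 0-100 on each indicator, indicators are combined with fixed weights, and
# the weighted totals determine the ranking order.
# All weights and scores below are made up for illustration.

def composite_score(indicators, weights):
    """Weighted sum of indicator scores (each scored 0-100)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(indicators[name] * w for name, w in weights.items())

weights = {
    "teaching": 0.30,          # hypothetical weightings
    "research": 0.30,
    "citations": 0.30,
    "international": 0.075,
    "industry_income": 0.025,
}

universities = {
    "University A": {"teaching": 80, "research": 90, "citations": 85,
                     "international": 60, "industry_income": 70},
    "University B": {"teaching": 85, "research": 70, "citations": 95,
                     "international": 90, "industry_income": 50},
}

ranked = sorted(universities,
                key=lambda u: composite_score(universities[u], weights),
                reverse=True)
print(ranked)
```

Note how sensitive the outcome is to the chosen weights: nudging a single weight up or down can reorder the table, which is the criticism of arbitrariness discussed below.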
It’s a labour-intensive process, and therefore most rankings are compiled by a specialised department of a tertiary institution or commissioned from the commercial wing of a company.
Some of the league tables use third-party information that can be considered objective, such as the number of doctoral graduates obtained from government agencies, which can reasonably be used for international comparisons.
The simplistic way in which the composite indicators are formulated draws criticism that rankings are an inadequate way to measure the quality of tertiary education. Machale-Gunnarsson told Africa Check: “There is no agreement on what ‘university quality’ is. Is it good research or good education? And what is good research? Is it research that has impact in the scientific community, or research that makes a difference for people outside academia?”
Lack of transparency
Considering the significant influence that these indices hold, there is little transparency about why certain indicators were selected and weighted in a certain manner. In a critical review, academic Lee Harvey pointed out that the choice of certain indicators is idiosyncratic (why, for example, does the Shanghai ranking only use Nobel Prizes as a measure of the quality of education and faculty?), while the weightings assigned to many are arbitrary (why, in the THE rankings, does the proportion of international students count 2.5% towards a university’s score, and not 10% or 2%?).
In some cases, it’s not possible to get data for every country, and that limits international comparison. For example, the QS World University Rankings use total faculty members to calculate the faculty-to-student ratio and citations-per-faculty indicators, instead of distinguishing between teaching and research staff.
Some indicators rely on opinion surveys, which could lead to under- or over-scoring because of the experts’ limited knowledge of the institution, or even their own personal bias. In a World Bank conference presentation, higher education policy advisor Ellen Hazelkorn called these surveys “self-perpetuating” because she sees them as benefitting established universities in developed countries.
Are they of benefit to African universities?
Ahmed Essop
(former CEO of South Africa’s Council on Higher Education and now a higher education consultant)
“The reason why South African universities perform well is that they have more resources than other African universities,” Essop told Africa Check. “The University of Cape Town ranks very high because it has a lot of international academics and attracts international exchange students. A lot of their A-rated scientists are not South African.”
On a global scale, Essop’s stance is that South Africa cannot afford to compete in these rankings. “Participating in these ranking systems affects the way you organise operations and allocate limited resources. South Africa may not even want to attract international students as at the moment the country is still grappling with access for local students.”
Essop gave the example of the University of the Witwatersrand which aims to be in the top 100 universities in the world by its 100th birthday in 2022.
“That would mean increasing their research output and postgraduate production,” he explained. “The bigger problem is that half of students who enter the [South African university] system do not leave with a qualification. So it is an issue of deciding to allocate more resources for research versus teaching, and that will impact the output of graduates.”
Professor Sioux McKenna
(Centre for Higher Education Research, Teaching and Learning at Rhodes University)
Some South African universities, such as Rhodes University, made a decision not to participate in university ranking systems. This means they do not provide data to ranking organisations, which then have to make do with what is publicly available.
McKenna told Africa Check that “the fundamental methodology of combining discrete numbers to rank a complex organisation is problematic. It supplants notions of the public good with notions of excellence in quite dangerous ways. We are part of a public sector working together in a context of dire need and significant unevenness in the system. So a process that pits us against each other as competitors cannot be productive.”
Dr Thandi Mgwebi
(executive director of the Research Chairs and Centres of Excellence at the National Research Foundation)
However, even if institutions do not agree with the rankings, they should at the very least be familiar with the systems and the methodology behind each one, Mgwebi told Africa Check.
“As a result of the transformation of higher education globally, massification, the race for excellence, competition for funding, universities are constantly striving to position themselves in these international rankings,” she said. “There is undoubtedly fierce competition for talent and funding amongst universities. In the past, there were no metrics used to determine or back up perceptions about universities. Now, students, teachers, research partners and donors increasingly base their decisions about universities on the data provided in rankings.”
South Africa’s top ranked universities
[Table: South Africa’s top-ranked universities according to URAP (2016-2017), THE World University Rankings (2016-2017), ARWU/Shanghai (2017), CWUR (2016), QS World University Rankings (2018) and BGU]
South Africa’s department of higher education and training allocates research subsidies to institutions based on research publications output. Using this as a criterion, the University of Pretoria was ranked top in 2015, with the University of KwaZulu-Natal second. When universities’ staff count is taken into consideration, the University of Pretoria was still first, followed by Stellenbosch University.
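The distinction above, between ranking on raw publications output and ranking on output relative to staff numbers, can be sketched with hypothetical figures. The university names and numbers below are invented for illustration and are not the department’s actual data; they show how the two criteria can produce different orders.

```python
# Hypothetical figures, for illustration only: ranking by total research
# publications versus publications per staff member can reorder institutions.

universities = {
    "University X": {"publications": 3000, "staff": 2000},
    "University Y": {"publications": 2500, "staff": 1200},
}

# Rank by total publications (raw output).
by_total = sorted(universities,
                  key=lambda u: universities[u]["publications"],
                  reverse=True)

# Rank by publications per staff member (per-capita output).
by_per_capita = sorted(universities,
                       key=lambda u: universities[u]["publications"]
                                     / universities[u]["staff"],
                       reverse=True)

print(by_total)       # X leads on raw output
print(by_per_capita)  # Y leads once staff size is taken into account
```

In the 2015 figures cited above, the University of Pretoria happened to lead on both criteria, but the two measures need not agree in general, which is why the choice of criterion matters.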
[Factsheet first published: 28/01/2016. Last updated: 11/08/2017]