
What a new ranking of SA universities can(not) tell you

The latest iteration of the University Ranking by Academic Performance (URAP) has been released and the South African media is already reporting on who tops the academic charts. (Answer: The University of Cape Town.)

The ranking system was developed by Middle East Technical University’s Informatics Institute, which has released yearly rankings of 2,000 universities since 2010.

But how useful is URAP in helping the public gauge the quality of South Africa’s higher education institutions?

Ranking based on articles published

As we previously noted in a factsheet on South African university rankings, these systems are fraught with limitations, including the inherent difficulty of capturing something as complex as quality. The criteria, weighting and input used are often arbitrary and not transparent.

The URAP website explains that “publications constitute the basis of the ranking methodology”. The institute looks at things like the number of articles a university publishes in international scientific journals and how often these articles are cited by other academics.

“The assumption is that an article that receives many citations is of higher quality than an article that receives few citations,” Dr Magnus Machale-Gunnarsson, an analyst who works in the Analysis and Teacher Education unit of the University of Gothenburg in Sweden, told Africa Check.

URAP considers six indicators:

Article: the number of articles published and indexed by Web of Science.
Citation: the total number of citations received for published articles.
Total Document: the total document count, covering all scholarly literature.
Article Impact Total: a formula devised to measure a university’s “scientific productivity”.
Citation Impact Total (CIT): a formula devised to measure a university’s “research impact”.
International Collaboration: the number of publications produced in collaboration with foreign universities.

Each university gets a score out of 600, with the “article” and “citation” indicators accounting for 21% each and the “article impact total” accounting for 18%. Citation impact total and international collaboration both count 15% towards the score, while the total document count makes up the last 10%.
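To see how these weightings combine, the composite score can be sketched as a simple weighted sum. The indicator values below are entirely hypothetical (normalised to a 0–1 scale for illustration); URAP’s actual normalisation method is not described here.

```python
# Illustrative sketch of URAP's weighted composite score out of 600.
# Weights are taken from the article; indicator values are hypothetical.

WEIGHTS = {
    "article": 0.21,
    "citation": 0.21,
    "article_impact_total": 0.18,
    "citation_impact_total": 0.15,
    "international_collaboration": 0.15,
    "total_document": 0.10,
}

MAX_SCORE = 600


def urap_score(indicators: dict) -> float:
    """Weighted sum of normalised (0-1) indicator values, scaled to 600."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights total 100%
    return MAX_SCORE * sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)


# A hypothetical university: strong publication output, weaker
# international collaboration.
example = {
    "article": 0.80,
    "citation": 0.75,
    "article_impact_total": 0.70,
    "citation_impact_total": 0.65,
    "international_collaboration": 0.40,
    "total_document": 0.90,
}

print(round(urap_score(example), 1))  # → 419.4
```

Because the two publication-count indicators alone carry 42% of the weight, a university can rank highly on sheer research volume even if other aspects of its quality are weaker.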

As URAP only looks at publication-related measures, it can only tell us about one aspect of an institution’s quality.

Furthermore, the bibliometric databases used don’t cover all research fields equally well. “It works well for the natural sciences and medicine, but not very well for the humanities and social sciences,” Machale-Gunnarsson said.

'Only measures what can easily be measured'

“It is very difficult to measure university quality, and every ranking producer measures only things that can be measured easily,” Machale-Gunnarsson told Africa Check. “For example, they might want to measure how much students get to meet their teachers… and talk about the course contents… this would be a very interesting thing to measure, but there are no such data available.”

Users therefore need to question the individual indicators used and ask whether what they measure matters to them, Machale-Gunnarsson added. “For example, URAP measures only international research. Is that what you are interested in?” - Gopolang Makou (11/08/2017)


Additional reading:

FACTSHEET: How South African universities fare in global rankings (and does it matter?)

Mail and Guardian: University rankings a flawed tool
