South Africa’s education system is in a state of “crisis”, critics say. The World Economic Forum claims that the country is at the bottom of the class for the quality of its maths and science education. And there are concerns about high drop-out rates, with nearly half of all school pupils leaving long before they reach their matric year.
So how do you assess the country’s education system and what data is available?
No single dataset has the answers. Different assessments highlight different problems, and some are far less credible than others. This guide outlines and evaluates the datasets available for measuring the state of South Africa’s education system.
The Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ)
SACMEQ is an international non-profit organisation which undertakes education policy research. It is made up of 15 education ministries from Botswana, Kenya, Lesotho, Malawi, Mauritius, Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania, Zanzibar, Uganda, Zambia, and Zimbabwe.
SACMEQ conducted four major education policy research projects between 1996 and 2013. The most recent data published by SACMEQ suggest South Africa (and a number of other countries) made tremendous improvements in the quality of its primary education between 2007 and 2013. However, education experts have questioned the validity of the 2013 results and the apparent improvement since 2007.
“A big part of the alleged Sacmeq improvements arise from different methodologies employed in 2007 and 2013, making them incomparable until they are properly equated,” wrote education researcher and fellow at the Organisation for Economic Co-operation and Development, Nic Spaull, in September 2016.
The previous research project, conducted between 2006 and 2011, assessed 61,396 Grade 6 pupils and 8,026 Grade 6 teachers in 2,779 schools across 14 countries.
South Africa scored relatively low for both literacy and mathematics. It came tenth for the reading ability of the pupils tested – behind Tanzania, Seychelles, Mauritius, Swaziland, Kenya, Zanzibar, Botswana, Zimbabwe and Namibia. Five countries – Uganda, Mozambique, Lesotho, Malawi and Zambia – fared worse than South Africa.
South Africa came eighth for mathematics behind Mauritius, Kenya, Tanzania, Seychelles, Swaziland, Botswana and Zimbabwe. Zanzibar, Mozambique, Uganda, Lesotho, Namibia, Malawi and Zambia scored lower in mathematics than South Africa.
SACMEQ found no statistically significant improvement in South Africa’s Grade 6 mathematics and language performance between 2000 and 2007.
“SACMEQ and [Trends in International Mathematics and Science Study] stand out as being far better than anything else,” Martin Gustafsson, an economics researcher at Stellenbosch University, told Africa Check. “They are the only standardised measures of improvement over time. They are much better than anything else we have.”
Said Stephanie Allais, a senior researcher at the Centre for Researching Education and Labour at the University of the Witwatersrand: “It is important to benchmark ourselves against other African countries. What’s particularly useful about SACMEQ is that the teachers [also] have to write the tests, which they initially refused to do in South Africa.”
(SACMEQ’s second research project included a test for teachers. Due to teacher union objections, South Africa was one of the few countries that did not complete the section. The next SACMEQ study gave teachers an option to refuse to write the test. Out of 1,488 teachers, 245 refused to do so.)
Trends in International Mathematics and Science Study (TIMSS)
TIMSS is a cross-national study that measures trends in mathematics and science achievement. It has been conducted every four years since 1995, and its results can be compared over time and between countries. It has been described as the “gold standard” when it comes to testing techniques.
In most countries, the TIMSS test is taken by pupils in Grade 8. South Africa, Norway and Botswana are exceptions: in all three countries, the test is taken by Grade 9 pupils because it was found to be too difficult for those in Grade 8.
The 2015 TIMSS report notes that “South Africa has shown the biggest positive change, with an improvement of 90 points in science and 87 points in mathematics. South Africa started with very low performance scores in 2003 and this upward shift translates to an overall performance improvement by approximately two grade levels between 2003 and 2015.”
Progress in International Reading Literacy Study (PIRLS)
Every five years, the Progress in International Reading Literacy Study (PIRLS) is conducted. The study, which was first carried out in 2001, assesses reading comprehension and tracks trends in reading literacy in over 60 countries. South Africa’s first assessment took place in the 2006 round.
As part of the 2016 study, a nationally representative sample of 12,810 Grade 4 pupils in 293 South African schools was assessed. Pupils are tested at this age because it is “an important transition point in their development as readers”. The study notes that “typically, by this time in their schooling, [pupils] have learned how to read and are now reading to learn”.
However, the vast majority of South African pupils (78%) did not meet the lowest literacy benchmark set by PIRLS, compared to 4% internationally. This means that they “cannot read for meaning or retrieve basic information from the text to answer simplistic questions”. There was no significant change in the results from 2011.
A paltry 0.2% of South African pupils attained the advanced benchmark, compared to 10% internationally. At this level, pupils are able to “integrate ideas as well as evidence across a text to appreciate overall themes, understand the author’s stance and interpret significant events”.
South Africa ranked last of the 50 countries that participated in the study.
Annual National Assessments (ANA)
The Annual National Assessments (ANA) were standardised tests introduced by the South African Department of Basic Education in 2011. They tested literacy and numeracy from Grade 1 through to Grade 6, as well as in Grade 9. The assessments were not designed to allow comparisons to be made between years.
There was controversy about the ANA data. For example, Spaull described the changes in the average Grade 1 mathematics score – from 68% in 2012 to 59% in 2013, and then back to 68% in 2014 – as “impossible”.
Allais described the ANA as a fledgling system that offered some insight into the troubles that beset the schooling system.
“We know that our schools are dysfunctional. So what government is doing with the ANA is showing us the problems that need to be addressed. But you don’t fatten a pig by weighing it. We don’t need to know more and more about what’s wrong. You fatten a pig by feeding it. We already know what’s wrong with our system; what we’re not so good at is [knowing] how to fix it,” she explained.
Matric results
The annual matric results often receive the most attention, but they provide limited insight into the state of the education system.
“Matric is telling us some useful things about our education system. It’s telling us that most kids are very weak and achieve very little,” said Allais.
Gustafsson said that the pass rate can be influenced by a number of factors: “The problem with it as a measure of improvement is that in general, the pass rate will decline the more people you push through to Grade 12. Because when you push through more and more students the additional students tend to be poorer students and on average poorer students struggle to a greater degree in the examinations.”
He said that the pass rate could also be influenced by pupils’ subject choices.
The matric exam results also do not take into account the large numbers of school pupils who drop out long before their final year.
For every 100 pupils who started school in 2005, only 56 made it to matric in 2016, 37 passed and 14 qualified to go to university. Between Grade 10 and the final matric exams, nearly 500,000 pupils dropped out of the system.
World Economic Forum Global Competitiveness Report
The World Economic Forum’s education rankings make headlines every year. The Global Competitiveness Report for 2015–2016 placed South Africa’s maths and science education last out of the 140 countries assessed.
But, as a ranking of education systems, the report is meaningless and offers no insight into the problems and challenges facing South Africa’s education system.
The ranking is not based on standardised, cross-national tests, as some of the other assessments are. Rather, it is based on an “Executive Opinion Survey” drawn from interviews with various unidentified “business leaders”.
Spaull has described the WEF ranking as “subjective, unscientific, unreliable and [lacking] any form of technical credibility or cross-national comparability”.
Allais said that the ranking system was incoherent and confused: “Maybe because the ranking comes from the World Economic Forum everyone thinks it is important. But of all the ranking systems it’s the least well thought through. The problem is that they don’t put it out as a perception index. They put it out as a ranking of education systems and it’s not that at all.”
© Copyright Africa Check 2019. Read our republishing guidelines. You may reproduce this piece or content from it for the purpose of reporting and/or discussing news and current events. This is subject to: crediting Africa Check in the byline, keeping all hyperlinks to the sources used and adding this sentence at the end of your publication: “This report was written by Africa Check, a non-partisan fact-checking organisation. View the original piece on their website”, with a link back to this page.