In a speech delivered at the International African Vaccinology Conference in Cape Town on 8 November, the head of the South African branch of the health NGO Doctors Without Borders (MSF) questioned the accuracy of the Department of Health’s figures for childhood vaccinations.
“Recent government data diverges substantially from the numbers published by the WHO and UNICEF,” said MSF General Director for South Africa Daniel Berman.
“The accuracy of South Africa’s data is in serious question,” he declared.
Department of Health claims 96% coverage
The data he was referring to was presented at a conference in Hermanus in mid-October by Johann van den Heever, the head of the Department of Health’s Expanded Programme on Immunisation (EPI).
EPI-SA is South Africa’s arm of the Expanded Programme on Immunisation, which the World Health Organisation launched to make essential vaccinations available to all children worldwide; the South African programme, now run by the government, was established in 1995. Today it covers every childhood immunisation, from the polio vaccination delivered soon after birth through to the tetanus and diphtheria boosters provided at the age of 12.
In his presentation Van den Heever said that, according to official data, 96% of South African children had received the vaccinations they were supposed to at their age. By contrast, the World Health Organisation (WHO) and the UN children’s agency UNICEF estimated coverage of just 64%, he said.
Officials question their own data
Acknowledging what he called the “huge discrepancies” between the official EPI figures and those published by the WHO and UNICEF, Van den Heever said the government programme “disputes the WHO/UNICEF coverage figures”.
But it also “acknowledges that there are data quality issues” with the information system used by the department to gather data on vaccination coverage.
The official data was often “poor and unreliable”, he admitted. Among other problems, some districts “use old data collection tools that are not updated”. In others, there is “no consistency in the data tools used in the same district”. And often “data is not verified and monitored by supervisors”.
Pressure from politicians across Africa
According to the MSF director Berman, health officials all over Africa are under pressure to declare stronger results than the data supports. Those in charge of health programmes are “under pressure from national politicians and international funders to report better and better results. Let’s be honest, coverage rates in many countries are exaggerated,” Berman said.
And turning to South Africa, he added: “If we’re honest, the numbers we have on South Africa’s vaccine programme are worrying.”
“Based on data analysed by the World Health Organisation and UNICEF, South Africa’s coverage data are significantly worse than data from countries with lower Gross National Income and less developed national health care systems,” he went on.
For example, the WHO/UNICEF data showed that in South Africa in 2011, only 72 percent of children under one year of age had received the three doses of the combined vaccine against diphtheria, whooping cough and tetanus, he said. By contrast, 86 percent of children in Angola had done so, and 87 percent in Malawi, the MSF director added.
Is the WHO/UNICEF data more reliable?
According to Candy Day, a specialist in monitoring and evaluation at the non-profit organisation Health Systems Trust, there are two standard ways of measuring vaccination coverage.
One is to carry out a survey and establish the proportion of children that have been immunised, based either on interviews with each child’s parent or care-giver or on reviewing the “road to health” cards that each child is supposed to carry as a record of vaccinations received.
This is the method used by the WHO and UNICEF and is considered the more reliable of the two, but it is prone to under-counting the number of vaccinations carried out, as people forget or fail to record their children’s vaccinations properly.
Moreover, “there have been technical problems with recent surveys and other surveys are old and based on a small sample of children,” Day told Africa Check.
The other method, used by the department of health, is to gather data routinely at district health facilities, log all the immunisations and divide the number by the estimated child population in the district or region.
This method is prone to “over-counting”, particularly if the population estimates used as the denominator are too low, which is thought to be the case in South Africa.
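The arithmetic behind this second, “routine” method is simple division, and the sketch below shows how sensitive it is to the population denominator. All figures here are hypothetical, chosen only to illustrate how a too-low population estimate can inflate a coverage rate; they are not actual South African data.

```python
# Illustrative sketch of the administrative ("routine") coverage method:
# recorded immunisations divided by the estimated child population.
# All numbers are hypothetical, for illustration only.

def administrative_coverage(immunisations_recorded: int,
                            estimated_child_population: int) -> float:
    """Return coverage as a percentage of the estimated child population."""
    return 100.0 * immunisations_recorded / estimated_child_population

immunised = 960_000          # immunisations logged at district health facilities

low_estimate = 1_000_000     # a too-low estimate of the child population
higher_estimate = 1_500_000  # a larger population estimate

# The same count of immunisations yields very different coverage rates
# depending on which denominator is used:
print(administrative_coverage(immunised, low_estimate))     # 96.0
print(administrative_coverage(immunised, higher_estimate))  # 64.0
```

With the same numerator, shrinking the denominator by a third pushes the apparent coverage from 64% to 96%, which is why disputes over population estimates matter so much for routine coverage figures.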
As Wits University statistical expert Alex van den Heever told Africa Check, estimates on immunisation rates are “only as good as the underlying data rather than any formula”.
And the data used by the WHO/UNICEF for their figures, even if more reliable than official data, was still found to be inadequate by the WHO itself.
As set out in this WHO document, revising the estimates for South Africa in 2011: “The WHO and UNICEF estimates of national immunisation coverage are based on data and information that are of varying, and in some instances, unknown quality.”
Conclusion: flaws in the data undermine all claims
“Accurate and valid data is critically important,” MSF General Director Daniel Berman said in his speech on 8 November, and nobody would disagree. Earlier, EPI-SA manager Johann van den Heever called for “drastically” improved data collection at all levels.
But for the moment, while the official figures state that 96 percent of children in South Africa are fully immunised, and WHO/UNICEF figures warn that just 64 percent receive all the vaccinations they should at their age, the figures used by both the government and the WHO/UNICEF cannot be relied on as accurate.
As the Wits statistical expert Alex van den Heever told Africa Check: “No verifiable claim can be made about immunisation rates in SA. Essentially no-one knows.”
Edited by Peter Cunliffe-Jones. Additional research by Ruth Becker.
© Copyright Africa Check 2017. You may reproduce this piece or content from it for the purpose of reporting and/or discussing news and current events. This is subject to: Crediting Africa Check in the byline, keeping all hyperlinks to the sources used and adding this sentence at the end of your publication: “This report was written by Africa Check, a non-partisan fact-checking organisation. View the original piece on their website", with a link back to this page.