Why social network surveys don’t necessarily reflect the views of SA youth


In recent months a South African marketing research company has peppered local media outlets with headline-grabbing releases detailing the findings of its “cellphone” surveys of South African youth. The surveys are not truly representative of youth opinion across the country, yet much of the media coverage has been unquestioning.

Do 38% of South Africa’s youth believe that the policies recently announced by Julius Malema’s Economic Freedom Fighters will win the party support? Would 26% of the country’s youth vote for the disgraced former ANC Youth League president’s new party? Do 42% of young people believe that foreigners have a negative impact on South Africa? And do as many as 52% of young black South Africans truly believe the Democratic Alliance will bring back apartheid if they win an election?

These are some of the startling claims made by the “consumer insights company”, Pondering Panda, which boasts that its use of “mobile technology” to conduct youth surveys allows it to “deliver much larger samples, faster and more cost-effectively than traditional research suppliers”.

Courting controversy

Over the past six months, the company has peppered the South African media with headline-grabbing press releases announcing its latest findings. It has notched up an impressive hit rate. Its website lists close to 200 news articles published between late January and early August 2013 that quote from, or are based on, its surveys. A sizable number regurgitate the contents of the company’s press releases.

As a result, Pondering Panda has garnered hundreds of thousands of rands – possibly millions – in free publicity and has secured some of South Africa’s best-known brands as its clients. The company’s CEO, Butch Rice, has also made a name for himself with his highly critical and controversial comments about traditional research and polling methods and those who use them.

The research industry in South Africa is a “Jurassic Park”, he was quoted as saying at a conference in May this year. “The traditional research industry still clings stubbornly to invalid analytical techniques, inefficient research designs, and cumbersome and expensive reporting, which takes an inordinately long time to deliver.”

The tagline on Pondering Panda’s website promises: “A new wave of research. Finally.” The company boasts that its surveys have “faster turnaround times” with “typically 2,000 responses in a 24-hour period”.

But do these surveys accurately reflect the views of South Africa’s youth? Are they reliable? And are the media being critical enough when electing to publish their findings?

Opting-in to social media

Pondering Panda’s press releases contain an important caveat. “All interviews were carried out on cellphones … across South Africa,” they state, and the responses are “weighted” to be nationally representative of age, gender and race. Findings are generally attributed to “South African youth”.

What the press releases do not say is that all their research is done using an “opt-in system” that surveys users of the MXit social media platform. (It is an odd omission, given that the Pondering Panda website states that it has “exclusive access to MXit”, which it describes as “Africa’s biggest social network”.)

As an example, we decided to look a little more closely at the Pondering Panda survey which found that the “majority” of young blacks in South Africa believed the Democratic Alliance (DA) would bring back apartheid if they came to power.

The survey gave the company headlines, and lots of free publicity, after its release in April this year. Reporters who wrote about it made little or no attempt to seek further comment or to interrogate and contextualise the findings. Entire paragraphs were lifted from the Pondering Panda press release or attributed to its spokesperson.

Pondering Panda states that “full results are available on request”. When Africa Check requested the results of the DA survey, we were sent a spreadsheet document. It indicated that the survey was conducted over six days and that there were 3,009 respondents aged between 15 and 34. It did not give a detailed explanation of how the data was collected and weighted. The only mention of MXit in the document was the statement: “All data gathered over MXit social network”.

The devil in the detail

How are the surveys conducted? Rice told Africa Check: “Data is collected by an opt-in option on the MXit base, which has 6.5 million active users on a monthly basis. The sample sizes are significantly larger than those of traditional research suppliers, often running into tens of thousands. Data from the Census is used to weight or balance the sample, in line with standard research practice by all companies,” he said, adding that “the key to sample representivity is dispersion”.
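Rice says Census data is used to weight, or balance, the sample, “in line with standard research practice”. As a rough sketch of what such post-stratification weighting involves (the shares below are invented for illustration only; they are not Pondering Panda’s or Census figures):

```python
# A minimal sketch of census-based post-stratification weighting,
# the standard practice Rice refers to. All numbers are invented
# for illustration; they are not Pondering Panda's or Census data.

# Share of each demographic group in the raw survey sample
sample_shares = {"female": 0.35, "male": 0.65}

# Share of each group in the target population, taken from the census
census_shares = {"female": 0.51, "male": 0.49}

# A respondent's weight is the population share divided by the sample
# share for their group, so under-represented groups count for more.
weights = {g: census_shares[g] / sample_shares[g] for g in sample_shares}

def weighted_agreement(responses):
    """responses: list of (group, answer) pairs; answer 1 = agree, 0 = not."""
    total = sum(weights[group] for group, _ in responses)
    agree = sum(weights[group] for group, answer in responses if answer)
    return agree / total
```

Weighting of this kind can correct a demographic imbalance in the sample, but it cannot correct self-selection: if the opt-in MXit users in a given age, gender and race cell hold different views from non-users in the same cell, the weighted estimate inherits that bias.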

Herein lies a problem: MXit, like many social media tools, does not require people to use their real names when signing up. Many don’t.

Rice claimed Pondering Panda had checks and balances in place to ensure the platform yielded “valid and reliable data”, including checking a respondent’s identity against details they supplied when signing up to MXit and cross-comparing it to information supplied when they complete surveys.

How effective these checks and balances are remains unclear. An Africa Check researcher signed up for MXit under a false name, provided a false date of birth, downloaded the Pondering Panda app and completed a number of surveys. One of them asked for details on race and geographic location. The Africa Check researcher claimed to be a black male, aged 18 to 24, from the Northern Cape. (The age correlated with the date of birth given on signing up.) The researcher is, in fact, a 38-year-old white male from Gauteng.

A further problem is that there is some confusion about the number of active MXit users in South Africa. Pondering Panda claims on its website that MXit has 6.5-million active users in the country. But statistics published by MXit in October 2012 put the figures at almost 50-million registered users and 9,346,806 active users. Yet six months later, in March 2013, Vincent Maher, vice president of growth and product strategy at MXit, said the “latest active user statistics” for South Africa were 6.3-million per month.

Questions and answers

Another key factor to consider is the way the survey question was formulated: “Some people think the DA will bring back apartheid if they won an election. Do you think this is true?”

The question is problematic because different people will interpret it differently, said Derek Luyt, a researcher, analyst and director of the Centre for Accountable Governance.

For instance, some respondents may think they are being asked whether it is true that “some people” think the DA will bring back apartheid, not whether they themselves hold that view. Respondents could also have very different ideas about what is meant by the word “apartheid”. Some of them were probably born after apartheid was scrapped, or were too young to remember exactly what it was like.

“It’s not quite as bad as the old, ‘Do you prefer apartheid or socialism?’ question, but it’s not a question that is particularly meaningful either,” Luyt told Africa Check.

“Unquestioning” journalists

Lauren Shapiro, a director of Futurefact – one of South Africa’s longest-running research and survey companies – said social media research, like that conducted by Pondering Panda, certainly has a place. But she cautioned that it was always necessary to reveal the technical details of how such surveys were conducted.

Rice, a co-founder of Research Surveys, now TNS South Africa, lectured in statistics and marketing research at the University of Cape Town for 10 years. He disparages “just about all” other research methods. Despite questions about the accuracy of data collection on MXit, compared to more conventional door-to-door surveys where people are interviewed face-to-face, he told a marketing conference: “The future of market research is digital, and the digital future is mobile. It is an indictment of market research in South Africa that companies are still investing in door-to-door interviews and focus groups”.

Luyt said he felt that “a large part of the problem is the unquestioning attitude of the media”. “Instead of dissing others, Pondering Panda would probably do the media more of a favour by trying to enlighten journalists about research methods rather than trying to dazzle them with sexy headlines.

“There is definitely a place for the type of surveys Pondering Panda does, but it is also important to accept the limitations and not to claim to do more than you actually do. You cannot compare it to the work of researchers who do deeper, longer-term research and try to understand things in greater depth and detail.”

Media consultant Gordon Muller believes that research using social media platforms is becoming increasingly important for the quick insights they give.

“To dismiss all other kinds of research as irrelevant and wasteful is wrong… They all have a place. From a macro point-of-view, using a dipstick approach is where social media comes in. But I would urge caution to any researcher who says they are talking on behalf of all youth in South Africa.

“My real issue is: why are trained journalists just picking it up and running with it, without examining it critically and asking questions?”

Conclusion – Surveys do not represent views of SA youth

Pondering Panda cannot claim that their findings are representative of “South African youth” or – in the case of the DA survey – of the “majority of young blacks”. In the latter case, at most they can claim that 52% of the MXit users who installed their app, took the time to respond to the survey and who said they were young and black believed that either (1) the DA would bring back apartheid or (2) that “some people” believed the DA would bring back apartheid. It is not quite the same thing as the “majority of young black South Africans”.

Pondering Panda’s suggestion that it conducts cellphone “interviews” is also misleading. There are no interviews in the true sense of the word. Respondents who have downloaded the Pondering Panda app choose which surveys to fill out and tick off the answers to questions.

The real blame, however, lies with the media outlets that have unquestioningly swallowed Pondering Panda’s headline-grabbing press releases. There is probably a host of explanations: budget cutbacks, overworked and inexperienced reporters, and an insatiable appetite for cheap, cut-and-paste web content.

The reporting on Pondering Panda, for the most part, has not been journalism. Sadly, far too many of the articles that Pondering Panda so proudly link to on their website are classic examples of “churnalism”. And the reporters have become little more than public relations tools, unquestioningly and unthinkingly churning out copy that has a gimmicky appeal but very little merit.

Edited by Julian Rademeyer

© Copyright Africa Check 2020. Read our republishing guidelines. You may reproduce this piece or content from it for the purpose of reporting and/or discussing news and current events. This is subject to: Crediting Africa Check in the byline, keeping all hyperlinks to the sources used and adding this sentence at the end of your publication: “This report was written by Africa Check, a non-partisan fact-checking organisation. View the original piece on their website", with a link back to this page.

Comment on this report

Comments 12
  1. By Shaun

    I weep for the state of my profession. It is poor journalism such as the examples quoted above that give ammunition to those who would curtail press freedom.

  2. By Kenny

    Thanks for the article. I had a major problem with how the media just latched onto the surveys without questioning. It’s time some of our journos do more than just regurgitate press statements.

  3. By Africa Check

    Fair point, Megan 🙂 It is a clumsy word, although one that is quite commonly used. We’ve updated the story and replaced “juniorised” with “inexperienced”.

  4. By Question Method

    Unfortunately, this analysis also suffers from some weaknesses. Take the conclusion: “Surveys do not represent the views of SA youth.” The article does not make that point. Rather it makes the point that surveys “might not” represent the views or that they are “not statistically valid” in representing the views of the youth. I actually came out from the article with a higher respect for Pondering Panda, who provided all of their data and are clear on their methodology. It absolutely has its weaknesses, but the blame for not understanding those lies almost entirely with unquestioning media.

  5. By Alta

    What is the distribution of the age of MXit users? This is more useful than lower and upper limits. I discussed this article with some students today and according to them MXit is for high school kids. If this is true it further adds to the questionability of surveys which mainly focus on MXit for sampling. What is the definition of ‘the youth’ then?
    Ironically Butch Rice addresses similar assumptions about a survey on the South African middle class on his blog.

  6. By jerry

    About Pondering Panda: you also sound like the DA’s defenders rather than objectively analysing all the questions in their survey. Your intention was not to defend the truth but to promote the DA.

  7. By Africa Check

    Hardly, Jerry. We chose the DA survey as an example because it was one of Pondering Panda’s most widely-publicised surveys.

    Your accusation of bias towards the DA is also unfounded. Perhaps you should read some of our other reports. Like this one about Helen Zille’s claim that no-one in Cape Town has to use bucket toilets. We found her claim to be wrong: https://africacheck.org/reports/claim-that-no-one-in-cape-town-has-to-use-bucket-toilets-is-wrong/

  8. By Sean

    I definitely agree with this article. I believe that Pondering Panda should explicitly state their unweighted and weighted results and state how they have weighted the data precisely (race, age, gender, geographical areas, province etc…). Furthermore, they should generalise to “South African youth who use cellphones and MXit” and not “South African youth as a whole”. Then again it doesn’t sound as exciting and credible.

    If the organisation really is concerned about their data and presenting credible findings, they should have open ended questions in their surveys and analyse the qualitative data as well. However, this will be very expensive and time consuming and will eat into their profit margins, which in turn means it won’t happen.

  9. By Pondering Panda

    We are slightly bemused that we have been targeted by Africa Check of late, particularly when the practices of traditional research companies are so much more obviously flawed, and thus deserving of attention in terms of validity. But, let’s put that aside, and debate whether the criticism of our findings is justified.

    The acid test of any research survey or method is simple – the validity of the findings. How well do the findings mirror the “truth”, whatever the truth might be? Anything else about the research design is almost irrelevant by comparison. It is the search for the truth that counts: how closely can research show what’s really going on out there? The problem is, it is sometimes difficult to calibrate the findings of a survey against the truth. For example, how could we ever find out the true number of people who believe that the DA would bring back apartheid, if they were to take over from the ANC? It’s simply not possible. The true figure will continue to elude us, and can never be established beyond any reasonable doubt. We have to rely on survey findings in cases such as these, and our faith in the findings will depend on the faith we have in the research methodology.

    Fortunately, there are ways of evaluating survey methods, and Pondering Panda welcomes challenges relating to data validity. All we have to do is contrast survey findings with facts we know to be true, or as close to the truth as possible, based on statistics in the public domain that are considered accurate. The support for a political party in an election is one such “fact”, assuming the election is not rigged. Or how many people actually registered to vote, and so on. By asking these types of questions in a survey, the validity of the survey method is subjected to an acid test. It either correlates with reality, or it doesn’t.

    Before launching Pondering Panda to clients, we spent eight months running rigorous experiments on our data, reassuring ourselves that our survey method would yield valid and reliable data. Only after all tests had been successfully passed did we launch the company to the outside world. Predictably, our new clients subjected our data to their own internal checks. Did our surveys yield the same findings as those of their other research suppliers, in comparable surveys? Happily, we have yet to fail a test of this nature, which is evidenced by our growing list of blue chip clients. However, those findings are confidential, and hard facts are needed to address an appropriately cynical audience. So, let’s take a look at three examples of our validations in the public domain (available on our website at http://www.ponderingpanda.com/validations):

    The correlation of home language as reported by our respondents, with Census.
    Claimed registration for voting in the last election, compared to the official IEC registration data.
    Claimed support for political parties in the last election, compared to actual voting results.

    Take a look, and you’ll see that the correlation between Pondering Panda’s survey results and reality is striking. So before criticising our data, we believe an appropriate and more accurate approach would be for Africa Check to take a look at traditional research providers, and see how their validations match up. We challenge any traditional research supplier to match the accuracy of our findings.

    There are many technical points raised by Ray Joseph in his piece, and going into each of them individually will probably only bore your readers. We will willingly provide a response on any technical points raised by Ray in his piece to anyone who requests it. However, we would like to address his criticism of the opt-in method we use. In this, our interviews, gathered via mobile phones, are compared to the seemingly more representative samples gathered by traditional research suppliers, whose respondents are approached by field workers, and asked to be interviewed. Much better than opt-in, you might say. Until you discover that refusal rates to participate in surveys can routinely be more than 90%, and less than 10% “opt-in” to the traditional approach. So the real question is just how representative are those supposedly representative samples? Would you agree to be interviewed for a 27 page Afrobarometer questionnaire if they knocked on your door tonight? (Because that is how long their questionnaire is, according to their website.) Those “representative” samples are very much opt-in, with a big question mark over what type of person would willingly give up an hour or more of their time for a boring and repetitive interview. And just how valid do you think the responses are, by the time the luckless respondent gets to the 27th page?

    In the global world of research, it is widely accepted that the future of research is in the mobile space. Short interviews with lots of people trump long interviews with few people. Without going into detail, it has been categorically proven in the fields of behavioural economics and neuroscience that shorter surveys provide more valid data. And mobile surveys recruit more dispersed respondents than traditional sample designs, thus covering more views in a heterogeneous population. Because of the dramatically lower demands on respondents in terms of interview length, far more people are prepared to give their views, as it typically takes less than a couple of minutes to respond. The result? Higher quality data, with significantly larger samples.

    Is Pondering Panda perfect? Absolutely not. Are we better than traditional research? Emphatically yes.

    If we are to be criticised for our methodologies, we suggest that we are calibrated against research findings from traditional suppliers. Put us head to head. Compare the quality of our data on specific issues. When we differ, then decide who is closer to the truth. And let’s see which traditional research supplier will accept our challenge of replicating our surveys and validating them against actual behaviour. That is the acid test.

