BLOG: Africa Check joins Facebook’s fact-checking programme

By Anim van Wyk

When false news spreads on Facebook, people and even countries can face real harm. The network’s new fact-checking programme now has Africa Check to help stop the flow of fake and damaging content.

Facebook’s point of no return came on 15 April 2013.

After the Boston Marathon bombing, photos of runner Jeff Bauman, his lower legs blown off, flooded the social network.

To delete or not to delete? Facebook’s rulebook at the time was clear, as New York Public Radio’s Radiolab podcast tells it: “no insides on the outside” could be shown.

If Facebook wanted to continue being seen as a neutral tech company, the small team should have stuck to its rules and removed the graphic image.

Instead, an executive under Facebook CEO Mark Zuckerberg “sent down an order” – make an exception.

And so, by judging what is newsworthy, Facebook became a media company that day.

A project to reduce the spread of false news

It’s the morning after the party that celebrated the internet as the great educator and democracy-enabler. As dawn breaks, we can start surveying the mess left by two uninvited guests: misinformation, and disinformation.

In Facebook’s corner lies unfettered hate speech against the Rohingya minority in Myanmar and fuel for Philippine President Rodrigo Duterte’s extrajudicial war on drugs. Scattered around are breaches of personal information and cross-border meddling in elections.

Facebook has long shirked the duty of publisher, but now says it has “a responsibility to fix the issues” on its platform.

Its campaign against false news includes removing fake accounts, giving people more context about what they’re reading and reducing the spread of false news.

The last part is where our community, the International Fact-Checking Network, comes in. And from today, Africa Check will be joining 27 partners in 17 countries in the effort.

Fact-checkers don’t remove content

As part of its third-party fact-checking programme, Facebook allows its partners to see public articles, pictures or videos that Facebook’s machines, or regular users, have flagged as potentially inaccurate. (Here’s how to report something you suspect is false.)

Fact-checkers evaluate the content’s primary claim and give it one of eight ratings, such as “false”, “mixture” or “true”.

If the primary claim is found to be inaccurate, Facebook reduces the content’s distribution on the network. When it does show up in someone’s news feed, related articles by fact-checkers are appended to it. People who have previously shared the content will also be notified of the additional reporting.

“We believe that downranking misleading content strikes the right balance between encouraging free expression and promoting a safe and authentic community,” says Tessa Lyons, a Facebook product manager focusing on the integrity of information on the news feed.

But it’s important to note the limits of the campaign.

Content isn’t removed. If the inaccurate content was posted by a page you follow, or your significant other commented on it, you’ll likely still see it in your news feed. But you probably won’t see it if a former colleague “likes” it.

The rating can be overturned. If it’s a genuine mistake, the person or publisher can correct the content and have the strike – as Facebook calls it – removed.

Private content, satire and opinion are off-limits. In these cases, fact-checkers rate the flagged content as such.

And yes, Facebook pays fact-checking organisations for the work we do. It’s only fair that one of the biggest companies in the world compensates the small teams that help it clean up.

Focus on content that can harm

Facebook’s fact-checking programme isn’t without problems. There are only so many fact-checkers, and millions of times more pieces of false content. The social network has also been slow, or absent, in helping to protect fact-checkers in Brazil and the Philippines who have faced relentless harassment.

Then there are also accusations of bias, particularly the view through the “partisan goggles” of US politics. It doesn’t help that Facebook has yet to flesh out guidelines on the types of inaccurate information it wants its fact-checking partners to prioritise.

As Africa Check starts fact-checking for Facebook, we’ll focus on bogus health cures, false crime rumours and things like pyramid schemes – the kind of content that can lead to poor decisions and physical harm.

(I can’t help but cringe when Facebook speaks of preventing “bad actors” from creating “bad experiences” for users. As though being lynched by a mob is akin to a second-rate night out.)

Is the fact-checking programme a perfect and permanent solution? Definitely not.

But the damage has been done and it’s time to put on the cleaning gloves.

I look forward to hearing from you at [email protected].

Anim van Wyk is chief editor of Africa Check, a nonpartisan and independent fact-checking organisation.

CORRECTION: A previous version of this blog post stated that Facebook “only allows third party fact-checking from countries with more than one verified signatory to the International Fact-Checking Network”, which is not the case in Canada, Turkey, Pakistan and India.

© Copyright Africa Check 2020. Read our republishing guidelines. You may reproduce this piece or content from it for the purpose of reporting and/or discussing news and current events. This is subject to: Crediting Africa Check in the byline, keeping all hyperlinks to the sources used and adding this sentence at the end of your publication: “This report was written by Africa Check, a non-partisan fact-checking organisation. View the original piece on their website", with a link back to this page.
