BLOG: Africa Check joins Facebook’s fact-checking programme

Facebook’s point of no return came on 15 April 2013.

After the Boston Marathon bombing, photos of spectator Jeff Bauman, his lower legs blown off, flooded the social media network.

To delete or not to delete? Facebook’s rulebook at the time was clear, as New York Public Radio’s Radiolab podcast tells it: nothing showing “insides on the outside” could appear on the network.

If Facebook wanted to continue being seen as a neutral tech company, its small moderation team should have stuck to the rules and removed the graphic image.

Instead, an executive under Facebook CEO Mark Zuckerberg “sent down an order” - make an exception.

And so, by judging what is newsworthy, Facebook became a media company that day.
 

A project to reduce the spread of false news


It’s the morning after the party that celebrated the internet as the great educator and democracy-enabler. As dawn breaks, we can start surveying the mess left by two uninvited guests: misinformation and disinformation.

In Facebook’s corner lie unfettered hate speech against the Rohingya minority in Myanmar and fuel for Philippine President Rodrigo Duterte’s extrajudicial war on drugs. Scattered around are breaches of personal information and cross-border meddling in elections.

Facebook has long shirked the duties of a publisher, but now says it has “a responsibility to fix the issues” on its platform.

Its campaign against false news includes removing fake accounts, giving people more context about what they’re reading and reducing the spread of false news.

The last part is where our community, the International Fact-Checking Network, comes in. And from today, Africa Check will be joining 27 partners in 17 countries in the effort.
 

Fact-checkers don’t remove content


As part of its third-party fact-checking programme, Facebook allows its partners to see public articles, pictures or videos that Facebook’s machines, or regular users, have flagged as potentially inaccurate. (Here’s how to report something you suspect is false.)

Fact-checkers evaluate the content’s primary claim and give it one of eight ratings, such as “false”, “mixture” or “true”.

If the primary claim is found to be inaccurate, Facebook reduces the content’s distribution on the network. When it does show up in someone’s news feed, related articles by fact-checkers are appended to it. People who have previously shared the content will also be notified of the additional reporting.
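For technically minded readers, here is a minimal Python sketch of the workflow just described. It is illustrative only: the class and function names, and the assumption that only a “false” rating triggers downranking, are mine, since Facebook’s actual systems are not public.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Assumption: only a "false" rating triggers downranking. The programme's
# full list of eight ratings and their exact consequences are not public.
DOWNRANK_RATINGS = {"false"}


@dataclass
class FlaggedPost:
    """A public post that Facebook's systems or users have flagged."""
    post_id: str
    primary_claim: str
    rating: Optional[str] = None
    downranked: bool = False
    related_articles: List[str] = field(default_factory=list)


def notify_previous_sharers(post: FlaggedPost) -> None:
    # Placeholder: the real system messages users who shared the
    # content before it was rated.
    print(f"Notifying sharers of {post.post_id} about additional reporting")


def apply_fact_check(post: FlaggedPost, rating: str, fact_check_url: str) -> None:
    """Record a fact-checker's rating and apply the consequences described
    above: reduce distribution, attach the fact-check as a related article,
    and notify people who shared the post earlier."""
    post.rating = rating
    if rating in DOWNRANK_RATINGS:
        post.downranked = True                        # reduced reach in the news feed
        post.related_articles.append(fact_check_url)  # shown alongside the post
        notify_previous_sharers(post)


# Example: a flagged post is rated "false" by a fact-checker.
post = FlaggedPost("p1", "Drinking bleach cures malaria")
apply_fact_check(post, "false", "https://africacheck.org/fact-checks/example")
```

Note that in this sketch, as in the programme itself, a rating changes how widely content travels, not whether it exists.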

“We believe that downranking misleading content strikes the right balance between encouraging free expression and promoting a safe and authentic community,” says Tessa Lyons, a Facebook product manager focusing on the integrity of information on the news feed.

But it’s important to note the limits of the campaign.

Content isn’t removed. If the inaccurate content was posted by a page you follow, or your significant other commented on it, you’ll likely still see it in your news feed. But you probably won’t see it if a former colleague “likes” it.

The rating can be overturned. If it’s a genuine mistake, the person or publisher can correct the content and have the strike - as Facebook calls it - removed.

Private content, satire and opinion are off-limits. In these cases, fact-checkers rate the flagged content as such.

And yes, Facebook pays fact-checking organisations for the work we do. It’s only fair that one of the biggest companies in the world compensates the small teams that help it clean up.
 

Focus on content that can harm


Facebook’s fact-checking programme isn’t without problems. There are only so many fact-checkers, and millions of times more pieces of false content. The social network has also been slow or absent in helping to protect the fact-checkers in Brazil and the Philippines who have faced relentless harassment.

Then there are also accusations of bias, particularly the view through the “partisan goggles” of US politics. It doesn’t help that Facebook is yet to flesh out guidelines on the types of inaccurate information it wants its fact-checking partners to prioritise.

As Africa Check starts fact-checking for Facebook, we’ll focus on bogus health cures, false crime rumours and things like pyramid schemes – the kind of content that can lead to poor decisions and physical harm.

(I can’t help but cringe when Facebook speaks of preventing “bad actors” from creating “bad experiences” for users. As though being lynched by a mob is akin to a second-rate night out.)

Is the fact-checking programme a perfect and permanent solution? Definitely not.

But the damage has been done and it’s time to put on the cleaning gloves.

I look forward to hearing from you at [email protected].

Anim van Wyk is chief editor of Africa Check, a nonpartisan and independent fact-checking organisation.


 
CORRECTION: A previous version of this blog post stated that Facebook "only allows third party fact-checking from countries with more than one verified signatory to the International Fact-Checking Network", which is not the case in Canada, Turkey, Pakistan and India.
