Whether you’re a journalist, an activist, a business leader, a health worker or a regular citizen, how can you know when public figures tell the truth and when they distort it? How can you decide what claims are fair? Who can you trust? 

We’ve used our experience as journalists, with help and advice from specialist experts in a range of fields, to draw up this checklist of fact-checking tips.

1. Where is the evidence?


When a public figure makes a claim, big or small, first ask yourself if the claim is plausible and worth investigating.

Your next question should be, “Where’s the evidence?”

Officials may have good reasons to refuse to reveal the evidence behind a claim. They may need, as journalists do, to protect their sources.

Sources do need protection, but claims still need evidence. Another common reason officials withhold evidence is that it is weak, partial or contradictory.

Ask for evidence, and if it isn’t provided you’ll know there is, or may be, a problem with the claim.

2. Is the evidence verifiable?

The next step is to find out if the evidence can be verified. Can its accuracy be tested?

In the scientific community, the result of any new study or trial is only accepted once other researchers have tested it and produced the same or similar results. As Thomas Huxley, a prominent 19th century biologist, put it: “The man of science has learned to believe in justification not by faith but by verification.”

It should be the same in public debate. When a public figure, in any field, makes a claim they want believed, they should be able to provide verifiable evidence. 

If they can’t, can you take what they say on trust?

3. Is the evidence sound?

We have looked, but no single checklist covers all the different types of evidence you might have to assess before deciding a claim is sound.

Listed below are the main questions we ask.

Could they know what they claim to know?

If the evidence is based on an eyewitness account, could the person know what they claim to know? 

Were they there? Is it likely that they would have access to this sort of information? Is the information first-hand? Or is it second-hand, something they had heard and believed? Is it something that could be known?

If there is data, when was it gathered?

One trick public figures use is to present information collected many years before as if it were from today, with no mention of dates. But data ages. 

To understand the data, you need to know when it was gathered and what the picture looked like before and after. Public figures may also present data with specific start and end dates, not because this reflects real conditions but to make the numbers look good, starting at the bottom of a regular cycle and ending at the top.
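A toy example, with figures invented purely for illustration, shows how the same series can be made to tell two different stories simply by choosing the start and end points:

```python
# Invented figures for a value that moves in a regular cycle.
values = [80, 60, 90, 70, 100, 75]   # six consecutive periods

trough, peak = values[1], values[4]
print(f"Cherry-picked: {trough} -> {peak}, up {(peak - trough) / trough:.0%}")                 # up 67%
print(f"Full period:   {values[0]} -> {values[-1]}, change {(values[-1] - values[0]) / values[0]:.0%}")  # -6%
```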

Was the sample large enough? Was it comprehensive?

An opinion poll that samples the views of a few dozen – or even a few hundred – people is unlikely to represent the views of a population of millions. 

For most polling organisations, a well-chosen sample of around 1,000 people is the minimum needed to produce accurate results. But public figures often quote – and the media then report – surveys of a few hundred or a few dozen people as representing wider views.
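A rough rule of thumb explains the 1,000 figure: for a simple random sample, the 95% margin of error is about 1.96 × √(p(1 − p)/n), which is largest when p = 0.5. The sketch below is a simplified illustration; real polls also adjust for demographics and survey design.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 300, 1000):
    print(f"n = {n:>4}: ±{margin_of_error(n):.1%}")
# n =   50: ±13.9%
# n =  300: ±5.7%
# n = 1000: ±3.1%
```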

And even large scale surveys can give an inaccurate picture if they don’t look in all the right places. 

This is known as the “black swan problem”. For centuries, people in Europe assumed that all the world’s swans were white because that was the colour of the hundreds of thousands of swans found there. It was only when 17th century Dutch explorer Willem de Vlamingh returned to Europe after discovering black swans in western Australia that people began to know better. The previous “sample” had been large, but not large enough.

How was the data collected?

Sample size is not all that matters. Researchers conducting an opinion poll must include people from all relevant social groups – across gender, age, race, region, and social and economic background – and in the right proportions. This makes the poll more representative of society as a whole.
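One common way researchers correct a lopsided sample is to weight responses so each group counts in proportion to its share of the population. A minimal sketch, with made-up proportions:

```python
# Made-up proportions: this survey over-sampled urban respondents.
population_share = {"urban": 0.6, "rural": 0.4}   # e.g. from a census
sample_share = {"urban": 0.8, "rural": 0.2}       # who was actually surveyed

weights = {group: population_share[group] / sample_share[group]
           for group in population_share}
print(weights)  # {'urban': 0.75, 'rural': 2.0} -- rural answers count for more
```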

How was the study done? Similar surveys done door-to-door can produce different results from those done by phone, because people may respond differently when interviewed face-to-face than over the phone.

And studies that rely on people filling in forms tend to show more errors than person-to-person interviews, particularly if the respondents are less literate. If the claim is based on a survey like this, could that be a factor?

Meanwhile, what people taking part in a study or a trial know, or think they know, about it, will also affect the outcome. 

This is known as the “placebo effect” – when people’s belief that they’re taking a medicine, when they actually aren’t, affects their symptoms. It’s why medical trials often are, or should be, “blinded” so that patients being studied do not know the nature of the treatment they have or haven’t been given.

Look at the wider picture

Once you know how the data was collected, assess the way it was presented. Did the person tell the truth, the whole truth and nothing but the truth? Public figures may choose what to tell you, and what not to, cherry-picking the juiciest evidence, favourable to their side in an argument, and leaving the less tasty morsels in the bowl.

Is the data presented in context, and would it still support the claim if other, unmentioned factors were taken into consideration?

Say a politician claims that they put “record sums” into the public health system, without mentioning inflation. The claim may in itself be true, but it is misleading if, once inflation is taken into account, spending has actually fallen in real terms. Always look at the other factors that make up the wider picture.
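A quick worked example, with figures invented for illustration, shows how a nominal “record” can hide a real-terms cut:

```python
# Invented figures: nominal spending rises, but prices rise faster.
spending_then = 100.0          # billions, five years ago
spending_now = 120.0           # billions today -- a nominal "record sum"
cumulative_inflation = 0.30    # 30% price rise over the same five years (assumed)

real_spending_now = spending_now / (1 + cumulative_inflation)
print(f"Today's spending in old prices: {real_spending_now:.1f}bn")  # ~92.3bn, a real-terms fall
```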

And remember to keep numbers in proportion. Spending $50 million on a health project may sound like a lot. But divide it among the population it serves, note that the programme is set to run over 10 years, and it seems a lot less generous than it did at first.
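Doing the division makes the point. With invented figures:

```python
# Invented figures: a $50 million headline, spread over people and years.
budget = 50_000_000       # dollars
population = 5_000_000    # people the project is meant to serve (assumed)
years = 10                # length of the programme

per_person_per_year = budget / population / years
print(f"${per_person_per_year:.2f} per person per year")  # $1.00
```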

4. Data sources, experts and the crowd

If the person making a claim can’t or won’t offer evidence to back it up, this may make it harder to check – but doesn’t prove it wrong. To check it, you can turn to credible data sources, acknowledged experts and crowdsourced information.

Data sources

There are many useful sources of data for checking claims.

You can find information in government papers and official statistics, company records, scientific studies and health research databanks, as well as in school records, development charity accounts, religious orders’ papers and more.

Africa Check maintains a library of guides and factsheets that provide sources of reliable data on key questions. And our Info Finder tool offers useful sources of data on a wide range of topics for Africa, Kenya, Nigeria and South Africa.

Experts

Some claims – to do with medicine, say, or requiring detailed knowledge of a major company’s accounts or a fine point of law – may be better checked by talking to a number of recognised experts.

When you consult an expert, it’s important to ask them to declare any interest they have in the matter that could cause, or be seen to cause, a bias in their analysis.

Some experts may ask you not to use their names. This weakens your report but, if their information can be independently verified, may be acceptable. Unverifiable information from an anonymous source who will only talk off the record should not be used.

The crowd

And for other topics, the best source of information to check a claim may not be research papers or experts, but the knowledge found in the wider community. This is known as crowdsourcing.

If an official claims on election day that all polling stations got their ballot papers on time, or an environmental group claims a factory is polluting a neighbourhood, the best people to confirm or disprove what they say may be the wider community.

But take care when sourcing information from the crowd. First, it’s important to guard the security of your sources. Information sent by SMS, email and other means can be intercepted. And in some countries people who supply sensitive information to the media may suffer for it. So always set up safe and secure ways to communicate.

But you still need to know who your sources are and whether their information is reliable. 

Confirm the identity of anyone who sends you information. Anonymous information should be treated with scepticism. Be wary of mass emails from groups with agendas to push. And take care that you don’t use individual people’s stories, their anecdotal evidence, as if it represented the wider community’s experience.

5. Spotting fakes

Some claims are presented not in words but in photos, videos and other content sent to you or published online. Photos, videos, documents, websites and posts on Twitter, Facebook, Instagram and other social media can all be faked. 

How can you spot what is real and what is fake? These are our fact-checking tips.

Is it convincing or suspicious?

Before you start to look for evidence to check online material, engage your brain. Do the images or words ring true? Is the person likely to use that language or sentiment? Is it the sort of thing they might really have said?

We all understand when people are taken in by clever hoaxes. But if it is obvious, after the event, that the person would have been unlikely to say the words quoted, and you did not check, you may look foolish.

So first, think. And then, if in doubt, check with the person or organisation quoted.

Is there a telling detail out of place?

Hoaxers are often let down by the details. Always be sceptical. Take a meme that attributes a quote to a 19th century US president: the statement itself may not be wrong, but something should make you realise it was probably not him who said it.

Look at the phrase used and ask if it would have been said at the time. Look at the photo or video and assess whether the light and shade fall naturally. Perhaps, in the background, there are things that should be there but aren’t, or shouldn’t be there but are. 

Is the weather in the image or video the kind of weather you would expect in that place, at that time of year? Are the views, plants, cars and buildings what you would expect to see?

If the details are out of place, it may be a hoax.

Has it – or something similar – appeared elsewhere before?

Hoaxes are often copied, and have a long life online. If you are suspicious about an image or text, check if it – or something similar – has been published somewhere else before.

Run a Twitter search for the material with the hashtag #fake to see if other Twitter users have called it out.


To see if text has been published before, drop it into Google search.

Photos and screenshots of videos can be uploaded onto reverse image search engines such as TinEye, Yandex and Google Images. These help you check if and where the image has been published online before. TinEye also allows you to search for the oldest, largest and most modified version of the image.
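If you check images often, you can script the first step. The sketch below builds search links for each service from an image’s web address; the query-string patterns are assumptions based on how these services have accepted URL-based searches and may change, so uploading the image on each site directly remains the reliable route.

```python
from urllib.parse import quote

image_url = "https://example.com/suspect-photo.jpg"  # hypothetical image address

# Assumed URL patterns for URL-based reverse image searches; they may change.
searches = {
    "TinEye": f"https://tineye.com/search?url={quote(image_url, safe='')}",
    "Yandex": f"https://yandex.com/images/search?rpt=imageview&url={quote(image_url, safe='')}",
    "Google": f"https://lens.google.com/uploadbyurl?url={quote(image_url, safe='')}",
}

for name, url in searches.items():
    print(f"{name}: {url}")  # open each link in a browser to run the search
```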

If the image, or one very like it, has been published previously in a different context, it may be being used in a hoax.

Has the person posted material somewhere else?

People often use the same username on different platforms, so if you’re looking for similar material from the same person, search for their username on Google, Twitter, Facebook, Instagram, YouTube, Flickr and other platforms. 
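A simple way to start is to generate the likely profile addresses for a username and check which ones exist. The URL patterns below are assumptions – platforms change them, and some profiles need a login to view:

```python
username = "example_user"  # hypothetical username you are investigating

# Assumed profile URL patterns; verify manually, as platforms change them.
profile_urls = [
    f"https://twitter.com/{username}",
    f"https://www.facebook.com/{username}",
    f"https://www.instagram.com/{username}/",
    f"https://www.youtube.com/@{username}",
    f"https://www.flickr.com/people/{username}/",
]

for url in profile_urls:
    print(url)  # visit each and note which profiles exist and what they post
```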

Check if the person who sent it is where they say they are

If you have doubts about the source of some information, and have the numerical address – the IP address – of the computer it came from, enter this into domaintools.com/reverse-ip to find out which country the computer is in.
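A geolocation lookup can also give a rough country and city for an IP address. The sketch below queries ip-api.com, a free GeoIP service chosen here only as an example (an assumption on our part; any GeoIP service works, and results are approximate, especially for mobile and VPN traffic):

```python
import json
from urllib.request import urlopen

ip = "8.8.8.8"  # example IP address

# ip-api.com is one free GeoIP service; its endpoint, fields and rate limits
# may change, and the location it returns is approximate.
with urlopen(f"http://ip-api.com/json/{ip}") as response:
    info = json.load(response)

print(info.get("country"), info.get("regionName"), info.get("city"))
```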

6. Be persistent

Fact-checking takes time and persistence. When someone tries to fob you off, refusing to give you data you are entitled to, or failing to provide evidence that backs up their claim, keep pushing.

Verifying public debate is not easy. The devil is often in the detail. To find it you need stamina and persistence.

7. Be open and accept you’ll have critics

Finally, be open in the way you write up any fact-checking report, providing links to your evidence. And be honest. If you make a mistake, admit it. 

Even so, you must accept that you won’t convince everyone.

Most people are reluctant to accept evidence that contradicts something they believe. And some will never be convinced, no matter how much careful argument and well-linked evidence you present. 

Scientists call this the “persistence of discredited beliefs”. Here’s how the psychologists Craig Anderson and Lee Ross explain it: “Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.”

Some people, you just can’t convince.