
GUIDE: 7 steps to detect if someone is talking science nonsense

Chocolate could make you lose weight, or so US journalist John Bohannon fooled the world into believing last year. In collaboration with two German TV producers he created a fake study, saying that people who ate a low-carb diet lost more weight if they ate chocolate.

Many media organisations, such as the Huffington Post, the Daily Star in the UK, and the Times of India, picked up the story and Bohannon’s revelation left many journalists red-faced – from both embarrassment and anger.

It wasn't the first time this had happened – where a juicy press release backed by questionable science conveys a story that excites news editors and journalists – but this time, someone was waiting to catch them out.

Science is a difficult area to report on because no one person can know all of its diverse fields. An understanding of astrophysics does not mean that someone knows enough about chemistry, biology or nuclear engineering.

Scientists also have an aura of gravitas and authority – we are more likely to believe a scientist than, say, a politician – and this also makes it difficult to call scientists out on gaps in their story.

So how do you know when someone is talking science nonsense? Journalists can start by asking these questions:

1. Is the journal legit?


The first place to look is at the study backing up the press release or story.

While the words “academic study” give the impression of validity, the nitty-gritty of the study is also the chink in the claim's armour.

Just because someone has published a study in a scientific journal does not mean it is valid. There is a hierarchy of journals, with some being more respected than others: Science, Nature and Cell, for example, are perhaps the pinnacles of scientific publishing. Elsevier, one of the major academic publishers, has a database of peer-reviewed articles and publications called Scopus.

But not all journals are legit. In a blog post on the Public Library of Science (PLOS) website, Dr Eve Carlson offers 11 tips, drawn from her own experience, for determining whether a journal is the real deal. Her most important tip: check whether the journal has an editor, and whether anyone in the scientific community knows who that is.

Part of the reason that the top academic journals are respected is because they have stringent peer-review processes. The process subjects scientific research papers to independent scrutiny by other qualified scientific experts (peers) before they are published.

This helps weed out false claims and doctored findings, although the system is not foolproof. In 2014 and 2015, some of the biggest names in academic publishing, including Springer Nature, Wiley and Elsevier, discovered that researchers had been peer-reviewing their own papers. But fabricated publications are the exception rather than the rule.

2. Who is the scientist?


Someone may have a “Dr” or “Prof” in front of their name, but that doesn't mean they should be unquestioningly believed.

Ask if the scientist is from a reputable institution. Look into the scientists and their institutions to make sure that they are in fact qualified to be making the claims that they're making.

Online portals such as ResearchGate and Google Scholar are good places to look up an academic's publishing history (and to make sure that there is one).

Has the academic published papers before? (The answer should definitely be yes.)

What did their peers think of their research? (You can find this out by checking what else is being done in the field and by contacting other researchers; see point 5.) In judging the validity of a scientific claim, the opinion of a scientist's peers matters more than popular opinion.

If any of the journalists who covered the chocolate study had googled lead author Dr. Johannes Bohannon, they would have discovered that he and his institute – the Institute of Diet and Health – were fake.

3. How was the study conducted?


When looking at a study, the most important question you can ask yourself is: what is the sample size? A warning bell should go off if the study (or press release) does not mention the sample size. The chocolate “study” did not, possibly because it included only 15 people.

Compare this to a study that followed more than 2,300 Finnish men for 21 years and found that men who used saunas frequently had a lower risk of cardiac death. The large sample size and the length of the study make the results compelling, and far more statistically meaningful than what happened to 15 people's weight over a period of 21 days.
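Small samples are dangerous for a second reason: if a study measures many different outcomes, chance alone will usually hand the researchers at least one "significant" result. Bohannon later explained that his team measured 18 different outcomes on those 15 people. The back-of-the-envelope arithmetic below (an illustration, not the actual study data) shows why that almost guarantees a fluke finding when each outcome is tested at the conventional 5% significance level:

```python
# Probability of at least one false-positive "finding" when several
# independent outcomes are each tested at the 5% significance level.
# Each test has a (1 - alpha) chance of NOT producing a fluke, so the
# chance of at least one fluke across n tests is 1 - (1 - alpha)^n.
def chance_of_fluke(n_outcomes, alpha=0.05):
    return 1 - (1 - alpha) ** n_outcomes

for n in (1, 5, 18):
    print(f"{n:>2} outcomes measured -> {chance_of_fluke(n):.0%} chance of a fluke result")
```

With 18 measured outcomes, the chance of at least one spurious "discovery" is roughly 60% — so a press release trumpeting one such result from a tiny study tells you very little.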

4. Does it show causation or correlation?


Even though the large sauna study talked about a “link” it did not say that saunas cause men to live longer. This is because there is a difference between a “link” (also referred to as “association” and “correlation”) and a “cause”.

When two things happen at the same time it does not mean that one caused the other. But this conflation (causation versus correlation) is one of the most common mistakes in science and health reporting.

Rebecca Goldin, a mathematical sciences professor in the US who is also director of the non-profit organisation Sense about Statistics (STATS), used this example to explain the difference:

“Eating breakfast has long been correlated with success in school for elementary school children. It would be easy to conclude that eating breakfast causes students to be better learners. Is this a causal relationship – does breakfast by itself create better students? Or is it only a correlation: perhaps not having breakfast correlates highly with other challenges in kids’ lives that make them poorer students, such as less educated parents, worse socio-economic status, less focus on school at home, and lower expectations.”

Whether reporting on science or reading about science, always ask yourself: do they really mean “cause” or should it be “is linked to”?
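Goldin's breakfast example can be made concrete with a toy simulation (invented numbers, purely for illustration): suppose a hidden factor, such as a supportive home environment, drives both whether children eat breakfast and how well they score, while breakfast itself has no effect at all. The breakfast-eaters will still score higher on average, producing a correlation with no causation behind it:

```python
import random

random.seed(42)  # make the illustration reproducible

# Illustrative simulation, not real data: a hidden confounder ("support")
# drives both breakfast-eating and test scores. Breakfast itself has
# zero direct effect on scores in this model.
n = 10_000
support = [random.random() for _ in range(n)]                  # hidden confounder
breakfast = [s > 0.5 for s in support]                         # driven by support
scores = [60 + 30 * s + random.gauss(0, 5) for s in support]   # also driven by support

with_b = [sc for sc, b in zip(scores, breakfast) if b]
without_b = [sc for sc, b in zip(scores, breakfast) if not b]
mean_with = sum(with_b) / len(with_b)
mean_without = sum(without_b) / len(without_b)

print(f"mean score with breakfast:    {mean_with:.1f}")
print(f"mean score without breakfast: {mean_without:.1f}")
```

The breakfast group scores markedly higher even though breakfast does nothing in this model; only the confounder matters. A naive reading of the gap as "breakfast causes better grades" would be exactly the conflation to avoid.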

5. What does the scientific community say?


Never take anyone's word for it. Scientists benefit from having their name and work highlighted in the media: the attention is often linked to a higher profile and increased funding, as shown in a Research Councils UK report on public engagement.

While peer-review is useful in gauging the validity of a scientist's work, it is only the starting point. You need to find someone in the same field to talk to, someone who was not involved in the research and is not affiliated with that scientist's institution.

For example, when Prof Lee Berger from the University of the Witwatersrand last year announced the discovery of a new hominin species, Homo naledi, very few local journalists asked international experts in the same field for their opinions. If they had, they would have found that international opinion was divided on whether it is a new species.

6. What other research has been done?


Science seldom happens in a vacuum. On the big issues – genetics, particle physics, exoplanets, nuclear technologies, gravitational waves, hydraulic fracturing (fracking) – there are scientists all over the world investigating these topics.

If all the other scientists in a field say that x causes cancer, and a new study says that x does not, you need to ask yourself whether the new claim is credible.

7. Who funded the research?


A fundamental tenet of journalism holds true for science: follow the money. Doing research costs a lot of money, and someone has to pay for it.

Governments often bankroll their countries' science; so do philanthropies (such as the Bill &amp; Melinda Gates Foundation and the Wellcome Trust, a UK foundation) and industry. There are many instances, such as research on smoking and cancer, and on climate science, in which funders dictated, obfuscated or manipulated the results.

This is why it is always important to know who funded the research and whether that creates a conflict of interest, as when Coca-Cola funds obesity research even though sugary soft drinks increase the risk of obesity. - 31/03/2016

 

Additional reading

GUIDE: Tips to avoid three common statistical errors

GUIDE: Evaluating health claims, quacks and cures
