Deal with it!

“200 kilos of tuna found in a cargo of cocaine”: that was the headline of a news article. Think about that sentence for even a second and you will notice it describes a rather strange scenario. In news about cocaine, it is usually the cocaine that is found, and not hidden inside tuna. Now you might be asking yourself: “Was this really in the news?” The answer is yes, well, sort of… It was published on the satirical news website De Speld. Most Dutch people know this platform, and I even dare to say that most of them find it funny.

De Speld is known for its funny fake news. But if you do not know that it only produces fake news, its articles can look very misleading. That is what happened to the Haarlems Dagblad. On its website it published an article that was based on the tuna article from De Speld. Fortunately, it figured out the story was fake fairly quickly and deleted the article. Another newspaper that took an article from De Speld a bit too seriously was De Telegraaf, a well-known Dutch news source. In 2009, when De Speld was not yet as well known as it is nowadays, De Telegraaf published an article on its website about a ‘Patatje Holocaust’ (a portion of fries). It even named De Speld as its source. I guess the journalist who wrote the post did not take the time to take a closer look at De Speld’s website. Even the four other snacks, with names like ‘Srebrenica platter’ and ‘J.F.K.’, did not ring a bell at De Telegraaf.

A screenshot of the article on the website of De Telegraaf

What to believe?
In these cases, for the journalists who wrongly thought De Speld was a trustworthy news source, the solution was easy: just delete the article and, if really necessary, apologize for spreading wrong information. But sometimes making things right is not that easy. Especially when the misinformation is very believable, it can be a real challenge to right a wrong. In the case of De Speld, believability is not really the problem, especially now that it is so well known. But I will show you some examples in which it is.

Take, for example, the president of the United States of America. The president of such a powerful country should be believable and trustworthy, but president Trump proves otherwise. On Twitter he presents claims as facts even when scientists have proven that what he says is not true. Take a look at the following Tweet:

Trump states that the concept of global warming was created. Now take a look at the graph below.

Graph: global average temperature per year
I made it myself, with data from the World Bank. It clearly shows the increase of the average world temperature per year. These are the facts, whilst what Trump says is based on… well, who knows what? The best thing to do here is to not take everything Trump blurts out too seriously.
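To make the point of the graph concrete: whether a temperature series is actually rising can be checked by fitting a least-squares line to it and looking at the slope. Here is a minimal sketch of that check. The yearly values below are hypothetical placeholders for illustration, not the actual World Bank series.

```python
# Hypothetical yearly global mean temperatures (degrees C), for illustration only
years = list(range(1960, 1972))
temps = [13.90, 13.95, 13.92, 14.00, 14.05, 14.10,
         14.08, 14.15, 14.20, 14.25, 14.30, 14.33]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(temps) / n

# Least-squares slope: covariance(years, temps) / variance(years)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps)) / \
        sum((x - mean_x) ** 2 for x in years)

# A positive slope indicates a warming trend over the period
print(f"Trend: {slope:.4f} degrees C per year")
```

On a real series you would use many more years, but the idea is the same: the data themselves, not anyone's tweets, determine the sign of the trend.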

Debunking misinformation
As the use of social media keeps growing, so does the dissemination of misinformation (Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). Lewandowsky et al. (2012) were so kind as to provide us with very specific recommendations for debunking misinformation. They offer solutions for four different problems, directed at how to design, structure and apply corrections. In the image below, all problems and recommendations are explained.

The problems (on the left) and solutions as proposed by Lewandowsky et al. (2012)

In the case of Trump’s Tweet about climate change, people’s worldview is threatened. The solution would then be to affirm the correct worldview. According to Van der Linden, Leiserowitz, Rosenthal, and Maibach (2017), inoculation would be a solution against misinformation about the cause of climate change. Attitudinal inoculation means pre-emptively warning people about attempts to spread misinformation. In the case of the cause of climate change, these attempts are politically motivated, just like Donald Trump’s Tweet. Warnings would help to promote and protect (“inoculate”) public attitudes about the origin of climate change as confirmed by scientific consensus: 97% of climate scientists have concluded that climate change is in fact human-caused. The warning should make people aware of politically motivated messages about the climate and alert them not to believe these messages.

Other factors influencing what you believe
Banks (2003) suggests another solution, namely enhancing critical thinking skills. Especially in ambiguous data and information environments, this could prevent people from believing misinformation. But there are some criteria that need to be met in order to think critically (Mezirow, 1991). People need to:

  • Have accurate and complete information
  • Be free from coercion and distorting self-deception
  • Be able to weigh evidence and assess arguments objectively
  • Be open to alternative perspectives

These strategies for dealing with misinformation are all very well thought out. But maybe Inoculation Theory can cut both ways. If people are presented with information about another possible side to a story (e.g. that climate change is not caused by humans) and this information is very believable, it might make people consider exactly the side that you are trying to keep them away from. They are alerted to the other side, whilst beforehand they might never even have considered there to be another side. I think Inoculation Theory can work well, but for it to work well, you need to know your public. You need to know what your public’s current opinion on the case is and how sure they are of it. This can be connected to the Elaboration Likelihood Model, which you have probably heard of. If people are motivated and have the opportunity to process information, they are more likely to change their attitude towards the topic (Petty & Cacioppo, 1986). Thus, if people are motivated and have the opportunity to think about the cause of climate change, they might consider the new side of the story presented during the inoculation. I think that, if Inoculation Theory is applied correctly, it can work for people who already strongly hold the correct belief. I would like to know from you: do you think it is possible that inoculation might actually have the opposite effect? And if so, when?

Andreas. (2011, July 19). Journalisten checken nieuws niet meer [Blog post]. Retrieved from
Banks, D. A. (2003). Misinformation as a starting point for critical thinking. Retrieved June 28, 2004.
Bennootje. (2009, October 3). De Telegraaf gelooft in patatje Holocaust [Blog post]. Retrieved from
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco, CA: Jossey-Bass.
Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. Advances in Experimental Social Psychology, 19, 123–205.
Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1, 1–7.


2 thoughts on “Deal with it!”

  1. The example of De Telegraaf made me laugh, although it’s quite serious that a journalist who provides news to the people would make a mistake like that! I personally think that inoculation can have the opposite effect, especially since everyone lives in their own filter bubble. If 80% of your news supply contains the false information, then the 20% that is correct would seem unbelievable. You might feel like they are trying to convey a meaning that is completely fake and untrue, because you have already been inoculated, but with false information! Perhaps there are options to break bubbles with standard, recognized, true information. That will be difficult, but if there is an official answer to a certain case, then inoculating also has good potential for providing people with real information.


  2. I think that inoculation is always effective, but not always for the right reasons. Once you have been inoculated with false facts, the myth will persist like a virus in your system. It is always good to spread the correct information as much as you can, but I do not think it is always effective. Think about Brandolini’s law: the amount of energy necessary to refute bullshit is an order of magnitude larger than that needed to produce it…

