STORY BY LYDIA STROHFELDT
From viral conspiracy theories to misleading statistics, misinformation has become one of the defining challenges of the digital age.
Dr John Cook, Senior Research Fellow with the Melbourne Centre for Behaviour Change at the University of Melbourne, studies how to use critical thinking to counter misinformation. Best known as the founder of the climate communication website Skeptical Science and the creator of the educational game Cranky Uncle, he explores why false claims spread, and how people can learn to recognise and resist them.
In this Q&A, we chat with John about the psychology of misinformation, the challenges of debunking it, and what individuals can do to navigate today’s complex media landscape.
Q. Your work bridges psychology, critical thinking, and communication to tackle misinformation. What first drew you to this field, and how did your early experiences shape your approach?
A. What initially drew me to the topic of climate misinformation was getting into arguments with family members, being a nerd, and also not wanting to lose an argument. I started a database, very systematically cataloguing the most common climate misinformation arguments and what the peer-reviewed science had to say about each one. While I was building this resource for myself, I realised that other people probably have similar arguments with family members and might find it useful too.
So, I created Skeptical Science: a website that debunks climate misinformation. A couple of years into that, I got an email from a cognitive scientist sending me some psychological research into how to debunk misinformation, or more specifically, how not to debunk misinformation. The bad way of doing it was how I was doing it.
That incorrect way was putting the emphasis on the myth that you’re debunking rather than on the facts; using the myth as the headline, then following that with a long, complicated debunking. Often when you do it that way, what people remember most is the myth … but you want to emphasise and repeat the facts, and communicate them as simply as possible.
Q. In your research you describe misinformation as a multi-faceted problem that goes beyond just false facts. How do you define misinformation, and what are the psychological or social factors that make it so hard to counter?
A. I have two working definitions for misinformation, coming from different angles.
The fact-based definition: misinformation is any information that conflicts with the best available evidence or the consensus of experts. This is generally a good, helpful definition when it comes to science-based topics.
The logic-based definition: misinformation is any argument that misleads through fallacies or rhetorical techniques. I’m increasingly attracted to the logic-based approach because it is much more generalisable – it can cover more types of misinformation, and it’s often more effective as a way to counter misinformation.
I’ve done a number of psychological experiments where the logic-based approach is quite effective, sometimes more effective than fact-based corrections.
Q. You’ve applied ‘inoculation theory’ and developed tools like the Cranky Uncle game to build resilience. How does exposing people to weakened forms of misinformation help them resist it in real life?
A. Inoculation is about preemptively building people’s resistance to misinformation rather than trying to undo the damage afterwards. Again, there are two main types of inoculation: fact-based and logic-based.
With fact-based inoculation, you can inoculate people against a specific myth; saying, all right, here is this myth, but here are the facts and how you know this myth is incorrect.
But the logic-based approach highlights the technique used to mislead you (e.g., cherry-picking, using fake experts or conspiracy theories), so it’s about building people’s awareness of the techniques used to deceive. Once people know those techniques, they can spot them across topics.
It works great in lab experiments, but how do you actually reach people with it? Digital games are what I ended up experimenting with. And so, we developed the Cranky Uncle game, which is centred around logic-based inoculation: teaching people how to spot the techniques of science denial and using games to make it engaging and fun.
“A logic-based approach is about building people’s awareness of the techniques (e.g., cherry-picking, using fake experts or conspiracy theories) used to deceive.”
Q. Is it enough for people to be aware of deception, even without a deep understanding of what techniques are used to mislead?
A. It’s certainly a question we discuss a lot in the misinformation research community. On the one hand, if you tell people to generally “watch out!” … the problem is that it swings them towards being cynical and skeptical about everything; they can’t trust anything. You don’t want that either, because you don’t want them mistrusting factual sources and scientific information.
In the research community, we have a concept called discernment: the ability to discern fact from myth. We measure it by showing people both factual and misinformative statements and looking at the gap between how much they believe each (a rough sketch of that calculation follows this answer). You want that gap to be as big as possible … to see belief in facts go up while belief in myths goes down.
There is this really delicate balance. We want to get people on guard, but we don’t want them to become overly cynical or skeptical.
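Concretely, discernment is often scored as the difference between average belief in true statements and average belief in false ones. Below is a minimal sketch of that calculation; the ratings and the 1–7 belief scale are invented for illustration and aren’t taken from Cook’s studies.

```python
# Hypothetical illustration of a "truth discernment" score:
# discernment = mean belief in facts - mean belief in myths.
# All ratings below are invented; a real study would collect them
# from participants rating statements on a belief scale.
from statistics import mean

# Belief ratings on a 1 (certainly false) to 7 (certainly true) scale
fact_ratings = [6, 5, 7, 6]   # responses to factual statements
myth_ratings = [2, 3, 2, 4]   # responses to misinformation statements

def discernment(facts, myths):
    """Gap between belief in facts and belief in myths; bigger is better."""
    return mean(facts) - mean(myths)

print(f"Belief in facts:   {mean(fact_ratings):.2f}")
print(f"Belief in myths:   {mean(myth_ratings):.2f}")
print(f"Discernment score: {discernment(fact_ratings, myth_ratings):.2f}")
```

Note how blanket cynicism would drag both averages down and leave the gap unchanged, which is why the goal is to raise belief in facts while lowering belief in myths.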
Q. What real-world impacts, particularly in Australia, would you attribute to misinformation in the media?
A. One thing the media has historically done is false balance. That’s when you have an overwhelming scientific consensus, and yet the media portray an issue as a 50-50 debate; they’ll bring on a climate scientist and a climate denier and let both have their say.
The public come away thinking there’s a 50-50 debate in the scientific community, so that format is misleading, because what people perceive about the scientific consensus is really important. If they get a misperception there, that can have flow-on effects and make them less accepting of climate change [and science] in general.
It can come from a good place: trying to be balanced and adhere to journalistic norms of giving both sides an equal voice. That’s an entirely appropriate approach when the issue is one of politics or opinion, but when it’s about scientific consensus, it becomes a misleading approach.
“One thing the media has historically done is false balance. That’s when you have an overwhelming scientific consensus, and yet the media portray an issue as a 50-50 debate.”
Q. Your work includes machine-learning and automated detection systems for misinformation. What role should technology play versus human-driven communication and education?
A. Using AI does have problems: it’s very carbon intensive, and it’s not always reliable. But the argument for using AI to tackle misinformation is that misinformation spreads faster than we can manually respond to it.
We’ve been trying to adopt the best practices from psychological research. It’s not just about fact-checking; it’s also about logic-checking: you need to identify which fallacies are used in misinformation and explain them to people. That’s a really important part of debunking.
We found that generative AI tools like ChatGPT are not that good at logic checking. They’re not good [or accurate] at identifying fallacies or explaining them.
We spent a whole year developing an AI model to detect fallacies more reliably. It’s really just the first step, and it’s going to be a long journey. There will be a bit of an arms race, because the AI generating misinformation will progress, and we’re going to have to progress as well.
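For a concrete sense of what logic-checking framed as a classification task can look like, here is a minimal sketch using a generic off-the-shelf text classifier. The training sentences, fallacy labels, and choice of model are all assumptions made for illustration; this is not the model Cook’s team built.

```python
# Hypothetical sketch of "logic checking" as text classification:
# map a claim to the rhetorical technique it uses. The sentences,
# labels, and model are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: short claims labelled with the fallacy they use
claims = [
    "It snowed last week, so global warming has stopped.",
    "One record-cold winter proves the climate isn't changing.",
    "A petition signed by thousands of unnamed 'experts' disputes the consensus.",
    "My friend with a PhD in an unrelated field says climate science is wrong.",
    "Scientists are faking the temperature data to keep their funding.",
    "Climate agencies secretly coordinate to hide the real numbers.",
]
labels = [
    "cherry-picking", "cherry-picking",
    "fake-experts", "fake-experts",
    "conspiracy-theory", "conspiracy-theory",
]

# A deliberately simple baseline: bag-of-words features + logistic regression
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(claims, labels)

# Classify a new claim by the technique it appears to use
print(model.predict(["A single cold day in July disproves decades of warming data."]))
```

A real system would need far more data and far more nuanced labels, but the structure of the task is the same: map a claim to the technique that makes it misleading, so the explanation can target that technique.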
“Misinformation spreads faster than we can manually respond to it.”
Q. What practical advice would you give people to counter misinformation in their everyday media consumption and personal conversations? And is there any specific advice you would give to regional and rural communities?
A. Educate yourself on the ways that you can be misled, and learn the techniques used to mislead you. Other than that, identify authoritative, reliable sources.
For example, in North Queensland, the Great Barrier Reef is a pretty big issue … and there’s a lot of misinformation about it. There is community interest in this topic coming from lots of different angles: the agricultural point of view, the environmental point of view, and the scientific. There’s also a lot of industry-generated misinformation casting doubt on the science of what’s happening to the reef and what factors are causing damage to it – a lot of word salad.
The Office of the Great Barrier Reef is very rigorous. They produce reports summarising the state of the science, as well as consensus reports, and they put a big effort into making that information accessible to people.
So, my advice is to find authoritative sources that are evidence-based rather than driven by a particular agenda.
“Educate yourself on the ways that you can be misled, and learn the techniques used to mislead you. Other than that, identify authoritative, reliable sources.”

