“What is truth?” Pilate asked. In today’s world, we’re frequently forced to ask the same question, albeit for different reasons. Much of the information we receive, particularly in areas such as politics and social issues, is fake news: yellow journalism or propaganda designed to deliberately mislead or deceive us and sway our decision-making. Depending on your news sources, you may feel you’re in an echo chamber, hearing the same messages with the same slant repeated over and over. Or, given multiple views of an issue or story, all claiming to be the truth, you may start to feel like the blind men and the elephant.
Here’s a thought: the problem isn’t really fake news itself; it’s our ability to sort through the noise and make sense of it, our ability to think critically. Unfortunately, that ability is constantly tested, and we don’t always come out the winner. As Walt Kelly put it, “We have met the enemy, and he is us.”
One of the biggest problems with fake news is that we love it so. Our brains contain a reward system that gives us a dopamine hit, a feeling of pleasure, when we do something that deserves a reward, such as surviving a bear attack or learning a new task. We also get that hit when we process information, such as reading an article, that supports our beliefs, that tells us what we know is correct. It’s the basis of reinforcement learning, a good thing. But it also makes us less willing to accept facts or opinions that disagree with our own beliefs. It’s called confirmation bias. It’s why Democrats watch MSNBC and Republicans watch Fox, instead of vice versa.
Confirmation bias is a cousin of cognitive dissonance, the stress we feel when we’re forced to hold two contradictory beliefs or opinions at once. We humans like consistency; each of us has a model of the world that works for us, and we tend to avoid situations that are inconsistent with that model. But avoiding cognitive dissonance means avoiding a chance to change your mind. It’s the ‘my mind is made up; please don’t bother me with facts’ mindset.
To avoid confirmation bias, you first have to understand it thoroughly. Why do we believe as we do? Do we first take in information, understand it, and then decide whether or not to believe it? Or do we simply believe whatever we see and hear, and then perhaps change our minds later when we come across evidence to the contrary? This was the essence of a famous seventeenth-century disagreement between the views of French philosopher and mathematician René Descartes and Dutch philosopher Baruch Spinoza. Descartes held that we take in information and then decide rationally whether or not to believe it. This seems proper; it would be wrong to believe something on the basis of insufficient evidence. Spinoza disagreed. He felt that understanding a new bit of information is believing it, that we tend to believe anything we hear and have to work to ‘unlearn’ it if we find evidence to the contrary. This is less appealing; it implies that we’re gullible, and will have to work hard to root out the garbage people spew at us.
Who was right? In 1993, Harvard psychologist Daniel Gilbert and his associates ran a series of experiments to test which of these two theories was correct.[2] The conclusion they reached is discomfiting: Descartes was wrong, and Spinoza was right.
If we all have a built-in tendency to believe everything we see and hear, it follows that we should be careful about what we see and hear. Of course, a lot of information is truthful and should be accepted at face value; cynicism isn’t always a good idea. If you’re driving and you see a ‘bridge out’ sign, you’d be foolish not to pay attention. But the Internet and social media have enabled new ways to share information with little regulation and few editorial standards, and there is evidence that even the mainstream media is increasingly biased in some areas. Instead of cynicism, a dose of skepticism is often in order.
How do you combat your own confirmation bias? It’s not easy. Nobody wants to admit that they’re wrong, but that is exactly what you’re asking yourself to do: to consider that there is a truth, or at least an objective view, that differs, at least in part, from your own opinion. We all want to be fair and objective, but it takes work. Here are some techniques to help you get there.
1. Don’t take the opposite side of the issue. It doesn’t work. A study by Charles Lord and colleagues[4] identified the underlying problem: it’s not that we believe just what we want to believe, but that when we filter new information, we judge it by whether or not it supports our existing beliefs, and we tend to interpret it as confirming the conclusion we’ve already formed. We accept confirming evidence at face value, but subject contradictory evidence to critical evaluation. Lord called this ‘biased assimilation.’
A better approach is to simply attempt to weigh the evidence as objectively and impartially as possible. Ask yourself, for each piece of evidence, whether you would evaluate it the same way if the results had come out differently. Play devil’s advocate. For example, if faced with a piece of research suggesting that tariffs harm the economy, imagine that it had found instead that they aid the economy. Then examine the research with an eye to understanding why it produced the results it did. This approach helps you focus on the evidence and its strengths and weaknesses rather than on whether it supports your beliefs.
As an aside, this research reinforces the notion that you’ll do well to focus on more objective news sources and to avoid sources that you know are biased, even if you tend to agree with them.
2. Avoid polarity. If you view each piece of new information as black or white, for or against beliefs you already hold, you’re just adding to the problem. Instead, treat the data as an opportunity for creativity. Brainstorm a bit; think of at least three explanations for it. Three is a magic number because it steers you away from ‘this or that’ thinking.
3. Be aware of your hot buttons: any topic or issue that is highly charged emotionally. Hot buttons are magnets for social media and yellow journalism precisely because they’re highly charged; they’re clickbait. But understand that the more emotional you are about a topic, the more likely you are to be biased about it. Awareness doesn’t mean avoiding these issues; avoidance isn’t going to help you overcome your biases. But don’t let others push your buttons.
4. Avoid looking for causes. Just take in the data and look at it as objectively as possible. The ‘why’ question looks for reasons and motives, and we are far too good at inventing them, most of which are pure conjecture. Even the people who make decisions are often poor at explaining why they made them, and people will go to great lengths to justify bad decisions. Be wary of anyone who claims to know what’s going on in someone else’s mind. Instead of asking a subjective why, stick to the more objective who, what, when, where, and how questions.
5. Moderation in all things. When it comes to information, the best way to practice moderation is to look for moderate sources. In politics, instead of depending on left-leaning or right-leaning news outlets, look for more centrist sources. Media Bias/Fact Check[5] provides curated ratings of news sources’ biases; sources in its ‘Least Biased’ category show minimal bias and use very few loaded words (wording that attempts to influence an audience through appeals to emotion or stereotypes). SUJO[6], a new app available on the web as well as on both iOS and Android, displays the titles of objective, opinionated, and user-submitted articles in different colors, and it’s instructive to see the difference.
6. We need to be open to people with diverse opinions and to new experiences as a way to actively challenge our beliefs. If we can’t do that from time to time, we are, by definition, narrow-minded. As John F. Kennedy reminded us, “Change is the law of life. And those who look only to the past or present are certain to miss the future.”
References:
Richard Paul and Linda Elder, Critical Thinking: Tools for Taking Charge of Your Learning and Your Life, Prentice-Hall, 2001.
[1] Why You Can’t Help Believing Everything You Read
https://www.spring.org.uk/2009/09/why-you-cant-help-believing-everything-you-read.php
[2] D. T. Gilbert, R. W. Tafarodi, and P. S. Malone, “You Can’t Not Believe Everything You Read,” Journal of Personality and Social Psychology, August 1993.
[3] Do We Choose What We Believe?
https://blog.oup.com/2015/05/spinoza-ethics-of-belief/
[4] Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence
http://psycnet.apa.org/record/1981-05421-001
[5] Media Bias/Fact Check
https://mediabiasfactcheck.com/
[6] SUJO: Organizing Reality
https://web.sujoapp.com/