Column: Be Skeptical of What You Believe Is True


Having been something of a second-tier Mad Magazine for decades, Cracked reinvented itself for the Internet as a source of fascinating, funny, and frequently well-researched lists on a wide variety of subjects. Lists like “7 Terrifying Prehistoric Creatures (That Are Still Around)” and “7 Books We Lost to History That Would Have Changed the World.” Oftentimes its lists challenge common myths and urban legends, like “5 Things You Won’t Believe Aren’t In the Bible” or “6 Ridiculous Lies You Believe About the Founding of America.”

Among the lists that appeared on the first of this month was Soren Bowie’s “5 Guilty Pleasures That Are Secretly Saving the World,” which discussed how credit card debt was eradicating illness by reducing the transmission of disease via paper currency, how the death of print media was slowing global warming, how marijuana production helps the environment, and how texting was preventing brain tumors.

Topping the list was how misinformation on the Internet encourages people to be smarter and more skeptical, and it wrapped up with this sentence:

“Above all, the Internet has taught us how to discriminate between a well-informed piece with reliable sources and, say, an elaborate April Fools prank filled with bold-faced lies that sort of feel true.”

That’s right: The entire article was an elaborate and brilliant April Fools joke. And it was easily the best one I saw this year, offering not just another viral video gag, but a brutal smack to the reader’s ego. It works so well because it carefully exploits the same mental weaknesses that it’s criticizing. As such, the article is a fantastic little case study in the propagation of misinformation.

For starters, as Soren noted, all of the items feel as if they could or should be true. Each item starts with a commonsense observation that’s indisputable (e.g., people do charge/text/read online more), and then draws what would seem to be a commonsense consequence.

But common sense is unreliable when it comes to seeking out truth. People are subject to all manner of logical fallacies and faulty observations; we have to rely on things like the scientific method to strip away those biases that can hide inside our common sense.

Confirmation bias makes people seek out and prefer information that validates their pre-existing views, while the self-serving bias causes people to interpret information in a way that benefits them. Soren exploits both, starting with the title itself: “5 Guilty Pleasures That Are Secretly Saving the World.” People want to believe that texting is good for their health and that pot is good for the planet, and they can see how some information might support those conclusions, so they’re predisposed to be less critical of an article telling them exactly that.

It’s these same sort of biases that drive interest in things like the Paleo Diet. I can lose weight by eating more delicious meat? That sounds awesome. So awesome, in fact, that I might be inclined to overlook expert opinion and research that says it’s overrated.

One thing confirmation bias can cause a person to do, intentionally or unintentionally, is to cherry-pick which evidence to believe and argue. Pseudoscience uses this tactic regularly, preferring anecdotes to scientific tests, or relying on poorly designed or unblinded studies rather than well-designed, well-controlled, double-blind studies that are better at eliminating possible bias by the researchers conducting the study. Evidence is weighed not on its quality, but on how helpful it is to the preferred conclusion.

This plays a major role in Soren’s item about cell phones and cancer. It was hypothesized at one time that the radiation from cell phones might cause cancer, and Soren was able to exploit that outdated belief, while ignoring the years of subsequent studies that have been unable to find a causal link between cell phones and cancer. One can continue to spread the idea of a false relationship simply by appealing to the small pool of poor supporting evidence, while ignoring the much larger pool of contradictory evidence.

Such cherry-picking is at the heart of claims that vaccines are dangerous and cause autism in children. This was once a seriously considered hypothesis, worthy of careful study. But years of studies have repeatedly found no link; moreover, the researcher whose paper kicked off the whole controversy was found to have committed fraud in his research. The weight of the scientific evidence is heavily on the side of there being no link, and yet the debate nonetheless drags on, because certain vaccine denialists cherry-pick which evidence to believe.

Moreover, one can create false confidence in cherry-picked sources simply by how they’re used and presented. The Cracked article includes multiple links to purported sources, creating the illusion that the article is well-sourced; readers assume as much because other Cracked articles do the same. But here, the sources don’t always back up the article’s claims. For instance, the article suggests that several infectious diseases have nearly disappeared in North America because of credit card spending, and it includes a link to a paper in a science journal that documented currency-borne diseases. But the research concerned paper money in Nepal, and reported only the existence of the pathogens, not any reduction in disease. The reader’s failure to scrutinize the source material, to make sure that it validates the claims being made, allows the reader to be deceived.

Of course, we’re bombarded with so much new information every day that it’s impractical to check every source. Mental shortcuts exist because of the need to process large amounts of information in short amounts of time. Falling victim to misinformation is thus probably inevitable, but there are still two basic tools to fight back.

The first is to simply familiarize oneself with those mental shortcuts, and to be aware of the logical fallacies and biases that our minds are subject to. By knowing where your mental weaknesses are, you can better identify what information might be the result of others’ biases, and thus avoid falling victim to suspect information yourself.

But no system is perfect, and even the most skeptical person will fall victim to bad information sometimes. And what good critical thinking demands of you at that point is to keep your mind open to new, and potentially contrary, evidence. To be able to weigh the quality of that evidence against what you already believe to be true. To recognize that your existing beliefs may not stand on as firm a foundation as you once thought, and that you may have inadvertently fallen victim to your own biases and fallacies.

It can be a very humbling attitude to take. But in the long run, the best way to ensure that what you believe is right, is to always be open to the possibility that what you believe is wrong.

Loren Collins is the author of Bullspotting: Finding Facts in the Age of Misinformation. He wrote this essay for The Free Lance-Star in Fredericksburg, Va.