Democracy only works well when the citizenry is well informed. We are taught that if people are furnished with the facts, they will be clearer thinkers and better citizens. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789.
But humankind is uniquely susceptible to ignorance and misinformation – and too many citizens are ill-informed. In 1996, Princeton University’s Larry M. Bartels argued that “the political ignorance of the American voter is one of the best documented data in political science.”
Over the last few decades, political science research has established that most Americans lack even a basic understanding of how their country works. A large part of the reason Americans are so ignorant is that certain segments of the media lie daily to push a political agenda – and the First Amendment legally allows it. A public that wishes to be entertained does not want to read facts, and it does not listen to or read the segments of the media that report facts from good sources; pure information is boring. But even if people read the facts, it would make little difference, because facts do not necessarily have the power to change minds. Research at the University of Michigan has found that facts can have quite the opposite effect: people who are shown facts that go against their beliefs, even with inarguable proof, all too often dig in and become more entrenched. This is called the backfire effect.
In a series of recent studies, researchers at the University of Michigan found that when misinformed people were exposed to corrected facts in news stories, they rarely changed their minds; instead, they often became even more stubbornly set in their beliefs. This held particularly true for political partisans.
In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Brendan Nyhan, a Robert Wood Johnson Foundation Scholar at the University of Michigan, and a colleague, Jason Reifler, devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted).
When Nyhan inserted a clear, direct correction after each piece of misinformation and then measured whether the correction took, for the most part it didn’t. It backfired. Participants who self-identified as conservative believed the misinformation on WMDs and taxes even more strongly after being given the correction. On those two issues, the more strongly a participant cared about the topic, the greater the backfire.
The effect was slightly different for self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but they still ignored the inconvenient truth that the Bush administration’s restrictions were not total.
Facts, the researchers found, were not correcting the misinformation; instead, they could actually cause people to hold to the misinformation more strongly.
This suggests that once beliefs are internalized, they are very difficult to budge. The conclusion one can draw from these University of Michigan studies is that if citizens are ignorant, facts will not enlighten them; if they are mistaken, facts will not set them straight. In other words, if the real facts do not agree with what you believe, you throw out the facts and cling to your own make-believe ‘facts’.
It appears that misinformed people, particularly conservatives, often hold some of the strongest, yet least accurate, political opinions. Most people vote based on their beliefs – which are steeped in emotion and often objectively, provably false. Why? People tend to interpret information according to their existing views in order to maintain consistency. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and to actively dismiss information that doesn’t. This makes us more confident in those beliefs, and even less likely to listen to facts that contradict them.
“It’s absolutely threatening to admit you’re wrong,” says Brendan Nyhan, the lead researcher on the Michigan study. He describes the phenomenon known as backfire as “a natural defense mechanism to avoid cognitive dissonance.”
Back in 2000, James Kuklinski of the University of Illinois led an experiment in which more than 1,000 Illinois residents were asked questions about welfare – the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident their answers were correct – but in fact only 3 percent of the participants got more than half of the questions right. Perhaps more disturbing, those who were most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong anti-welfare bias.) Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a great problem in a democratic system. With Fox News or MSNBC to back you up, it has never been easier to believe you are right.
“It implies not only that most people will resist correcting their beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”
As a teacher (now retired), I am greatly alarmed by this. How could this be? I taught my students well. I reiterated – and did some fun things to help them remember facts. We also spent much time on how to think critically – how to look at both sides of an issue before making up one’s mind. But apparently, no teacher can change the mind of anyone whose brain is pickled with misinformation.
In an ideal world, citizens would be able to critically monitor the information they receive by looking up the facts. But doing so takes time and effort – and you have to discern whether your source is a good one – which, for an overworked or stressed American, can be exhausting. So we create shortcuts, using emotional inference to cope with the rush of information we receive daily. Unfortunately, this leaves too many people easily suckered by political falsehoods.
Political pundits who are out to get rich from the gullibility of Americans have become highly popular entertainment, while unfiltered, fact-based coverage such as C-SPAN’s is seldom watched (considered boring), and nonpartisan fact-checking websites such as Snopes and FactCheck.org are dismissed as an arm of the opposition. In other words, getting a politician or pundit to say that George W. Bush caused the 9/11 attacks or that Barack Obama’s presidency is the culmination of a five-decade plot by the government of Kenya to destroy the United States is easy. Getting them to give straight facts is not. Those who do try to present straight, truthful facts are booed off the stage.
What about the interaction between the political ignorance of American citizens and our democratic ideals? Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. But, in actuality, too many Americans uncritically accept bad information that reinforces strongly held beliefs. They become more confident they are right, and are unlikely to listen to any new information.
In reality, a majority of Americans, particularly those on the far right, often base their opinions on personally held beliefs and emotions that have no basis in fact.
And then they vote.
If the citizens are ignorant, facts will not enlighten them; if they are mistaken, facts will not set them straight.