
Lies, Damn Lies…and What We Believe Anyway

I’ve been increasingly disturbed in recent years by the amount of misinformation that flows through the political system in the United States. In relative terms, I doubt there is any more of it than in previous eras, but as the magnitude and complexity of the issues the world faces grow, the need for a collective understanding of the facts seems stronger than ever.

And yet, significant numbers of people persist in believing, for example, that Barack Obama is a Muslim, despite the absence of any evidence to support this belief. Or, during the health care reform debates in the U.S., the notion that “death panels” would have the power to deny benefits to the sick and elderly took root and grew with relatively little resistance among certain ideological groups.

Why is this sort of misinformation and misperception possible, and can it be corrected? Based on a recent report, the outlook for the second question is less than promising.

The report, titled When Corrections Fail: The Persistence of Political Misperceptions, is based on four experiments designed to test whether false or unsubstantiated beliefs about politics can be corrected. Participants in the experiments read mock news articles about politically charged issues like stem cell research, tax cuts, and the existence of weapons of mass destruction (WMDs) in Iraq. Each article contained content that was potentially misleading or deceptive. For a random set of participants, however, the articles also contained a factual correction of the misleading content at a later point in the article. Participants’ reactions to what they read were then correlated with their ideological beliefs.

Here’s one of the major findings, as stated by the report’s authors:

In each of the four experiments, which were conducted in fall 2005 and spring 2006, ideological subgroups failed to update their beliefs when presented with corrective information that runs counter to their predispositions. Indeed, in several cases, we find that corrections actually strengthened misperceptions among the most strongly committed subjects.

For example, participants in one of the experiments were given a mock article that contained a statement George W. Bush actually made: “There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th, that was a risk we could not afford to take.” For a random subset of participants, the article also contained corrective information from the official report that established there were no WMD stockpiles or any evidence of production in Iraq prior to the U.S. invasion.

Liberal and moderate participants who received the corrective information were less likely than those who did not to believe there were WMDs in Iraq prior to the invasion. The effect on conservative participants, however, was just the opposite: the corrective information actually strengthened their belief that there were WMDs.

Liberals have little reason to be smug, however. In a different round of experiments, participants read an article with potentially misleading information about stem cell research and were then asked about their level of agreement with the statement “President Bush has banned stem cell research in the United States.” Again, some participants received a version of the article that contained clarifying information – namely, that Bush’s policies limited only government-funded stem cell research, not privately funded research.

Conservative and moderate readers who received the corrective information were less likely to agree with the statement about a ban on research than their counterparts who did not. But liberal readers were significantly less likely to be swayed by the corrective information – they stuck to their belief that Bush had banned all stem cell research.

So, how do you change beliefs that are deeply held but factually incorrect? The report’s authors reference other studies suggesting that, over time, bombarding people with a “sufficient” amount of clear, correct information can work. But I find it hard to place much faith in that approach given how fragmented our information channels have become. We no longer live in a world with three major news networks and one or two local newspapers to which everyone in a community subscribes. Instead, we tend to pick and choose among a wide variety of information sources that support what we already believe.

Game environments – the subject of my previous two posts (here and here) – may, in fact, be among the better choices for bringing together people of diverse beliefs and helping them form a common, accurate understanding of major issues. More fundamentally, we need to place more emphasis than ever on developing and practicing good learning habits – like critical thinking and reflection – that prevent misinformation from making inroads in the first place. As the report suggests, once the truth gets twisted, straightening it back out is no easy matter.

Jeff
