How We Process Information: Why Politicians Can Overtly Lie and People Still Believe Them

Why do people still believe him if there is so much evidence that proves he is lying?

Over the past few months I have heard this question (or variations of it) on multiple occasions. Whether applied to Donald Trump or Hillary Clinton, Nigel Farage, Álvaro Uribe, Nana Akufo-Addo or John Dramani Mahama, the outrage was always the same. Behavioral economics can actually help answer this question. By learning how we process information and what we do with it, we can better understand why the media plays the role it does and why it sometimes fails to disprove overtly false statements.

It is commonly believed that the vast amounts of information now available, and the ease with which people share it, should make it easier to separate falsehood from fact. Yet, as we saw in the latest American elections, this is usually not the case. Instead of exposing falsehoods, the multiplicity of opinions simply created a cacophony of errors that only began to be dispelled after the elections. Why does this happen?

In the era of the internet and information, errors can be propagated and amplified. Combined with belief polarization and our tendency to question news that contradicts our prior beliefs (both defined below), the likelihood of “liars” being discredited is extremely low.

As data consumers, we generally do not collect dispersed information, contrast it (or, better yet, triangulate it), and reach a conclusion about the “truth.” On the contrary, we usually fall into information cocoons, or information segregation: we tend to receive only information that aligns with, or is biased in favor of, our previous beliefs. We do not tend to read or watch news from media outlets that follow a different editorial line. In other words, we are ideologically segregated. (This phenomenon also helps explain why we can be convinced that a majority exists when it doesn’t; we interact only with those who think like us.)

In a surprising 2011 study, Gentzkow and Shapiro found that ideological segregation is even higher in face-to-face interactions than in online interactions. This makes sense insofar as, at the end of the day, our friends tend to be somewhat similar to us (people we like). What about Facebook? Our Facebook feed may be representative of our face-to-face interactions, and thus highly segregated as well. The posts we see will rarely contradict our own opinions or the posts we share ourselves.

As a direct consequence, we obtain one-sided views and enter what Professor Cass Sunstein calls information cocoons and echo chambers, which are, in the first place, a real problem for any democracy, but also places where amplification of errors, hidden profiles, cascade effects, and polarization are inevitable.

The situation becomes a bit more complicated if we add what is called belief polarization into the mix. We have what behavioral economists call “priors”: our beliefs before receiving any information. The “posterior” is our updated belief once we have received information. We tend to think that, when presented with the same information, our beliefs will converge. In many instances, this is not the case. We care directly about our beliefs (we are attached to them; we don’t want to relinquish them!) and thus we try to maintain them. Put simply, there are things we want to believe, so we do, and we discard the information that contradicts those beliefs (this is known as motivated belief bias and confirmatory bias).
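To make the priors-and-posteriors vocabulary concrete, here is a minimal sketch in Python. The numbers and the `discount` parameter are invented for illustration, not taken from any study; the sketch simply contrasts a textbook Bayesian update with a “motivated” update that shrinks the weight of unwelcome evidence:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Unbiased posterior P(claim | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

def motivated_update(prior, p_true, p_false, discount=0.8):
    """Same update, but evidence that cuts against the prior is
    partially discounted: its likelihood ratio is shrunk toward 1."""
    ratio = p_true / p_false              # strength of the evidence
    if (ratio < 1) == (prior > 0.5):      # evidence disagrees with me
        ratio = ratio ** (1 - discount)   # so I mostly ignore it
    odds = (prior / (1 - prior)) * ratio
    return odds / (1 + odds)

prior = 0.9                  # I strongly believe the claim
p_true, p_false = 0.2, 0.8   # the evidence is 4x likelier if the claim is false

print(bayes_update(prior, p_true, p_false))
print(motivated_update(prior, p_true, p_false))
```

With these toy numbers, the unbiased posterior falls from 0.9 to roughly 0.69, while the motivated one stays near 0.87: the believer who discounts unwelcome evidence barely moves.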

If a media outlet reports that a politician in my country whom I dislike is allegedly corrupt, I will probably believe it. But if it comes out that Emma Watson was involved in the Panama Papers scandal, I will probably question it or try to justify it (not that this happened…).

In a famous 1979 study, Lord, Ross, and Lepper conducted an experiment demonstrating that people examine relevant empirical evidence in a biased manner. Subjects holding different beliefs on capital punishment were presented with two studies: one that seemingly confirmed their prior and one that seemingly contradicted it. The subjects rated the study that contradicted their prior beliefs as less reliable and became more convinced of their prior opinions. Instead of converging toward a shared (or closer) position, they became more extreme. In other words, belief polarization ensued.
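The pattern in that experiment can be caricatured in a few lines of Python. The weighting scheme and numbers below are invented for illustration (they are not the study's data): two readers with opposite priors see the same pro and con studies, but each weights the prior-consistent study more heavily, so their beliefs drift apart:

```python
def assimilate(prior, evidence, credibility=0.9):
    """Shift a belief toward each piece of evidence (+1 pro, -1 con),
    weighting prior-consistent evidence more than contrary evidence."""
    belief = prior
    for e in evidence:
        consistent = (e > 0) == (belief > 0)   # does it agree with me?
        weight = credibility if consistent else 1 - credibility
        belief += 0.5 * weight * e
    return belief

# Beliefs on a -1 (against) .. +1 (for) scale; both see one study each way.
supporter = assimilate(+0.4, [+1, -1])
opponent  = assimilate(-0.4, [+1, -1])
print(supporter, opponent)
```

With these weights the two readers start 0.8 apart and end 1.6 apart: identical, balanced evidence widens the gap instead of closing it.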

Hypothetically, if I were a Trump supporter and a media outlet presented me with information about the feasibility of building a wall (and Mexicans paying for it), while another media outlet showed that this is almost impossible, I would take the former at face value and discard the latter. In reality, though, due to ideological segregation and information cocoons, I might not even encounter the information that contradicts my prior.

In sum, people believe they are unbiased information processors, but in reality we tend to receive one-sided information, we process it in a biased manner, and even when we do receive contradictory information, we become more polarized instead of converging toward one position.

This way of processing data has a direct impact on how media outlets can operate. If we are not going to discredit media outlets for reporting dubious information, they have little reason to care about their reputation or the veracity of the news they publish (and can rush to publish without conducting a thorough fact-check first). If they report false news, the people who want to believe them will still do so. Reporting quality, despite the availability of vast information, is then much lower. In fact, as we have seen, some studies suggest that an outlet will be perceived as more truthful if it confirms our priors.

______

Sources for this article and reading recommendations:

Gentzkow, Matthew, and Jesse M. Shapiro. “Competition and Truth in the Market for News.” Journal of Economic Perspectives 22 (2008): 133–154.

Lord, Charles G., Lee Ross, and Mark R. Lepper. “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence.” Journal of Personality and Social Psychology 37 (1979): 2098–2109.

Sunstein, Cass. Infotopia: How Many Minds Produce Knowledge. New York: Oxford University Press, 2006.

 

[Disclaimer: for this article, I drew heavily on what I studied in my Behavioral Economics class at the Yale School of Management with Prof. Florian Ederer and Prof. Shane Frederick.]
