FiveThirtyEight recently posted a piece called “The Impeachment Hearings Just Confirmed Voters’ Preexisting Opinions”: the same wave of new information has made everybody more convinced of what they already thought.
One explanation of this phenomenon is “motivated reasoning”: a person finding data more reliable and arguments more convincing if they fit with what the person already hopes is true. But I think there’s something else going on too, which doesn’t require ascribing emotional self-deception to the people who disagree with you. After all, isn’t this how making up your mind works?
- You start out uncertain about something.
- You learn a bunch of information.
- Now you’re certain.
Well, yes and no: this is how belief-updating works in the best circumstances, but in other cases learning new information shouldn’t make you more certain, or should even make you less certain! If this sounds totally reasonable to you, great. But if you’d like to see some picture explanations, read on:
Let’s say you’re trying to decide which of three different possibilities is the true one: A, B, or C. You can plot how convinced you are of each possibility as a point in a triangle, where each corner represents total certainty in one of the three options:
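If you’d like to play along in code, here’s a minimal sketch of that triangle picture: a belief is a probability vector over (A, B, C), and plotting it just means taking a weighted mix of the three corners. The particular corner layout below is an arbitrary choice of mine, not anything canonical.

```python
import numpy as np

# Corners of the triangle: total certainty in A, B, or C.
# (This layout is arbitrary; any triangle works.)
CORNERS = np.array([
    [0.0, 0.0],             # all-in on A
    [1.0, 0.0],             # all-in on B
    [0.5, np.sqrt(3) / 2],  # all-in on C
])

def belief_to_point(belief):
    """Map a probability vector (p_A, p_B, p_C) to a 2D point in the
    triangle: the mix of the corners, weighted by the probabilities."""
    belief = np.asarray(belief, dtype=float)
    assert np.isclose(belief.sum(), 1.0), "probabilities must sum to 1"
    return belief @ CORNERS

print(belief_to_point([1/3, 1/3, 1/3]))    # dead center: undecided
print(belief_to_point([0.9, 0.05, 0.05]))  # close to the A corner
```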
Ideally, you start somewhere in the middle, not particularly convinced of any one possibility, with no information in your bucket of evidence:
As evidence piles up, flavored by the possibilities it supports, we gradually become more and more convinced of one possibility:
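One standard way to formalize this is Bayes’ rule: each piece of evidence comes with a likelihood under A, B, and C, and updating means multiplying your current beliefs by those likelihoods and renormalizing. Here’s a sketch with made-up likelihood numbers, where every piece of evidence favors A:

```python
import numpy as np

def update(belief, likelihood):
    """One Bayesian update: multiply the prior by the likelihood of
    the new evidence under each hypothesis, then renormalize."""
    posterior = np.asarray(belief) * np.asarray(likelihood)
    return posterior / posterior.sum()

# Start undecided among A, B, C.
belief = np.array([1/3, 1/3, 1/3])

# Each piece of evidence is twice as likely under A as under B or C
# (these numbers are illustrative assumptions).
evidence_for_A = np.array([0.6, 0.3, 0.3])

for step in range(5):
    belief = update(belief, evidence_for_A)
    print(step + 1, np.round(belief, 3))
# The belief point drifts steadily toward the A corner.
```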
However, sometimes the evidence coming in isn’t so clear, and instead of becoming more and more convinced of one thing, we just wander around in circles:
At that rate, it’ll take a lot more evidence before a clear tendency toward A, B, or C emerges.
So depending on what information you learn, you might become a lot more convinced of something, or barely more convinced at all. Here’s how you can even become less convinced, if a new mixture of evidence makes old evidence look less typical:
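Continuing the sketch above (likelihood numbers still made up): start sort-of convinced of B, then take in a mixed pair of evidence, one piece favoring A and one favoring C.

```python
import numpy as np

def update(belief, likelihood):
    """One Bayesian update: multiply prior by likelihood, renormalize."""
    posterior = np.asarray(belief) * np.asarray(likelihood)
    return posterior / posterior.sum()

# Start sort-of convinced of B.
belief = np.array([0.2, 0.6, 0.2])

# A mixed batch: one piece of evidence favoring A, one favoring C.
for likelihood in [np.array([0.6, 0.3, 0.3]),   # favors A
                   np.array([0.3, 0.3, 0.6])]:  # favors C
    belief = update(belief, likelihood)

print(np.round(belief, 3))  # ~[0.286, 0.429, 0.286]
# B is still the front-runner, but confidence in it fell from 0.6 to
# about 0.43: new information made us *less* certain.
```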
Now imagine if you were in this position and instead subscribed to the “new information should make you more certain” philosophy. Then your thought process would look something like this:
- You start out sort of convinced of Possibility B.
- You learn a bunch of information.
- Now you’re supposed to be more certain — but what should you be more certain of? By default, what you had been only sort of convinced of before: Possibility B!
And in this case, that would be exactly the wrong response to the evidence. If three people had started out sort-of convinced of possibilities A, B, and C, and each learned the same mixture of new evidence, they’d all end up thinking the evidence had pointed them further toward their original beliefs.
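Bayes’ rule makes the contrast concrete. Here’s the same sketch (likelihoods still made up) applied to the three-person scenario: everyone sees the same perfectly mixed batch, one piece of evidence favoring each possibility. Handled properly, nobody’s certainty moves at all, even though the “more information means more certainty” heuristic says each person should dig in further.

```python
import numpy as np

def update(belief, likelihoods):
    """Apply a batch of independent evidence via Bayes' rule:
    multiply by each likelihood vector, then renormalize."""
    posterior = np.asarray(belief, dtype=float)
    for lik in likelihoods:
        posterior = posterior * lik
    return posterior / posterior.sum()

# The same mixed batch for everyone: one piece favoring each hypothesis.
batch = [np.array([0.6, 0.3, 0.3]),   # favors A
         np.array([0.3, 0.6, 0.3]),   # favors B
         np.array([0.3, 0.3, 0.6])]   # favors C

priors = {"sort-of A": np.array([0.5, 0.25, 0.25]),
          "sort-of B": np.array([0.25, 0.5, 0.25]),
          "sort-of C": np.array([0.25, 0.25, 0.5])}

for name, prior in priors.items():
    print(name, "->", np.round(update(prior, batch), 3))
# Every posterior equals its prior: a perfectly mixed batch supports
# nothing, so nobody should come away more certain of anything.
```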
The takeaway: Where you end up relative to your starting convictions really depends on what the exact mixture of evidence looks like. There’s no substitute for carefully classifying and weighing the evidence, especially if different data points suggest very different interpretations. You might end up less certain for a while before the evidence makes you more certain again, either of your original thoughts or of the alternatives.