If you smugly believe you’re right in a disagreement with a friend or colleague, a new study suggests why you may actually be wrong.
Researchers found that people naturally assume they have all the information they need to make a decision or support their position, even when they do not.
The researchers called it the “illusion of information adequacy.”
“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” said study co-author Angus Fletcher, a professor of English at The Ohio State University and member of the university’s Project Narrative.
“If you give people a few pieces of information that seem to line up, most will say ‘that sounds about right’ and go with that.”
The study was published in the journal PLOS ONE. Fletcher completed the work with co-authors Hunter Gehlbach, an educational psychologist at Johns Hopkins University’s School of Education, and Carly Robinson, a senior researcher at Stanford University’s Graduate School of Education.
The study involved 1,261 Americans who participated online.
They were split into three groups who read an article about a fictional school that lacked adequate water. One group read an article that only gave reasons why the school should merge with another that had adequate water; a second group’s article only gave reasons for staying separate and hoping for other solutions; and the third control group read all the arguments for the schools merging and for staying separate.
The findings showed that the two groups who read only half the story – either just the pro-merging or just the anti-merging arguments – still believed they had enough information to make a good decision, Fletcher said. Most of them said they would follow the recommendations in the article they read.
“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher said.
“They were quite sure that their decision was the right one, even though they didn’t have all the information.”
In addition, participants who had half the information said they thought most other people would make the same decision they did.
There was one piece of good news from the study, Fletcher said. Some of the participants who had read only one side of the story later read the arguments for the other side. And many of those participants were willing to change their minds about their decision, once they had all the facts.
That may not work all the time, especially on entrenched ideological issues, he said. In those cases, people may not trust new information, or they may try to reframe it to fit their preexisting views.
“But most interpersonal conflicts aren’t about ideology. They are just misunderstandings in the course of daily life,” Fletcher said.
These findings offer a complement to research on what is called naïve realism, the belief people have that their subjective understanding of a situation is the objective truth, Fletcher explained. Research on naïve realism often focuses on how people have different understandings of the same situation.
But the illusion of information adequacy shows that people may arrive at the same understanding of a situation – provided each of them has enough information.
Fletcher, who studies how people are influenced by the power of stories, said people should make sure they have the full story about a situation before they take a stand or make a decision.
“As we found in this study, there’s this default mode in which people think they know all the relevant facts, even if they don’t,” he said.
“Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’ That’s the way to fight this illusion of information adequacy.”