There may be a psychological reason why some people in an argument are not only wrong, but confidently wrong.
These people believe they have all the information they need to form an opinion, even when they don’t, according to a study published Wednesday in the journal PLOS ONE.
“Our brains are overconfident that they can come to a reasonable conclusion with very little information,” said Angus Fletcher, an English professor at Ohio State University who co-authored the study.
Fletcher, along with two psychology researchers, set out to measure how people judge situations or other people based on the information they have, and how confident they are in those judgments even when they don’t have all the facts.
“People make judgments very quickly,” he added.
The researchers recruited nearly 1,300 people with an average age of about 40 years. They all read a fictional story about a school that was running out of water because its local aquifer was drying up.
About 500 people read a version of the story that favored the school merging with another school, presenting three arguments for the merger and one neutral point.
Another 500 read a version with three arguments for the schools staying separate, plus the same neutral point.
The final 300, the control group, read a balanced version that included all seven arguments: three for the merger, three for staying separate, and one neutral.
After the participants finished reading, the researchers asked them what they thought the school should do and how confident they were that they had all the information needed to make that judgment.
The survey responses showed that most people were much more likely to agree with whichever set of arguments they had read, whether for merging or staying separate, and that they were often confident they had enough information to form that opinion. Participants who had read only one point of view were also more likely to say they were sure of their opinion than those in the control group, who had read both sides.
Half of the participants in each group were then asked to read the opposing arguments, which contradicted the story they had already read.
Although people were confident in their opinions when they had read arguments for only one solution, once they were presented with all the facts they were often willing to change their minds. They also reported that they then had less confidence in their ability to form an opinion on the topic.
“We thought that people would stick to their original judgments even if they received information that contradicted them, but it turns out that if they learned something that seemed plausible to them, they were willing to completely change their minds,” Fletcher said, adding that the research highlights the idea that people do not stop to consider whether they have all the information about a situation.
However, the researchers noted that the results might not apply when people have pre-established ideas about an issue, as is often the case in politics.
“People are more open-minded and willing to change their minds than we assume,” Fletcher said. However, “this same flexibility does not apply to long-held differences, such as political beliefs.”
Todd Rogers, a behavioral scientist at the Harvard Kennedy School of Government, compared the results to the “invisible gorilla” study, which illustrated the psychological phenomenon of “inattentional blindness,” in which a person fails to notice something obvious because they are focused on something else.
“This study captures that with the data,” Rogers said. “There seems to be a cognitive tendency to not realize that the information we have is inadequate.”
The study parallels a psychological phenomenon called the “illusion of explanatory depth,” in which people overestimate how well they understand a given topic, according to Barry Schwartz, a psychologist and professor emeritus of Social Theory and Social Action at Swarthmore College in Pennsylvania.
The idea is that if you ask an average person whether they know how a toilet works, they will probably say yes. But if you ask them to explain how a toilet works, they quickly realize that they don’t actually know how it works, only how to make it work by pressing a lever.
“It’s not just that people make mistakes. The problem is that they put too much trust in their mistakes,” Schwartz said.
The antidote, he added, is “to be curious and humble.”
The researchers and Schwartz agreed that it was encouraging, and surprising, that people in the study who were later presented with new information were open to changing their minds, as long as the new information seemed plausible.
“It’s a reason to be a little optimistic: Even if people think they know something, they are willing to change their minds with new evidence,” Schwartz said.