Why People Believe Irrational Things

Table of Contents

  1. Why People Believe Irrational Things
  2. Wishful Believing
    1. Cognitive Dissonance
    2. Defense Mechanisms For Protecting Deeply-held Beliefs
      1. Rejecting the Evidence
      2. Confirmation Bias
      3. Ad Hoc Hypothesis
      4. Religious Faith
    3. How Smart People Can Believe Absurdities
    4. Motivated Numeracy and Enlightened Self-Government
    5. Why do smart Republicans say stupid things?
  3. Inference from Incomplete Evidence
    1. Examples
    2. Projection
    3. Fallacies Involving Incomplete Evidence
    4. Related Links
  4. Echo Chambers and Epistemic Bubbles

Why People Believe Irrational Things

Wishful Believing

  • Two conflicting dispositions drive belief:
    • the disposition to believe based on rational argument
    • the disposition to believe what you want to be true.
  • Wishful believing (wishful thinking) is the disposition to believe
    • what you want to be true
    • what accords with your deeply-held beliefs
    • what you have an emotional stake in believing.
  • Francis Bacon, 1620
    • “The human understanding is no dry light, but receives an infusion from the will and affections; whence proceed sciences which may be called ‘sciences as one would.’ For what a man had rather were true he more readily believes.” (Novum Organum, XLIX, page 26)
  • Skeptic Michael Shermer, explaining why people believe “weird” things such as UFOs, astrology, and alien abduction:
    • “More than any other, the reason people believe weird things is because they want to. It feels good. It is comforting. It is consoling.” (Why People Believe Weird Things, page 275, 1997)
  • Examples:
    • Believing that channelers can communicate with the dead because you want to communicate with your departed spouse.
    • Believing in an afterlife because it’s comforting to believe that you and your loved ones will be together after death.
    • Believing that facilitated communication works because you’re desperate to communicate with your autistic child.
      • FC is a debunked technique in which a facilitator supports the hand or arm of an autistic or non-verbal person while they type on a keyboard; controlled tests showed the typed messages originate with the facilitator.
      • See Facilitated Communication
Cognitive Dissonance
  • britannica.com/science/cognitive-dissonance
    • Cognitive dissonance is the mental conflict that occurs when beliefs or assumptions are contradicted by new information. The unease or tension that the conflict arouses in people is relieved by one of several defensive maneuvers.
Defense Mechanisms For Protecting Deeply-held Beliefs
  • Rejecting the Evidence
    • Taking the truth of a deeply-held belief to cast doubt on contrary evidence, rather than letting the evidence cast doubt on the belief
  • Confirmation Bias
    • Being disposed to recognize, accept and remember evidence supporting a deeply-held belief while ignoring, rejecting and forgetting evidence casting doubt.
  • Ad Hoc Hypothesis
    • Protecting a deeply-held belief by explaining away disconfirming evidence, i.e. by explaining how the evidence is consistent with the belief.
  • Religious Faith
    • Protecting a deeply-held belief by regarding it, not as a hypothesis assessed by the evidence, but as a commitment to a way of life.
  • Living in an Epistemic Bubble
    • Avoiding people, situations, and sources of information that could cast doubt on a deeply-held belief, and seeking out those that support it.
Rejecting the Evidence
  • Rejecting the Evidence is taking the truth of a deeply-held belief to cast doubt on contrary evidence, rather than letting the evidence cast doubt on the belief.
  • For example, arguing that:
    • The creation story in Genesis is incompatible with the theory of evolution.
    • The creation story in Genesis is true, being the word of God.
    • Therefore, the theory of evolution is false.
  • Rather than arguing:
    • The creation story in Genesis is incompatible with the theory of evolution.
    • The theory of evolution is well-confirmed and highly likely.
    • Therefore, the creation story is highly unlikely.
Confirmation Bias
  • Confirmation Bias is the disposition to recognize, accept and remember evidence supporting a deeply-held belief while ignoring, rejecting and forgetting evidence casting doubt.
    • People remember the hits and forget the misses.
  • Francis Bacon in 1620:
    • “The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate…. It is the peculiar and perpetual error of the human intellect to be more moved and excited by affirmatives than by negatives.” (Novum Organum, XLVI, page 23)
  • Example
    • The 2002 National Intelligence Estimate mistakenly assessed that Iraq had chemical and biological weapons.
    • In 2004 the Senate Select Committee on Intelligence issued a report on what had gone wrong.
    • The authors of the report concluded that intelligence analysts had been guilty of confirmation bias:
      • “In the case of Iraq’s weapons of mass destruction (WMD) capabilities, the Committee found that intelligence analysts, in many cases, based their analysis more on their expectations than on an objective evaluation of the information in the intelligence reporting.” (page 18)
Ad Hoc Hypothesis
  • An Ad Hoc Hypothesis protects a deeply-held belief by explaining away disconfirming evidence, i.e. by explaining how the evidence is consistent with the belief.
  • The classic example is the psychic who, confronted with the negative results of a test of his abilities, claims the experimenter’s skeptical attitude disrupted his psychic powers. The example involves:
    • A deeply-held belief
      • the psychic’s belief he’s psychic
    • Facts casting doubt on the belief
      • the negative result of the test
    • The ad hoc hypothesis
      • that the experimenter’s attitude disrupted the psychic’s special powers.
  • The ad hoc hypothesis lets the psychic continue believing in his psychic powers despite the evidence to the contrary.
Religious Faith
  • Theists typically subscribe to Fideism, the view that religious truths are ultimately based on faith rather than on reason and evidence.
  • Fideists believe their religious doctrines not on the basis of a rational assessment of the evidence, but out of a need to believe that a just and merciful God is in charge, that life has a purpose, that there’s an afterlife, and that an objective moral code defines right and wrong.
  • Religious faith protects these beliefs by removing them from the realm of hypothesis, evidence, and reason.  They are regarded, not as hypotheses to be assessed on the evidence, but as a commitment to a worldview. Skeptical doubts are viewed as attacks on their chosen way of life, rather than as evidence casting doubt on a hypothesis.
How Smart People Can Believe Absurdities
  • In text messages to White House chief of staff Mark Meadows following the 2020 election, Ginni Thomas wrote:
    • “Release the Kraken and save us from the left taking America down.”
    • Biden’s victory is “the greatest heist of our history”.
    • “I can’t see Americans swallowing the obvious fraud. Just going with one more thing with no frickin consequences. We just cave to people wanting Biden to be anointed? Many of us can’t continue the GOP charade.”
  • Thomas evidently believed that Biden’s victory was fraudulent.
  • There’s no question Thomas is smart.  She has a law degree, worked for the Legislative Affairs Office of the Department of Labor, served as the Heritage Foundation’s liaison to the Bush White House, and became an entrepreneur in promoting right-wing causes.
  • So how can a smart person believe something contradicted by the evidence?
  • People may have an emotional stake in certain of their beliefs, e.g. beliefs about themselves (Ibsen’s “vital lie”), religious beliefs, and political beliefs.  These deeply-held beliefs are part of who they are.
  • People have developed methods for protecting these core beliefs against contrary evidence.
  • Perhaps not surprisingly, there’s evidence that smart people do a better job of protecting their deeply-held beliefs:
    • In an experiment, subjects were presented with a difficult problem that required drawing inferences from the results of a clinical trial of a new skin-rash treatment. The more numerate subjects, those who were good at quantitative reasoning, did much better than the less numerate.
    • The subjects were then presented with the results of a study of a gun-control ban, which were numerically equivalent to the results of the first study.
    • As expected, subjects’ responses were politically polarized and less accurate than for the study of the skin treatment.
    • What was interesting was that the more numerate subjects were more polarized than the less numerate, presumably because they were better at spinning the numbers. (The sketch below shows the ratio comparison the problem turns on.)
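
The problem in this experiment is a covariance-detection task: the right answer turns on comparing improvement rates across the two groups, not raw counts. Here is a minimal Python sketch of that arithmetic; the counts are illustrative stand-ins, not necessarily the study’s exact stimulus numbers.

```python
# Minimal sketch of the ratio comparison behind the skin-treatment problem.
# The counts below are illustrative, not the study's actual stimulus numbers.

def improvement_rate(improved: int, worsened: int) -> float:
    """Fraction of patients whose rash got better."""
    return improved / (improved + worsened)

# Hypothetical 2x2 table: improved vs. worsened, with and without the cream.
used_cream = {"improved": 223, "worsened": 75}
no_cream = {"improved": 107, "worsened": 21}

rate_cream = improvement_rate(**used_cream)  # 223/298, about 0.75
rate_none = improvement_rate(**no_cream)     # 107/128, about 0.84

# The intuitive-but-wrong move is to compare raw counts (223 > 107) and
# conclude the cream worked. Comparing rates gives the opposite answer:
# patients who skipped the cream improved more often.
print(f"improvement with cream:    {rate_cream:.2f}")
print(f"improvement without cream: {rate_none:.2f}")
```

On numbers like these, the correct inference is that the cream did not help, even though the raw count of improvers is higher in the cream group. The motivated-numeracy finding is that highly numerate subjects get this comparison right when the cover story is a skin cream but selectively fail it when the same table is labeled as gun-control data.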
Motivated Numeracy and Enlightened Self-Government

Abstract of paper by Kahan, Peters, Dawson, and Slovic

Why does public conflict over societal risks persist in the face of compelling and widely accessible scientific evidence? We conducted an experiment to probe two alternative answers: the “Science Comprehension Thesis” (SCT), which identifies defects in the public’s knowledge and reasoning capacities as the source of such controversies; and the “Identity-protective Cognition Thesis” (ICT) which treats cultural conflict as disabling the faculties that members of the public use to make sense of decision-relevant science. In our experiment, we presented subjects with a difficult problem that turned on their ability to draw valid causal inferences from empirical data. As expected, subjects highest in Numeracy — a measure of the ability and disposition to make use of quantitative information — did substantially better than less numerate ones when the data were presented as results from a study of a new skin-rash treatment. Also as expected, subjects’ responses became politically polarized — and even less accurate — when the same data were presented as results from the study of a gun-control ban. But contrary to the prediction of SCT, such polarization did not abate among subjects highest in Numeracy; instead, it increased. This outcome supported ICT, which predicted that more Numerate subjects would use their quantitative-reasoning capacity selectively to conform their interpretation of the data to the result most consistent with their political outlooks. We discuss the theoretical and practical significance of these findings.

Why do smart Republicans say stupid things?

Why do smart Republicans say stupid things? Dana Milbank, The Washington Post

  • The Ginni Thomas text messages revive a question that has been nagging me since the dawn of the Trump era: What makes smart people say truly stupid things?
  • Thomas, wife of the longest-serving current Supreme Court justice, is no dope. She has a law degree, worked for House Majority Leader Dick Armey, served as the Heritage Foundation’s liaison to the George W. Bush White House and became an entrepreneur in right-wing advocacy. Yet in text messages to the White House chief of staff, she told him to “release the Kraken,” echoed a bonkers QAnon canard about ballot watermarks, and asserted the lunacy that “Biden crime family & ballot fraud co-conspirators” were being arrested “& will be living in barges off GITMO to face military tribunals for sedition.”
  • Surely a well-informed, well-educated person such as Thomas couldn’t actually believe the nutty ideas her thumbs texted?
  • But here’s the truly crazy thing: She probably does. Recent advances in cognitive science suggest that highly intelligent people are more susceptible to “identity-protective cognition,” an unconscious process in which they use their intellect to justify rejecting facts inconsistent with their partisan identity.
  • “The really upsetting finding is that the better you are at particular types of cognitive tests … the better you are at manipulating the facts to reflect your prior beliefs, the more able you are to cognitively shape the world so it fits with your values,” says David Hoffman, a University of Pennsylvania law professor who studies cultural cognition.

Inference from Incomplete Evidence

People make inferences from incomplete evidence. Knowing only part of the story, they jump to conclusions.

Examples
  • Believing that a semi-automatic handgun is unloaded because the magazine has been removed.
    • Unaware there may be a round in the chamber.
  • Believing Saddam Hussein played a role in the 9/11 attacks because the US invaded Iraq as part of the War on Terror. (Why else would the U.S. invade a country halfway around the globe?)
    • Unaware that President Bush said in a news conference on September 16, 2003: “We’ve had no evidence that Saddam Hussein was involved with September the 11th.”
    • [In a poll by the Washington Post in August 2003, 69 percent of respondents said it was either very likely (32%) or somewhat likely (37%) that Saddam Hussein “was personally involved in the September 11 terrorist attacks.” In a 2006 Zogby poll, 85% of troops surveyed in Iraq said the U.S. mission was mainly “to retaliate for Saddam’s role in the 9-11 attacks.”]
  • Believing Lee Harvey Oswald was not the lone assassin of President Kennedy based on Oliver Stone’s movie JFK.
Projection

Even With 190,000 Dead, There’s a Lot That Voters Don’t Know (NYT, September 10, 2020)

  • Political science research has found that people don’t always vote in a way that reflects their policy preferences. A principal obstacle is lack of knowledge. In surveys, voters often can’t distinguish presidential candidates’ stances even on critical issues of the day.
  • Voters appear not to know Trump’s and Biden’s stances on the coronavirus. Only 40 percent of survey respondents said Biden supported masks more strongly than Trump; 41 percent said Biden supported closing businesses more strongly; and 47 percent said Biden supported the WHO more strongly.
  • These results are consistent with decades of research showing that a considerable share of the public doesn’t know the positions of the parties and presidential candidates, even on the most salient issues. By contrast, among respondents who are highly knowledgeable about the presidential candidates on other issues, more than 90 percent correctly place them on all three coronavirus questions.
  • It’s not just that people don’t know. When people don’t have a sense of party or candidate platforms, they tend to assume that their preferred party or candidate agrees with them on the issues. This phenomenon, which political scientists call projection, appears to be operating here. People’s perceptions appear strongly influenced by which candidate they like.
Fallacies Involving Incomplete Evidence

Echo Chambers and Epistemic Bubbles

  • People avoid the people, situations, and sources of information that might cast doubt on their deeply-held beliefs, and seek out those that support them.
  • wikipedia.org/wiki/Echo_chamber
    • In discussions of news media, an echo chamber refers to situations in which beliefs are amplified or reinforced by communication and repetition inside a closed system and insulated from rebuttal. By participating in an echo chamber, people are able to seek out information that reinforces their existing views without encountering opposing views, potentially resulting in an unintended exercise in confirmation bias. Echo chambers may increase social and political polarization and extremism. On social media, it is thought that echo chambers limit exposure to diverse perspectives, and favor and reinforce presupposed narratives and ideologies.
    • Distinction between epistemic bubble and echo chamber:
      • An epistemic bubble is an informational network in which important sources have been excluded by omission, perhaps unintentionally. It is an impaired epistemic framework which lacks strong connectivity. Members within epistemic bubbles are unaware of significant information and reasoning.
      • On the other hand, an echo chamber is an epistemic construct in which voices are actively excluded and discredited. It does not suffer from a lack of connectivity; rather, it depends on a manipulation of trust by methodically discrediting all outside sources. According to research conducted by the University of Pennsylvania, members of echo chambers become dependent on the sources within the chamber and highly resistant to any external sources.
  • Echo Chambers and Epistemic Bubbles, C. Thi Nguyen, Episteme
    • An epistemic bubble is a social epistemic structure in which other relevant voices have been left out, perhaps accidentally. An echo chamber is a social epistemic structure from which other relevant voices have been actively excluded and discredited. Members of epistemic bubbles lack exposure to relevant information and arguments. Members of echo chambers, on the other hand, have been brought to systematically distrust all outside sources. In epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined.
    • It is crucial to keep these phenomena distinct.
      • First, echo chambers can explain the post-truth phenomena in a way that epistemic bubbles cannot.
      • Second, each type of structure requires a distinct intervention. Mere exposure to evidence can shatter an epistemic bubble, but may actually reinforce an echo chamber.
      • Finally, echo chambers are much harder to escape. Once in their grip, an agent may act with epistemic virtue, but social context will pervert those actions. Escape from an echo chamber may require a radical rebooting of one’s belief system.