The architecture of truth

Photo by Mark Bessoudo

Misinformation can now be spread effortlessly through the echo chambers of social media at an unprecedented scale and velocity. These assaults on facts may seem postmodern, but they are nothing new. “Post-truth” narratives and the construction of alternative realities are merely a reflection of a much deeper and more systemic problem, one that did not originate in the twenty-first century.

The problem is one of human cognition. We exhibit numerous biases, fallacies, and illusions, the very lifeblood of post-truth narratives. These behavioral errors aren’t flaws in the system — they are built directly into the cognitive machinery. While problematic post-truth narratives may appear to be imposed on us from outside or above, they are in fact a collective manifestation of our default cognitive set-point.

Misinformation thrives in the twenty-first century for the same reasons it has thrived in centuries past: overcoming our inherent cognitive biases and behavioral errors takes time and persistence. Most people, most of the time, simply don’t have the luxury (or the interest) to put in the required effort.

Various recent world events have exemplified the extent to which blatant misinformation can have real-world consequences. In their wake, many have placed blame on the tech companies that served as conduits for the misinformation to proliferate. And while these companies do bear responsibility for safeguarding against certain forms of misinformation, they cannot realistically be expected to safeguard us from ourselves.

If we want a better democracy with well-informed citizens, the tools necessary for detecting misinformation can’t merely be outsourced to an algorithm. We still need to rely on the trustworthiness of experts, of course, but we also need to rely on the algorithms that reside inside our own minds. Behavioral errors and cognitive biases may be features of our brains, but so is the capacity to overcome them.

If we want a better democracy with well-informed citizens, the tools necessary for detecting misinformation can’t merely be outsourced to an algorithm.

The 2013 book The Art of Thinking Clearly by Swiss writer Rolf Dobelli effectively conveys this phenomenon. It succinctly illustrates 99 of the most common errors that plague us, both individually and collectively as a society. The book is divided into chapters with enlightening titles such as “If Fifty Million People Say Something Foolish, It Is Still Foolish: Social Proof”, “Beware the ‘Special Case’: Confirmation Bias”, “Don’t Bow to Authority: Authority Bias”, and “Why We Prefer a Wrong Map to None at All: Availability Bias”. The book could serve as the perfect recipe for helping us combat the proliferation of post-truth narratives. [1]

We are rarely formally taught how best to overcome our intrinsic cognitive errors, or even that they exist. This is what makes Dobelli’s book so notable, particularly given how vital such skills are to the functioning of a healthy democracy with informed citizens.

The alternative — ignorance — offers the path of least resistance. Surrendering to the allure of groupthink and identity politics culminates in post-truth alternative realities that exist across the political spectrum. But this inability (or refusal) to reason honestly is no longer just a personal or individual problem — it has become a social problem for the entire world.

Sheila Jasanoff, professor of science and technology studies at the Harvard Kennedy School of Government, provides a remedy: “To address the current retreat from reason — and indeed to restore confidence that ‘facts’ and ‘truth’ can be reclaimed in the public sphere — we need a discourse less crude than the stark binaries of good/bad, true/false, or science/antiscience.” [2]

What’s needed, in other words, is a culture that values intellectual honesty and demands it from our leaders, ourselves, and each other. Intellectual honesty is an awareness of the limits of one’s own knowledge coupled with an openness to accept new ideas based on honest reasoning, careful observation, and logical consistency — irrespective of in-group/out-group loyalties.

According to the philosopher and neuroscientist Sam Harris, intellectual honesty is what “allows us to stand outside ourselves and to think in ways that others can (and should) find compelling. It rests on the understanding that wanting something to be true isn’t a reason to believe that it is true.” [3]

Intellectual honesty is an awareness of the limits of one’s own knowledge coupled with an openness to accept new ideas based on honest reasoning, careful observation, and logical consistency — irrespective of in-group/out-group loyalties.

Intellectual honesty, Harris argues, is what makes real knowledge possible. In the pursuit of truth, intellectual honesty should be the principle that trumps all others; it is the value that produces (and maintains) real knowledge. While facts are still important, they are not as important as the process by which they are gathered, debated, and agreed upon. If truth is a structure, then intellectual honesty is the architecture.

According to Jasanoff, public truths in democratic societies “are precious collective achievements, arrived at just as good laws are, through slow sifting of alternative interpretations based on careful observation and argument and painstaking deliberation among trustworthy experts.” Furthermore, the durability of public facts “depends not on nature alone but on the procedural values of fairness, transparency, criticism, and appeal in the fact-finding process” — the very virtues that are built into the ethos of science. (Harris would probably agree; he writes: “The core of science is not controlled experiment or mathematical modeling; it is intellectual honesty.”)

The inability (or refusal) to reason honestly is no longer just a personal or individual problem — it has become a social problem for the entire world.

When considering whether or not something is true, Harris contends, “one is either engaged in an honest appraisal of the evidence and logical arguments, or one isn’t.” [4] Merely admitting this has the potential to transform the way we think about truth in the public sphere.

In a society that fosters a culture of intellectual honesty, factual disagreements will still exist, but they would retreat into the background. Jasanoff concludes that even if factual disagreements in such a society are not resolved to everyone’s satisfaction, “the possibility remains open that one can return some other day, with more persuasive data, and hope the wheel of knowledge will turn in synchrony with the arc of justice.”

[1] Dobelli, Rolf. The Art of Thinking Clearly. New York: Harper, 2013.

[2] Jasanoff, Sheila. “Back from the Brink: Truth and Trust in the Public Sphere.” Issues in Science and Technology 33, no. 4 (Summer 2017).

[3] Harris, Sam. Letter to a Christian Nation. Vintage Books, 2008.

[4] Harris, Sam. “Intellectual Honesty.” What Scientific Term or Concept Ought to Be More Widely Known?, Edge, 2017.

Originally published at markbessoudo.com on September 5, 2017.
