Saturday, March 23, 2019

Cognitive Bias: Even Einstein Gets Fooled

I am hard at work on a book on bipolar recovery, the third in my Bipolar Expert Series. A vital part of our recovery involves critical thinking. Unfortunately, our cognitive biases get in our way. Picking up from my previous blog post, Philosophy: Because We're Mindless Without It ...

So, you think you’ve got the answer - to life, to anything - well, think again. As fate would have it, our brains default to answers without the inconvenience of grappling with the questions. Too often, wrong answers serve our needs every bit as well as right ones. Our DNA programs us to find patterns in our everyday experiences and recognize their significance; based on these patterns, we plan and act accordingly. Wash, rinse, and repeat - again and again - and these patterns become part of our default operating system, how we view the world and everything that resides therein. We may see ourselves as rational beings residing within an objective reality. Socrates would only laugh.

Inevitably, our patterns shape our identity. Then the real problems start. By now, we are so invested in our own sense of self that we will defend it at all costs, however badly that sense of self may be serving us. Our thinking - our capacity to face the facts fearlessly and make the appropriate course corrections - has been hijacked. Like a slave eunuch bodyguard protecting a decrepit potentate, our thinking parries and thrusts against its perceived enemy, meeting facts and alternative viewpoints with rationalization and denial. Perversely, a successful rationalization lights up the brain’s pleasure circuits, encouraging us to persist in our delusions. Our warped reality holds, our identity stays intact, the sneering potentate tosses his slave a gold coin.

Change is possible, but first we need to know what we’re up against. That way, we won’t give up on ourselves. Trust me, even scientists - the very apotheosis of rigorous thought - can be as willfully ignorant as the worst political extremist or the most naive conspiracy theorist. Cast your mind back to high school. Here we were - you and I - the C-students with no choice but to work with the brains fate issued us, forever rolling a rock uphill, written off by our teachers as underachievers. There they were, the A-students, flaunting their neurons, always showing up prepared, always handing in their homework on time, effortlessly solving all those “two trains leave the station” puzzles without resorting to pencil and paper.

These were your future Einsteins, but even Einstein suffered from a major flaw in his operating system, one that led him on a 30-year wild goose chase in pursuit of a theory of everything that ended in nothing. For all his intellectual prowess, Einstein couldn’t reconcile himself to quantum physics. The major sticking point for Einstein was that quantum physics deals in probability rather than certainty, where the relationship between cause and effect appears decidedly more casual than common sense would lead us to believe. “God does not play dice with the universe,” he famously replied in a letter.

The technical term for Einstein’s failing is cognitive bias, another way of saying there is no end to the ways our brains can play tricks on us. The field was pioneered by the Israeli psychologists Amos Tversky and Daniel Kahneman beginning in the 1970s. For their work, Daniel Kahneman shared the 2002 Nobel Prize in Economics (Nobels are not awarded posthumously, thus disqualifying the late Dr Tversky). In his best-selling 2011 book, Thinking, Fast and Slow, Dr Kahneman articulates, among many other things, our tendency toward “optimistic bias.” Part of this has to do with our overconfidence in completing projects on time and within budget, based on an inflated sense of our ability to control events. Whether remodeling a kitchen or conducting an overseas war, apparently we can all benefit from a good healthy dose of pessimism.

Another is “framing.” Say you were told that a certain surgical procedure had a 10 percent failure rate. Would you opt for the surgery? Maybe not. What if you were told, instead, that the procedure had a 90 percent success rate? Maybe yes. The information is the same both times, but the way it is presented in each case encourages a different response.

Perhaps the most glaring cognitive bias is “confirmation bias,” our tendency to accept facts that fit inside our belief systems and reject those that fall outside. Nowhere is this on better display than in what passes for political discourse in the US. For instance, according to a 2011 PRRI/Brookings survey, when Obama was in office, only 30 percent of white evangelicals thought that a politician who commits personal immoralities is fit for public office. In 2016, with Trump as the Republican presidential nominee, a whopping 72 percent suddenly decided that personal immorality did not matter.

Democrats are not immune: When Clinton was in office, feminists found themselves in the bizarre position of defending the President’s grossly inappropriate sexual behavior and attacking his victims.

If you think I’m engaging in “false equivalence” here, you have a very good point. Part of false equivalence has to do with treating one side’s minor failing the same as the other’s major failing. My response is that reality is messy and full of contradictions. We may long for simple answers, but their only useful purpose is to make us temporarily feel good. We need to do better.

A sampling of more cognitive biases (from Wikipedia): “anchoring” (relying too heavily on one piece of information, such as the first source you encountered while researching a particular topic), the “availability heuristic” (such as giving too much weight to emotionally charged memories), the “Dunning-Kruger effect” (too stupid to know you’re stupid), the “gambler’s fallacy” (such as thinking flipping five heads in a row raises the chance of tails coming up on the next flip), “hyperbolic discounting” (a preference for immediate pay-offs), “overconfidence” (99 percent certain answers turn out to be 44 percent wrong), “status quo bias” (a preference for things staying the same).
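The gambler’s fallacy is one bias you can actually put to the test. Here is a minimal simulation sketch (the helper name and parameters are my own invention, not from any source): flip a fair coin many times, and every time five heads come up in a row, record what the very next flip does. If the fallacy were true, tails would be “due” and show up more than half the time; independence says it stays at 50/50.

```python
import random

def next_flip_after_streak(trials=100_000, streak=5, seed=42):
    """Flip a fair coin `trials` times. Whenever the previous `streak`
    flips were all heads, record the very next flip. Returns the
    proportion of those recorded flips that came up tails."""
    rng = random.Random(seed)
    run = 0            # current run of consecutive heads
    tails_after = 0    # tails observed immediately after a streak
    samples = 0        # flips observed immediately after a streak
    for _ in range(trials):
        heads = rng.random() < 0.5
        if run >= streak:        # the last `streak` flips were all heads
            samples += 1
            tails_after += not heads
        run = run + 1 if heads else 0
    return tails_after / samples

print(round(next_flip_after_streak(), 3))  # hovers near 0.5, not above it
```

Run it with any seed you like: the proportion of tails after a streak stays near one half, which is exactly what the fallacy denies.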

Then there are the various social biases, such as “authority bias” (accepting as Gospel truth the word of your favorite authority) and “ingroup bias” (such as favoring those one perceives to be members of one’s tribe).

Psychology’s five-factor model (FFM) affords us insights into how our preferences seem to be shaped from birth, including “openness to new experience” and “conscientiousness,” which places a high premium on loyalty and duty. Not surprisingly, political liberals are inclined toward openness to new experience (as well as favoring cats over dogs), while conservatives score high on conscientiousness and prefer dogs.

Are our beliefs truly that predetermined? Are our brains really that hopeless? When it comes to shaping our destiny, are we as powerless and mindless as steel filings drawn to a magnet? The metaphor comes from Oscar Wilde. In a parable, he wrote how the filings deluded themselves into thinking they were headed toward the magnet of their own free will. In his story, some of the filings “were heard to say it was their duty to visit the magnet.”

Socrates, where are you now?

John McManamy is the author of Living Well with Depression and Bipolar Disorder and is the publisher of the Bipolar Expert Series, available on Amazon.

Follow John on Twitter at @johnmcman and on Facebook.
