Thursday, March 28, 2019

Critical Thinking: What We Can Learn From Sherlock Holmes and Charles Darwin

The following is from a book I'm working on about bipolar recovery. This post is a continuation from my previous post, Cognitive Bias: Even Einstein Fooled ...

Our best defense against our cognitive failings is critical thinking. Two iconic figures leading very different lives employed remarkably similar methods. Both were keen observers and relentless appliers of logic, highly disciplined thinkers never jumping to conclusions but always ready to entertain a wild idea. The first is the most famous fictional detective of all time, the second the most famous natural scientist. In looking for role models who know how to use their brains, one can do no better than Sherlock Holmes and Charles Darwin.

Sherlock Holmes

Holmes and Dr Watson have just met. Instantly, Holmes deduces that Watson has returned from military service in Afghanistan. “From a drop of water,” he informs his new fellow lodger, “the logician could infer the possibility of an Atlantic or a Niagara without having seen or heard of one or the other.” 

That drop of water could be a man’s fingernails, his trouser knees, his shirt cuffs - clear give-aways to a person’s occupation and identity, and, perhaps, his or her complicity in a dastardly crime. “The world,” says Holmes, “is full of obvious things which nobody by any chance ever observes.”

Time for Holmes's first mystery with Watson, "A Study in Scarlet." The body is on the floor; blood-red letters on the wall spell "Rache." Action …

After careful examination of the crime scene, Holmes informs two Scotland Yard detectives that the murderer was a man. Not only that: “He was more than six feet high, was in the prime of life, had small feet for his height, wore coarse, square-toed boots and smoked a Trichinopoly cigar. He came here with his victim in a four-wheel cab, which was drawn by a horse with three old shoes and a new one on his off fore leg. In all probability the murderer had a florid face, and the finger-nails of his right hand were remarkably long.” 

Also, the victim had been poisoned, and by the way: “‘Rache,’ is the German for ‘revenge,’ so don’t waste your time looking for Rachel.”

In the course of solving no end of mysteries, Holmes begins to spot a pattern to some of the crimes. It was as if a criminal mastermind were at work, leading a secret crime empire. A far-fetched idea, to be sure, but it turns out to be the only one that makes sense. As Holmes explained to Watson on numerous occasions: “When you have eliminated the impossible, whatever remains, however improbable, must be the truth.” Sure enough, the mastermind is unmasked, a certain Professor Moriarty. In a confrontation, Holmes and his nemesis fall off a cliff together. So, with Holmes being temporarily dead and otherwise indisposed, now is a good time to cue up our second role model.

Charles Darwin

Darwin comes from a tradition of gentlemen scientists, who, in their leisure time, bequeathed to us astronomy, physics, biology, geology, economics, and other disciplines, and in the process changed how we regard our universe, not to mention each other. Starting around the beginning of the seventeenth century and continuing into the twentieth, these gentlemen, plus the odd cleric or two (Darwin, who once trained for the clergy, nearly falls into both categories) took to collecting rocks and butterflies and such and looking up at the sky and taking long walks in the countryside and poking pointed objects at things. With precious few scientific instruments to work with, their greatest lab and field tool proved to be their heightened powers of observation. But it didn’t stop there. Observation may have given them the data, but logic gave them the ability to run with it. Otherwise, Darwin’s best-known work might have been his multi-volume monograph on barnacles. That opus consumed Darwin for seven years. As a true natural scientist, he simply couldn’t help himself. 

When Darwin commenced his work on barnacles in 1847, his ideas on natural selection were already well-formed. Back in the 1830s, he spent five years aboard the Beagle as a naturalist. Traveling down the eastern South American coast, he kept observing seashells where they weren’t supposed to be: on towering coastal cliffs, among fossil bones of extinct mammals, deep in the interior. Up the western coast, high in the Andes, he spotted more seashells. His shipboard reading included Charles Lyell’s Principles of Geology. Already, still in his early twenties, Darwin’s brain was being primed to ask big questions, solve big mysteries.

By the time the Beagle reached the Galápagos Islands, Darwin’s attention was mostly on geology. He dutifully collected mockingbird and finch specimens, but it was only on later examination that he realized their significance: The mockingbirds separated themselves out by species, depending on location from island to island. The finches displayed a wide variety of beaks, suited to local conditions - one type of beak for feeding on seeds, another for insects, another for fruit, and so on. From a metaphorical drop of water here, a drop of water there, Darwin was beginning to imagine his own Niagara. It was inquiry from the bottom up - based on careful observation - not the top down. Going at it the other way is a bad idea. In the words of our favorite detective: “It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”

The Darwin Reaction

In late 2010, back when I lived in rural San Diego, I took it upon myself to visit the Creation and Earth History Museum in Santee, 10 or 12 miles from my home. Until Ken Ham’s Creation Museum opened in Kentucky in 2007 and his nearby Ark Encounter park in 2016, Santee was the place to visit for your anti-Darwin fix. According to a display, “Helium Diffusion Dates Earth at 6,000 years.”

A large-scale model of the Ark there illustrated the plausibility of floating a zoo in a wooden boat. According to a display, after the flood waters receded, Noah’s sons, together with the Ark’s animals, went their separate ways and across the oceans, via land bridges formed by a Flood-induced ice age. That’s the ice age, singular - according to creationist belief, there was only one.

Another display asserted that Neanderthals were modern, “descended from Adam and Noah.” Their explanation: “Some compare Neanderthals to Eskimos. That would be consistent with humans who lived during the Ice Age.”

The museum makes a show of masking its talking points in science. Thus, the first law of thermodynamics and the principle of homeostasis are cited in support of the proposition that creation is constant and cannot be added to. It also boasts of a number of geologists affiliated with the museum. A large display shows how recent cataclysms such as Mt St Helens better account for the formation of the earth than the more time-consuming processes of plate tectonics.

It’s almost as if creationism were the true science. Indeed, we are informed that belief in evolution stemmed from “evolutionary religions,” ones that “reject the existence of a personal god who created all things.” A portrait gallery reveals the “bad fruits of evolution.” Alongside Charles Darwin, we have Karl Marx and Adolf Hitler.

In the bookstore on the way out, I came across a book titled Dinosaurs or Dragons? You can’t make this stuff up. But a 2018 reviewer on TripAdvisor leaves us with an entirely different impression: “We toured this museum in late December,” she wrote, “while visiting San Diego from PA.” She goes on to say: “Being lovers of science, natural history, world history, and Biblical truth, we knew we had to visit here. We thought the quality displays and fascinating content throughout could rival any secular museum, and we really appreciated the reprieve from the unsubstantiated ‘evolution’ and ‘Big Bang’ origins nonsense we find at typical museums (talk about ‘anti-science’!). Sure, there is a fair amount of reading required to get the most out of this museum (learning does take some mental effort, after all). … We highly recommend this museum if you want to learn and expand your knowledge of truth!”

It would be easy to write off the correspondent as some kind of crackpot, but again, being human she is contending with the same brain as the rest of us, loaded with the same cognitive traps. Moreover, her views almost certainly sit closer to mainstream opinion than those of people who watch TED Talks and listen to NPR. According to a 2018 Pew survey, only one in three adults in the US say humans evolved through natural processes. Roughly another half are prepared to accept evolution, provided it was guided by God or a higher power.

By all means, continue to believe in God, but also keep yourself open to the possibility that our existence here - on this infinitesimally small plot of time and space - may be nothing more than a random series of accidents. The second book in my Bipolar Expert Series, IN SEARCH OF OUR IDENTITY, goes into this in considerable detail. Knowing, for instance, that we are working with the same neurons as snails, with brains organized like those of rats and mice, with almost identical genomes to our primate cousins, we gain invaluable insights into how we think and feel and behave. The wisdom gained from these insights gives meaning to our experience and points the way to our recovery. Armed with our new tools of critical thinking, we dare to turn, “I think, therefore I am,” into something greater: “I think, therefore I will be.” A new you, at peace with yourself, at peace with the world. Take home message: Worship God, think like Darwin.

Final Word

We can’t leave Darwin without a brief word on “falsification.” Recall how Darwin observed seashells where they weren’t supposed to be. Now imagine another seashell find, again in an unexpected place, this time in the Canadian Shield, embedded in rock strata from the Precambrian, when the earth was only just beginning to solidify. Such a find would indubitably torpedo evolution, as any fan of Darwin would readily acknowledge. This type of acceptance of the rules is what keeps the game honest. Science works on the principle that while there may be experts in the field, there are no authorities. From the perspective of someone arguing from authority, though, it makes no difference where the seashells turn up - their conclusion will always be the same. The facts don’t matter. They are bound by no rules. It’s a fixed game.

To bring this down to earth, the greatest inquiry in your life will be into yourself. You need to be your own Darwin. When those metaphorical seashells in your world turn up in unexpected places, you need to be asking yourself hard questions, and you need to be doing it with the urgency of someone whose life depends on it. It does.

To the end, Darwin remained the keen and meticulous observer. His two blockbusters - On the Origin of Species and The Descent of Man - may have changed the world, but his last published book, a 326-page volume, reveals a man true to his roots, dedicated to the pursuit of knowledge for the sake of knowledge, forever a servant to the facts. The title: The Formation of Vegetable Mould Through the Action of Worms, with Observations on Their Habits.

John McManamy is the author of Living Well with Depression and Bipolar Disorder and is the publisher of the Bipolar Expert Series, available on Amazon.

Follow John on Twitter at @johnmcman and on Facebook.

Become a Patron!

Saturday, March 23, 2019

Cognitive Bias: Even Einstein Gets Fooled

I am hard at work on a book on bipolar recovery, the third in my Bipolar Expert Series. A vital part of our recovery involves critical thinking. Unfortunately, our cognitive biases get in our way. Picking up from my previous blog post, Philosophy: Because We're Mindless Without It ...

So, you think you’ve got the answer - to life, to anything - well, think again. As fate would have it, our brains default to answers without the inconvenience of grappling with the questions. Too often, wrong answers serve our needs every bit as much as right ones. Our DNA programs us to find patterns in our everyday experiences, recognize their significance, and, based on these patterns, plan and act accordingly. Wash, rinse, and repeat - again and again - and these patterns become part of our default operating system, how we view the world and everything that resides therein. We may see ourselves as rational beings residing within an objective reality. Socrates would only laugh.

Inevitably, our patterns shape our identity. Then the real problems start. By now, we are so invested in our own sense of self that we will defend it at all costs, however horrifically our particular sense of self may be working for us. Our thinking - our capacity to face the facts fearlessly and make the appropriate course corrections  - has been hijacked. Like a slave eunuch bodyguard protecting a decrepit potentate, our thinking parries and thrusts against its perceived enemy, meeting facts and alternative viewpoints with rationalization and denial. Perversely, a successful rationalization lights up the brain’s pleasure circuits, encouraging us to persist in our delusions. Our warped reality holds, our identity stays intact, the sneering potentate tosses his slave a gold coin.

Change is possible, but first we need to know what we’re up against. That way, we won’t give up on ourselves. Trust me, even scientists - our very models of rigorous thought - can be as willfully ignorant as the worst political extremist or the most naive conspiracy theorist. Cast your mind back to high school. Here we were - you and I - the C-students with no choice but to work with the brains fate issued us, forever rolling a rock uphill, being written off by our teachers as underachievers. There they were, the A-students, flaunting their neurons, always showing up prepared, always handing in their homework on time, effortlessly solving all those “two trains leave the station” puzzles without resorting to pencil and paper. 

These were your future Einsteins, but even Einstein suffered from a major flaw in his operating system, one that led him on a 30-year wild goose chase in pursuit of a theory of everything that ended in nothing. For all his intellectual prowess, Einstein couldn’t reconcile himself to quantum physics. The major sticking point for Einstein was that quantum physics deals in probability rather than certainty, where the relationship between cause and effect appears decidedly more casual than common sense would lead us to believe. “God does not play dice with the universe,” he famously replied in a letter.

The technical term for Einstein’s failing is cognitive bias, another way of saying there are no end of ways our brains can play tricks on us. The field was pioneered by the Israeli psychologists Amos Tversky and Daniel Kahneman beginning in the 1970s. For their work, Daniel Kahneman shared the 2002 Nobel Prize in Economics (Nobels are not awarded posthumously, thus disqualifying the late Dr Tversky). In his best-selling 2011 book, Thinking, Fast and Slow, among many other things, Dr Kahneman articulates our tendency toward “optimistic bias.” Part of this has to do with our unwarranted overconfidence in completing projects on time and within budget, based on our inflated sense of our ability to control events. Whether remodeling a kitchen or conducting an overseas war, apparently we can all benefit from a good healthy dose of pessimism. 

Another is “framing.” Say you were told that a certain surgical procedure had a 10 percent failure rate. Would you opt for the surgery? Maybe not. What if you were told, instead, that the procedure had a 90 percent success rate? Maybe yes. The same information both times, but the way it was presented in each case encourages a different response.

Perhaps the most glaring cognitive bias is “confirmation bias,” where we tend to accept facts that fit inside our belief systems and reject those that fall outside. Nowhere is this on better display than in what passes for political discourse in the US. For instance, according to a 2011 PRRI/Brookings survey, when Obama was in office, only 30 percent of white evangelicals thought that a politician who commits personal immoralities is fit for public office. In 2016, with Trump as the Republican presidential nominee, a whopping 72 percent suddenly decided that personal immorality did not matter.

Democrats are not immune: When Clinton was in office, feminists found themselves in the bizarre position of defending the President’s grossly inappropriate sexual behavior and attacking his victims.

If you think I’m engaging in “false equivalence” here, you have a very good point. Part of false equivalence has to do with treating one side’s minor failing the same as the other’s major failing. My response is that reality is messy and full of contradictions. We may long for simple answers, but their only useful purpose is to make us temporarily feel good. We need to do better.

A sampling of more cognitive biases (from Wikipedia): “anchoring” (relying too heavily on one piece of information, such as the first source you encountered while researching a particular topic), the “availability heuristic” (such as giving too much weight to emotionally charged memories), the “Dunning-Kruger effect” (too stupid to know you’re stupid), the “gambler’s fallacy” (such as thinking flipping five heads in a row raises the chance of tails coming up on the next flip), “hyperbolic discounting” (a preference for immediate pay-offs), “overconfidence” (99-percent-certain answers turn out to be 44 percent wrong), “status quo bias” (a preference for things staying the same).

Then there are the various social biases, such as “authority bias” (accepting as Gospel truth the word of your favorite authority) and “ingroup bias” (such as favoring those one perceives to be members of one’s tribe).

Psychology’s five-factor model (FFM) affords us insights into how our preferences seem to be shaped from birth, including whether or not we are “open to new experience,” and “conscientiousness,” which places a high premium on loyalty and duty. Not surprisingly, political liberals are inclined toward openness to new experience (as well as favoring cats over dogs) while conservatives score high on conscientiousness and prefer dogs.

Are our beliefs truly that predetermined? Are our brains really that hopeless? When it comes to shaping our destiny, are we as powerless and mindless as steel filings drawn to a magnet? The metaphor comes from Oscar Wilde. In a parable, he wrote how the filings deluded themselves into thinking they were headed toward the magnet of their own free will. In his story, some of the filings “were heard to say it was their duty to visit the magnet.”

Socrates, where are you now?

John McManamy is the author of Living Well with Depression and Bipolar Disorder and is the publisher of the Bipolar Expert Series, available on Amazon.

Follow John on Twitter at @johnmcman and on Facebook.

Become a Patron!

Friday, March 8, 2019

Reprise: Philosophy, Because We're Mindless Without It

The following is a slightly edited version of a post that appeared here ten years ago. I will be using this as a part of a chapter in my book on recovery and bipolar. The chapter has to do with developing our critical thinking skills. Lord knows, we need to bring back this lost art. Enjoy ...

Recovery begins in our heads, more specifically, our brains. Theoretically, thanks to neuroplasticity, it is possible to build ourselves a better brain. Barring that, there is always room for putting the one we’re stuck with to better use.

My blog, "Knowledge is Necessity," features an image of two philosophers. One, pointing a finger to the sky, is Plato. The other, with his palm facing the ground, is Aristotle. The two form part of a much larger painting by Raphael - his masterpiece - titled "The School of Athens." The fresco portrays more than twenty identifiable Greek philosophers - as well as a bunch of unidentifiable ones - clumped in small groups engaged in enlightened discourse.

What's so important about philosophy? Ten or 11 years earlier, I happened to catch the first three or four episodes in a 60-lecture video series titled "Great Ideas in Philosophy." Oxford scholar and Georgetown professor emeritus Daniel Robinson explained that something extraordinary happened in ancient Greece.

Prior to Greek philosophy, Professor Robinson pointed out, thinking was essentially religious. Entire cultures were organized around the principle of encouraging everyone to think the same. No one questioned handed-down beliefs.

What the Greeks did essentially changed everything. Socrates and others urged their followers to think for themselves, to take nothing for granted, to challenge everything. In the face of Socrates' withering inquisitions, lazy thinking didn't stand a chance. Athens, it appears, wasn't quite ready for this, and Socrates paid in full measure.

His student, Plato, managed to die in bed, as did Plato's student Aristotle. Without them, our frontal lobes would have nothing to do. Not only did they turn thinking into a profession, they gave us the tools to think. The world was never the same. Every field of human enquiry bears their indelible stamp: Science, government, the arts, human nature, even religion. The way we appreciate Jesus is through the intellectual framework built by Plato and Aristotle. "The School of Athens" hangs in the Vatican.

Then there is Diogenes. You might refer to Diogenes as the anti-philosopher. When Plato described man as a "featherless biped," Diogenes handed him a plucked chicken. In Raphael's painting, Diogenes is seated alone, as if shunned by the others. Diogenes serves as a forceful reminder that yesterday’s bold free-thinking is today’s mind-crushing orthodoxy. This tends to happen when we organize our connected thoughts into schools of thought.

The antidote is to be our own philosopher. We learn, then challenge everything we learn. Then we relearn.

Knowledge is necessity, but knowledge is also elusive. It is something we strive for, rather than possess. If someone claims to have it, and offers you a piece of it - stop, think. What would Socrates do?

John McManamy is the author of Living Well with Depression and Bipolar Disorder and is the publisher of the Bipolar Expert Series, available on Amazon.

Follow John on Twitter at @johnmcman and on Facebook.

Become a Patron!

Sunday, March 3, 2019

Miriam's Tambourine

The following is from a draft book I'm working on about bipolar recovery. In earlier blog posts, we discussed being the hero in your own journey. We continue with that theme ...

Ma nishtana, asks the youngest at the Seder table. “Why is this night different from all other nights?” The occasion is the Jewish celebration of Passover, commemorating God delivering Israel from Egypt. Five or six dishes serve as gastronomic memory aids - bitter herbs, for instance, a reminder of their condition as slaves, matzah in acknowledgment of their hasty departure.

Upon their safe passage out of Egypt, Miriam the prophetess played her tambourine and sang and there was much rejoicing. Forty years of trial lay ahead, and after that a seven-century path to another Exile, but for right now this was a time for celebration. God tests us, says Ecclesiastes, in order to show we’re no different than beasts. Alas, we meet the same end, drawing the same breath. There is no let-up. God is messing with us.

What is the point?

“Someday I will be laughing at this,” I recall saying to the crisis intervention team that took one look at me and gave me my diagnosis of bipolar. That was back in early 1999. So what kind of a person would I have to be to one day laugh at my current situation? Someone a bit more at ease with himself, in better shape to take on the next round of trials? Someone who can one day laugh? Maybe that’s the point. 

Chronologically, the Hebrew Scriptures end on the return from the Babylonian Exile and the rebuilding of the Temple. The story is set out in the Book of Ezra, but one pivotal scene exists only in the imagination. So - imagine: A remnant of a remnant has returned to the land of their ancestors. Somehow, away from home, against all odds, they had managed to keep their faith alive. But now here they are, fearful and cold, huddled together on a windswept peak, standing among the ruins of their old Temple. Their Elder gathers his people closer and tells them a story, one from a faraway time and place.

The people listen, they weep, they take heart. They resolve to recommit themselves to renewal, to rebuild - their Temple, their faith, their nation, themselves. There would have been no shortage of arresting images to give them cause for inspiration: God sending a rainbow, Moses venturing to Sinai, David entering Jerusalem. But the Elder summons another one, no less eye-opening - the spectacle of an ecstatic woman smacking on her tambourine.

Imagine: What kind of person would you have to be, having just narrowly escaped Pharaoh’s army, with no home to go back to, with a long road of trials just ahead, to inexplicably find cause for celebration? 

One day, I will laugh at this. Faith in God, that’s easy. Faith in yourself, that’s hard. Could it be that among that remnant on that hill, a girl, emboldened by the story she just heard, said: “Father, mother, one day, I too will play the tambourine. One day, I too, will celebrate.”

And perhaps the girl’s parents reflected upon this. Yes, one day, on this very hill, their daughter would lead her people in celebration, playing on her tambourine. But first, they remind her, looking upon the ruins of their Temple, there is work to do, rebuilding to be done.

John McManamy is the author of Living Well with Depression and Bipolar Disorder and is the publisher of the Bipolar Expert Series, available on Amazon.

Follow John on Twitter at @johnmcman and on Facebook.