Wednesday, December 28, 2011

The Year That Was: My 2011 Highlights and Lowlights

This is the time of the year for looking back, a la Time Magazine and CNN. Following is my personal (and highly idiosyncratic) view on how the year unfolded:

Person of the Year: The Three Stooges 

This was a no-brainer. Anytime I was stuck for a pictorial representation for a blog piece, these guys delivered. Need something to illustrate the fine points of personality? Moe, Larry, and Curly - and sometimes Shemp - were there. The intricacies of the human condition? Who better? The state of psychiatry today? I rest my case.

Historical Person of the Year: Adolf Hitler

Investigating historical figures sheds endless light on timeless behaviors and current conditions. This year’s crop included Lincoln (depressive realism), Ayn Rand (greed), Phil Ochs (depression and empathy), LBJ (bipolar grandiosity), and Chairman Mao (Machiavellianism). We even used President Obama to shed light on the unfortunate condition of being chronically normal. And for pure evil - who better to illustrate the point? Wait, that’s not ...

Supreme Being of the Year: God

When I tell people this blog is about everything from God to neurons I really mean it. Okay, God and I have issues. In fact, earlier this year, in a post here, I reported that I fired God. No one owns the truth about God, but when we’re looking to explain life, the universe, and everything, who always turns up in the conversation? My own proof of God? Look at us - someone up there has to be laughing.

Neurotransmitter of the Year: Dopamine (of course)

From a PowerPoint slide presented by Nora Volkow, head of the National Institute on Drug Abuse, reported here earlier this year: “All drugs of abuse increase dopamine in the nucleus accumbens. But stress does, too.”

Or this, from a piece on deciding on a partner: “When all is said and done, it is the ventral tegmental area (VTA) that rules, not the parts of the brain we actually think with. The VTA is the dopamine-sensitive region in the midbrain that mediates pleasure and reward.” 

Trust me, compared to dopamine, serotonin is a wimp.

As I like to tell people, reported in another piece here: “If you think you are experiencing God - it’s probably dopamine. If you think you are experiencing love - it’s probably dopamine. That doesn’t mean God or love is not real, but we know dopamine is.”

Ah, the God to neurons thing again.

Psychiatrist of the Year: Emil Kraepelin (second year running)

As I reported last year: “Okay, he’s sort of dead - well, completely dead - which is a rather large technicality. But even dead, this guy leaves the live psychiatrists for dead.”

This year, psychiatry excelled at playing dead. As those of you with at least two working neurons are aware, last year journalist Robert Whitaker published his eye-opening “Anatomy of an Epidemic,” which raised the startling proposition that mental illness is on the rise because of meds, not despite meds.

We waited for an intelligent response from psychiatry. And we waited - and waited. Except for a sicko hissy fit from psychiatry thought leader Andrew Nierenberg, psychiatry played dead. And continued to play dead.

This was like waiting for Godot. Last month, I had no choice but to conclude: “So, for right now, in the absence to date of any credible marshaling of the facts from psychiatry, Whitaker stands as the most authoritative voice on psychiatric treatment.”

Somehow, I cannot imagine Kraepelin - the father of diagnostic psychiatry who knew far more about manic-depression in 1921 than the failed proctologists who mindlessly dispense antidepressants today - sitting out this vitally important conversation.

Condition of the Year: Crazy

Crazy is not a dirty word. “Here’s to the crazy ones,” begins the classic 1997 Apple ad. “The misfits, the rebels, the troublemakers, the round pegs in the square holes ...  They push the human race forward, and while some may see them as the crazy ones, we see genius, because the ones who are crazy enough to think that they can change the world, are the ones who do."

Here’s to you, Steve Jobs.

That was the year that was. 

Who knows what next year will bring? Whatever happens, we’re all in it together. Many thanks to all of you here who gave me a reason to get out of bed each morning.

Tuesday, December 27, 2011

Robert Sapolsky Talks About the Biology of Human Behavior

We think in categories. But there are these problems. The first one being that when you think in categories you underestimate how different two facts are when they fall in the same category. When you think in categories you overestimate how different they are when there happens to be a boundary in between them. And when you pay attention to categorical boundaries you don’t see big pictures.

The speaker was leading neuroscientist Robert Sapolsky, addressing a 2010 class in human behavioral biology at Stanford. Dr Sapolsky is a renowned researcher into how stress influences behavior, an endeavor that ranges from tracking brain circuitry in lab animals to studying baboons in the wild. He also possesses that rare gift of being able to communicate complex topics to the general public, with a number of highly readable mainstream books to his credit, including “Why Zebras Don’t Get Ulcers.”

My whole approach to mental illness owes much to Dr Sapolsky, and it has been that way for years.

“How many people believe in free will?” he asked. He smiled. “That’s going to change,” he advised. Then he addressed the classic nature-nurture debate. “Who thinks human nature is all explained by nature?” he asked. “Who thinks it’s all explained by nurture?” Another smile. “Who thinks there’s a magnificent fascinating nuanced interaction between nature and nurture?”

The categories thing again. Thinking in categories does make it easier for us to remember stuff and evaluate stuff, Dr Sapolsky acknowledged. But there are a bunch of problems, especially if you overestimate the importance of the bucket you live inside of. “And thus everything about this behavior is explained by - a gene, a neurotransmitter, a childhood trauma - living inside one bucket.”

Human behavior is harder than that. On one hand:

Sometimes the stuff that’s going on in your body can dramatically influence what’s going on in your brain. (Such as the food we eat, with the notorious example of the “Twinkie defense.”) 

And on the other:

Sometimes what’s going on in your head will affect every single outpost in your body. (Such as trying to get to sleep as you are contemplating your own mortality. Chances are your heart rate will increase.) 

Rather, it’s more like ...

 ...the intertwining, the interconnections between your physiology and your behavior, the underlying thoughts, emotions, memories,  all of that, and the capacity of each to deeply influence the other under all sorts of circumstances.

First, we ask - what does the behavior look like? Then we ask what went on in that organism a half-second before that behavior occurred to cause it to occur? This is the world of what’s going on with neurons and circuitry, but ...

Just as we are about to get happily settled into that bucket, we push back a bit and say what smell, what sound, what sensory stimulation in the environment caused those neurons to get activated and produce that behavior? 

And then push it one step further behind, to hormone levels in the blood in the last few hours that changed how sensitive you are to those sounds and smells. Then we work our way further back through early development, fetal life, the genetic makeup of an individual, the genetic makeup of an entire species.

From an endocrinologist’s perspective, Dr Sapolsky goes on to say, Hormone X may explain a behavior. But Hormone X is coded by a gene, so we’re not just talking about endocrinology anymore - we’re talking about genetics. And genes are subject to selection, so we’re also talking about evolution. And if we’re talking about sounds and smells and so on - acute triggers for human behavior - by definition we’re also talking about fetal development, which determines how sensitive those systems are to those sorts of stimuli.

Here’s the pathological danger of thinking in buckets. Dr Sapolsky asks us to guess who said this:

Normal psychic life depends upon the good functioning of brain synapses, and mental disorders appear as a result of synaptic derangements. Synaptic adjustments will then modify the corresponding ideas and force them into different channels. Using this approach we obtain cures and improvements but no failures.

The speaker was Egas Moniz, developer of the prefrontal lobotomy, speaking on the occasion of receiving the 1949 Nobel Prize in Physiology or Medicine. (Yes, you read that right. No, you are not experiencing psychosis.) Adding our own spin to this, it would be very easy to attribute that statement to any leading psychiatrist bought out by the pharmaceutical industry (which would include just about all of them).

Moniz wasn't an isolated example, Sapolsky informs us. Hence the challenge of breaking out of our buckets.

Dr Sapolsky tells us we have three intellectual challenges. The first is recognizing human circumstances where there is nothing fancy about us whatsoever. “Some of the time we are just a plain old off-the-rack animal.” (Put two female hamsters in a cage, for instance, and their cycles will sync. Put two female humans in a dorm room together, and the same thing happens.)

The second challenge is that although we appear to be just like every animal out there, we do something different with the similarities. For instance, we get stressed by the inevitability of our mortality or by reading something awful that has happened to a child on the other side of the planet.

The flip side of this is we can have compassion and empathy for loved ones and strangers. “It’s the same boring physiology as every other animal out there and we are using it in a way that is unrecognizable.” 

The third challenge is when we are doing something that no other animal out there has anything remotely similar to. For example - a couple comes home, talks, has dinner, talks, goes to bed, has sex, talks, falls asleep. They do this the next day and 30 days running. 

“Hippos would be repulsed by this,” Sapolsky lets us know. "Hardly any animal has nonreproductive sex, let alone day after day, and nobody else talks about it afterward."

The old way of looking at human behavior was by thinking of the brain as an intricate clock with pieces you can take apart and study and then put back together. But it is not as simple as that, Dr Sapolsky informs us. Behavior is more like a cloud, “and you don’t understand rainfall by breaking a cloud down into its component pieces and gluing them back together.”

Dr Sapolsky believes everyone on earth should be forced to learn about behavioral biology. Whether we’re on a jury or voting or wondering about a family member sunk in depression, “we’re behavioral biologists all the time, so it’s probably a good idea we be informed ones.”


Check out Dr Sapolsky’s 57-minute talk on YouTube

Saturday, December 24, 2011

Rerun: A Christmas Poem

Twas the night before Christmas, when all through the place
Not a thought was racing, not even a trace;
The meds were all stashed, in the cabinet with care;
A warning to my neurons, behave and beware.

When out on the lawn there arose such a clatter,
Something bad was going down, something was the matter.
Away to the window I flew like a flash,
Oh crap, not again, not another stupid crash.

When what to my wandering brain should appear,
A dude in a sleigh with eight friggin’ reindeer.
Now Dasher! now Dancer! Please tell me I’m dreaming!
On Comet! on Cupid! Time to start screaming!

To the front of the porch! up against the wall!
Get 911 here right now, I’m headed for a fall.
A vision in my head, a harbinger of doom,
Now dash away! dash away! To the emergency room!

He was dressed all in rags, from his head to his foot,
And his clothes were all tarnished in ashes and soot.
He came through the sliding door, in the back entry;
No way to blame his appearance on a dirty chimney.

A bundle of stuff he had flung on his back,
Like a homeless person, with his life in a sack.
I’m the Ghost of Christmas Present, he said in my home.
Dude, I replied, you got the wrong poem.

His eyes - they were hollow, his skin a sickly yellow.
His mouth it trembled, like a defeated fellow;
A stump of a smoke he held tight in his teeth,
Looked like he hadn’t eaten anything in a week.

My old lady was upstairs, zonked out on her meds;
My kids were in the next room, asleep in their beds.
Only one thing to do, very plain to see,
Time to call 911, protect my family.

I put down the phone, in spite of myself,
Something in my brain, maybe an elf.
Set yourself down, I said, You’re in the right poem,
I’ll see what’s in the fridge, make yourself at home.

His eyes how they twinkled! His dimples how merry!
You got that part right, he told me, very very very!
No matter how much we have, how much we own,
We’re all of us homeless, till we find the right poem.

And laying a finger, alongside of his ear,
And, giving a nod, he was no longer here.
Was it a dream? Was it psychosis?
Does my doc need to up my meds, on even higher doses?

But the feeling was real, a peace I had never known;
I was in the right place, in the right poem.
And I heard him exclaim, in a voice that was my own,
“Check what’s inside the fridge. You have found yourself a home.”


First posted on HealthCentral three years ago. A happy - and giving - holidays to all. Today marks the third anniversary of my blog. Many thanks to all of you who entered my virtual home as strangers and stayed as friends.

Monday, December 19, 2011

Rerun: Where is God?

Christopher Hitchens' passing last week generated considerable discussion on his militant atheism, which you can read all about in his 2007 diatribe, "God is Not Great." The first major flaw in Hitchens' approach is that exclusively aiming his sights on fundamentalist nutjobs and their seriously disturbed world views is intellectually dishonest. The first rule of honest debate is to pick on someone your own size. This would have involved active engagement with a vast corpus of thoughtful theological heavyweights, past and present.   

The other flaw is that for all his intellectual brilliance, Hitchens exhibited a life of extreme spiritual stupidity. Let's put it this way - seeking Hitchens' outlook on matters metaphysical is as absurd as asking Larry the Cable Guy how to tie a Windsor knot.

Anyway, Karen Armstrong says it a lot better. This from a piece I posted in October last year ...

I just finished reading Karen Armstrong's "The Case for God," which does not actually make a case for God. Ms Armstrong is way too smart for that. The idea of some kind of infinite absolute outside our comprehension hardly lends itself to argument, much less proof, either theological or scientific. Believe it or not, until the nineteenth century, science and religion were pretty cool with that.

According to Ms Armstrong, religious fundamentalism - whether Christian or Muslim or other variety - is a relatively new phenomenon, as is most mainstream belief. A God beyond imagination does not submit to pat answers. Pat answers, in fact, are highly suspect. Define the infinite? Think about it. To define is to limit. To reduce God to our level of understanding is to lose touch with God.

Scripture, which raises more questions than answers, is always open to reinterpretation. Liturgy and observances are living meditations on eternal mysteries, not empty ritual. Through immersion in a practice, through rigorous inquiry, through opening oneself to new possibilities, new realizations emerge. One becomes a better person, in closer touch with God, whatever God is.

Throughout the ages, says Armstrong, religion was the means, not the end, and this applied to all faiths, irrespective of their surface differences. Even terms such as "dogma" and "belief" had different meanings and usages, more in the sense of entering into an ongoing and typically unpredictable dialogue rather than mindlessly yielding to foreordained assertions.

Ironically, the liberating spread of ancient Greek philosophy in the Middle Ages also gave rise to the type of stultifying over-intellectualism that would later set the scene for a hardening of attitudes. With the mass printing of Bibles, selected scriptural passages were deployed by competing faiths and sects to separate themselves out from one another.

Nevertheless, maintains Armstrong, faith and science maintained a sort of working mutual accord. Neither was trying to dictate to the other. Even the Galileo controversy, Armstrong argues, was overstated. (Galileo, it seems, was spoiling for a fight.) In any event, when the dust settled, religious and scientific authority appeared to be in harmony. Newtonian physics, in fact, seemed to prove the existence of God. Some higher power had to have set the mechanics of the universe in motion. This was as self-evident to Newton as it was to the rationalist philosophers his discoveries inspired.

Yes, Europe erupted into senseless religious warfare - change always exceeds society's capacity to peacefully absorb it - but with the emergence of a modern Europe came the belief that the solution to any problem (no matter how intractable) and the explanation to any phenomenon (no matter how mysterious) would yield to the power of reason.

God could be explained scientifically. Organized religion was comfortable with that. The Christianity of America's Founding Fathers was very different from the Christianity we practice today, irrespective of denomination. The catch is that reason has its limits. Alas, scientific enquiry breaks down in the pursuit of that which is beyond imagination.

On top of that, arid intellectualism failed to satisfy essential human needs, which set in motion a revivalist reaction. Still, no one seriously argued that Scripture trumped science. That would soon change. First, Charles Lyell showed that the earth was shaped - and still being shaped - by slow-moving forces spanning eons. Then Darwin came on the scene. Nevertheless, Armstrong is quick to point out, Darwin's ideas were readily accepted by the scientific community and not seriously challenged at first by religion.

That didn't last long. Scientism extremists became anti-religious. Religious extremists became anti-science. Even in mainstream religions, new emphasis was given to the literal interpretation of Scripture, with those presumed closer to God claiming an authority they never dared lay claim to before. Thus, in 1870 the doctrine of papal infallibility was adopted by the Catholic Church.

Nevertheless, by the 1960s, people were seriously asking, "Is God dead?" Mainstream church attendance was in sharp decline and religious fundamentalism was a fringe movement. A new secular culture was dawning.

Well, you know the rest of the story. In the US, today, religious fundamentalism is driving much of the political and social agenda. The history lesson, Armstrong tells us, is that every threat to the old order spawns an irrational reaction, with each new religious outbreak more bizarre than the one that preceded it. A lot of what passes for both mainstream and fundamentalist religion today, if I am reading Armstrong correctly, bears very little resemblance to the religion of the past.

The extremists in our midst - the likes of Dawkins and Sullivan - argue that our world would be a lot happier if we could somehow stamp out all religion and disabuse ourselves of the notion of God. But that, Armstrong argues, misses the point. Just because religion has no pat answers for God (even if some faiths profess to offer them) does not preclude the existence of God. And religion at its best - even those varieties we may find abhorrent - offers the prospect of bringing us closer to that which is beyond imagination.

Truth will always elude us, as does reality. But in our quest, we can arrive at reasonable approximations, which serve to launch us to our next round of approximations, then the next. Let the journey begin ...

Friday, December 16, 2011

Christopher Hitchens: An Appreciation

This is crazy. I just happened to be reading Christopher Hitchens’ recently released collection of essays, “Arguably,” when I found out he died today of complications from esophageal cancer. Christopher Buckley on the back cover cites Hitchens as “the greatest living essayist in the English language.” Now I have to regretfully disagree.

If I were modeling a pompous villain who justly gets his come-uppance for a novel I am not equipped to write, I would have no further to look than Hitchens. You simply could not get away from this insufferable unquotidian wanker. Leave your TV on for five minutes - the Shopping Channel, Lifetime, anything - and the ubiquitous and misanthropic Hitchens would materialize in all his Caliban glory, cortex fully loaded, glottis engaged, lips ablaze.

I think he was required to register his vocabulary with the police.

One of the joys of not having a TV these past 18 months was no Hitchens. But then a week or so ago his book miraculously materialized in my home. To me, this proves the existence of God. If I didn’t put the book there, who did? I’m sure if the militantly atheistic Hitchens had been confronted with the evidence, he would have recanted on the spot and taken up Holy Orders or something.

Okay, my brother dropped it off, but - surely - my brother had to have been working through God.

And therein lies the glory of Hitchens. Love him or loathe him, he had zero tolerance for sloppy thinking, a standard he ruthlessly applied to himself. Say what you want, this is a man who never insulted our intelligence by showing up to a gun fight with a knife.

Hitchens was always at his best in righteous indignation mode, whether in attacking Bill Clinton or Henry Kissinger or Mother Teresa. This from an essay, “Old Enough to Die,” an impassioned and carefully reasoned philippic against sending kids to the death chamber:

So a sober panel of robed figures, calmly reviewing the life-or-death case of a disturbed child, determines in writing that said child may be “factually” or technically innocent, but further determines that this is not really any of its business.

A little later: “This February, Sellers was led out of his cell and put down like a diseased animal.”

Ah, the Hitchens I love.

Thursday, December 15, 2011

Illustrating Depression and Bipolar

As most of you who follow this blog know, at the beginning of this year I essentially blew up mcmanweb and started over. The site was in serious need of updating, plus a facelift. My first phase involved a complete redesign, together with rewrites and reorganization of a lot of old articles. This consumed most of my time well into spring.

Throughout the rest of the year, I made incremental changes and additions.

I began mcmanweb around this time in 2000, with a small collection of articles on depression and bipolar. My goal was to create a comprehensive resource exploring mood disorders from every conceivable angle. Over time, I built up a collection of more than 300 articles (later pruned down to about 250), most of them written by me, but with some personal accounts from contributors.

As the years went on, I essentially turned over just about all my content, save the pieces on famous people and personal stories. But by this time last year, it was clear to me that I had fallen way behind. I’ll just mention one of the matters I had to deal with, which involves design and the organization of my content.

The key to a successful website is making it easy for visitors to find what they are looking for, fast, and to facilitate their going deeper and wider. My basic navigation since the beginning has been successful - with content organized under categories such as Mood, Treatment, Science, Stories, and so on - and I stuck with what worked.

Thus, from my home page - as well as every page on the site - you can click on a category, which will take you to a landing page with the articles containing the information you are looking for.

But how do you bind a site together? At the same time, how do you differentiate between categories? How, in essence, do you lend coherence to the reader’s experience?

This time, I decided that a collection of old (and a couple of new) masters’ paintings would define the look and feel of mcmanweb. A different old master would illustrate each category. Moreover, I carried forward these same old masters into every article. Thus, a Rembrandt for all my DSM articles, a Vermeer for my Treatment articles, and so on.

So here I am - today - nearly a year after I started the project, and it only just occurred to me that I never explained my choice of illustrations to my readers. I just remedied that a few minutes ago, with short explanations on all my landing pages, which I have reproduced here. So without further ado ...

The pic to illustrate Mood is a close-up of La Tour’s “Magdalen at Night.” The woman seems to question her very existence. We have all been there.

The pic to illustrate Behavior is a close-up of Holbein’s “Henry VIII.” The monarch’s defying the Pope and killing off half his serial wives takes care of the seriously disturbed side of the personality equation. His sixteenth century rock star status - he was an accomplished lutenist, singer, organist, and composer, and a generous patron of the arts - captures the creative and positive side. Mind you, Henry’s way of resolving personal domestic quarrels can also be regarded as creative.

The pic to illustrate The DSM-5 is a close-up of Rembrandt’s “Moses.” Was there any other choice?

The pic to illustrate Treatment is a close-up of Vermeer’s “Cavalier with Young Woman.” I also use Vermeer to illustrate “Recovery.” The Treatment pic has two people in it, suggesting the wisdom of seeking expert help. The Recovery pic has a solitary woman actively engaged in a pursuit, reinforcing the notion that we are in charge.

The pic to illustrate Recovery is Vermeer’s “The Lacemaker.” Same explanation as above.

The pic to illustrate Science is a close-up of Dali’s “Exploding Raphaelesque Head.” Dali was equally fascinated with Freud and the quantum building blocks of existence. This painting says it all.

The pic to illustrate Issues is a close-up of Raphael’s “The School of Athens.” Here we see Socrates engaging in a dialogue with a student. Socrates always challenged our cherished beliefs, strongly suggesting that anyone who claims to know the answers is a fraud. Indeed, if there is an absolute truth, there is no way of knowing it, much less knowing we know it.

The pic to illustrate Famous is a section from one of Warhol’s “Marilyn” prints. Marilyn’s iconic status made her a no-brainer to lead the parade of notables chronicled here. Warhol’s recasting of the same image in different shades suggests that what you see is not necessarily what you get.

The pic to illustrate Stories is Fragonard’s “A Young Girl Reading.” We all have stories in us. What the woman in the illustration does after reading one of them is up to her. Who knows, once she gets out of that chair.

The pic to illustrate Populations is a close-up of Bruegel’s “The Wedding Dance.” We may be one, but each one of us is also unique.

The pic to illustrate Relationships is a close-up of Klimt’s “The Kiss.” Ah, the possibilities. Alas, the ambiguities.


I cordially invite you to check out mcmanweb.

Tuesday, December 13, 2011

WTF?: How Hitler Ran Amok, Mao Died in Bed, and Your Jerk Brother Ruined Your Thanksgiving

Okay, I know what you’re thinking. But first let me recap:

In six pieces, we have investigated evil, using Barbara Oakley’s 2007 “Evil Genes” as our source. Dr Oakley noted that the type of people who specialize in making your life miserable are best described as Machiavellian, what she calls the “successfully sinister.” These may range from Hitler to the family jerk who ruins everyone’s Thanksgiving.

In her book, Dr Oakley laid out an impressive array of brain science to illustrate that all of us have far less dominion over our thoughts and actions than our over-inflated egos would lead us to believe. Someone whose brain is wired to over-react to stressful situations, for instance, is going to behave a lot differently than someone who isn’t. On and on it goes.

But there is no such thing as a bad gene or a good gene. What may be maladaptive in one environment can be supremely advantageous in another. And so in every generation we find our next crop of Ivan the Terribles, out for themselves at the expense of everyone else. Dr Oakley describes these individuals as “borderpaths,” combining traits of psycho/sociopathy and borderline, with elements of narcissism and paranoia thrown in.

Fine, you may say. That may explain what makes these individuals tick, but what about us - their victims? What is it about us that allows them to get away with it - again and again and again? There will always be another Hitler, and you know that your next Thanksgiving is going to be as miserable as your last one.

What is going on here? As Dr Oakley makes clear, these individuals are successful for a reason. To a person, they are virtuoso manipulators, cut-throats, and con artists. If life is a game of chess, they are four moves ahead. You - what Dr Oakley loosely describes as the altruistic - never see it coming. Eventually we may get smart, but not before the damage is done.

Fine, but we’re talking about Hitler, here. Not to mention Stalin, Mao, Pol Pot, Milosevic, and a host of others. What gives?

Explanation Number One: Political and economic and social upheaval.

People are miserable, the rules suddenly change - anything goes. Hitler and Stalin and Mao were born when royalty still ran their respective countries. They came of age in a time when the lunatics were taking over the asylum, awaiting the inevitable lunatic-in-chief.

Explanation Number Two: Exploiting people’s fears and resentments is as easy as shooting fish in a barrel.

It’s the oldest and most pernicious con game in the world. Hitler was a master at it. Milosevic worked off of Hitler’s playbook. Pick a scapegoat - any scapegoat - shake and bake and serve at room temperature.

Explanation Number Three: Mob mentality.

I picked up some insights into this in my previous life as a financial journalist covering a boom-and-bust cycle in New Zealand and Australia back in the 1980s. If enough people start saying night is day long enough and loud enough, eventually even people with brains start believing it. Soon, everyone is driving in the dark with their lights off.

Explanation Number Four: People will believe what they want to believe, regardless of the facts.

This has been a recurring theme here at Knowledge is Necessity. Conspiracy theorists specialize in this sort of thing, but none of us is immune. We all tend to rationalize away inconvenient facts we don’t like, while pouncing on the tiniest shred of innuendo as holy writ. This comes back to Oakley’s theme that our capacity to think things through is vastly over-rated.

Explanation Number Five: Absolute power corrupts absolutely.

Give someone the position of Fuhrer-For-Life and wait for Armageddon to visit your neighborhood.

Wrapping it Up

No doubt I left a lot of stuff out, but this should get us started.

More to come ...

Monday, December 12, 2011

Hitler on the Couch

As I promised last week, a study on Hitler. My starting point was Nassir Ghaemi’s recent “A First-Rate Madness,” which raised the extraordinary proposition that Hitler was far more “normal” than we give him credit for. What Ghaemi was driving at was that evil is not the exclusive domain of people with twisted minds. Perfectly normal individuals are just as capable of gross inhumanity - or, for that matter, of being royal pains in the ass.

In a piece in August, Reckoning with Evil, I laid out Ghaemi’s position, namely that until 1937 Hitler’s bipolar (his depressions and manias are well-documented) “seemed manageable.” Moreover, his hypomania appeared to benefit him in a way that influenced his rise to power, “fueling his charisma, his resilience, and political creativity.”

Then a quack physician put him on a cocktail of amphetamines and barbiturates that turned him into a raving maniac. Next thing, he was invading Poland. By 1943, he was receiving multiple daily injections, totally out of touch with reality and impossible to get along with.

Ghaemi’s analysis called for an alternative viewpoint, which sent me to Barbara Oakley’s “Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed and My Sister Stole My Mother's Boyfriend.” As I reported in five previous pieces, Dr Oakley sees “borderpath” tendencies as the driving force of Machiavellian personalities, what she terms the “successfully sinister.” Thus, a supreme Machiavellian such as Chairman Mao - responsible for more than 70 million deaths - deployed a vast range of psychopathic/sociopathic, borderline, narcissistic, and paranoid traits to his considerable advantage, managing to die in bed at age 82, venerated as a God-figure.

Dr Oakley sees Hitler cut from similar cloth. Her main source is an OSS analysis prepared by a leading Freudian psychoanalyst, Walter Langer, during World War II. Dr Langer’s research was exhaustive, totaling 11,000 pages, and from this he created a criminal profile that is still regarded as authoritative.

Dr Langer characterizes Hitler as a “neurotic psychopath.” Ghaemi in a footnote takes issue with this diagnosis (his only reference to Langer), though it is clear the label is only a starting point. A quick Google search turned up an excellent piece, Getting Inside Hitler’s Head, by military journalist Brian John Murphy, and it is instructive to go off his account ...

As a child, Hitler learned how to manipulate his mother by staging temper tantrums until she caved in. Hitler carried over the same behavior into adulthood. His screaming, raging fits were the stuff of legend, and throughout his career he was able to deploy these outbursts to his advantage. His public speeches - an extreme departure from standard German oratory - can be viewed as scripted tantrums that bent the masses to his will.

The death of his father when Hitler was 13 appeared to have a lot to do with turning him into an angry young man. Soon after, his performance in school plummeted and later he dropped out. As a down-and-out young man in Vienna, he became a rabid anti-Semite and extreme pan-Germanic xenophobe, unfortunately very “normal” for the time. Soon he found his calling in the trenches on the Western Front.

Hitler’s taste for war may have resulted in two Iron Crosses, but it also completely spooked his superiors, who vowed never to make him an officer. He failed to bond with his fellow soldiers, and avoided women. His later associations with women were characterized by sexual deviances and callous behavior. Six of his former lady friends attempted suicide. Two succeeded.

After the war, Hitler’s bitterness over Germany apparently being sold out by traitors fit right in with the sentiment of the day. In no time, he hit his stride as a political rabble-rouser, deploying his strange charisma, bitter misanthropy, and inexhaustible energy to stunning effect. Along the way, he spied on his socialist-leaning comrades-in-arms in the trenches, and succeeded in getting some of them hanged.

Exhibit A in Hitler’s psychopathy, of course, is Mein Kampf, written in prison following a failed populist uprising where he fired a pistol inside a beer hall. There his pathology is revealed in his own words, not to mention his demented thinking regarding Jews and other non-Aryans. It’s all there, except the “final solution,” and that can easily be inferred. By the time Hitler completed his blood-stained ascendance to total power as Chancellor in 1933, there was nothing standing in the way. That same year, he spoke with his military leaders about “conquest for Lebensraum” (interpretation: invading Poland). At his first cabinet meeting that year, he prioritized military spending.

Thus, by the time Hitler invaded Poland in 1939, he had a massive and well-equipped army and air force at his disposal, which he had already deployed beginning in 1936 to re-occupy the Rhineland, to support Franco in the Spanish Civil War, and to annex Austria and a piece of Czechoslovakia.

According to Murphy’s piece:

The Hitler Langer profiled was a man with a boundlessly grandiose concept of himself. Langer said Hitler believed fate set him apart as a superman, a chosen one, the messiah of a future German empire, who was infallible except for when he had engaged in what he called “the Jewish Christ-creed with its effeminate pity-ethics.” When crossed, Hitler wanted retribution that was godlike in its devastation.

Dr Oakley in “Evil Genes” pays considerable attention to delusional thinking, a trait common amongst conspiracy theorists, who are capable of maintaining their crackpot beliefs with great conviction in complete defiance of the facts. Hitler, needless to say, could always rationalize as legitimate his every action, no matter how bizarre and contrary to human nature.

Murphy notes that Langer’s analysis was made without reference to Hitler’s massive methamphetamine consumption, which only came to light after World War II. Clearly, Hitler’s drug cocktail greatly worsened his pathology. According to Murphy:

Witnesses describe the 56-year-old Hitler in 1945 as a shuffling old man wearing a uniform spotted with food and grasping for a handhold every few steps. His left hand trembled violently. Cake crumbs clung to the corners of his mouth. The bags under his eyes were swollen and dark. He drooled. ... By April 1945 he had little left physically or mentally.

So, did Hitler’s quack physician light “a fuse that exploded the entire world,” as Ghaemi maintains, or would Hitler have invaded Poland, anyway? Suppose he had been able to push ahead with his irrational ambitions, but in a far more rational and drug-free state of mind? Would the Nazis have actually won the Second World War?

Very scary thought.

Thursday, December 8, 2011

Chairman Mao: A Portrait in Evil

This is my fifth piece on Barbara Oakley’s eye-opening 2007 “Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed and My Sister Stole My Mother's Boyfriend.” I stumbled into her book after a Google search involving Hitler and sociopathy. What prompted the search was Nassir Ghaemi’s recently published “A First-Rate Madness” that, among many other things, raised the extraordinary proposition that Hitler was far more “normal” than people give him credit for.

Yes, Hitler had a lot of stuff going on, including bipolar, Ghaemi acknowledges, but apparently this son of a Schicklgruber would have been just another Newt Gingrich had not his personal physician in 1937 turned him into the kind of raving meth addict that made invading Poland seem like a good idea.

All this begs the obvious question: What about the nutjob who published his lunatic ravings as “Mein Kampf” in 1926 - while serving a prison term for staging a shoot-out in the vicinity of a beer hall that was part of a crackpot attempt at a populist uprising?

We will save Hitler for another day. It turns out that Oakley’s book had a much better poster boy for her study in evil - Mao Zedong, “the perfect borderpath.” As you recall from yesterday’s piece, Dr Oakley views personality as far too complex to lend itself to easy DSM explanations. Nevertheless, the DSM can serve as a rough guide.

Thus, Dr Oakley sees elements of borderline and psycho/sociopathy (plus generous helpings of narcissism and paranoia) feeding into a take-no-prisoners Machiavellian mindset, what she calls “successfully sinister.” According to Oakley:

Mao was the most Machiavellian leader of the many Machiavellian leaders of the twentieth century. For three decades, he held absolute power over the lives of one-quarter of the world’s population.

To give a sense of perspective: All the wars of the world from 1900 to 1987 resulted in 34 million combat dead. Mao murdered twice as many.

As a boy, Mao rebelled against his teachers and staged highly manipulative showdowns with his father - unheard of behavior in traditional Chinese society. The pattern continued into adulthood with a succession of wives, chronic womanizing, and the neglect and cruelty he visited upon his children. As an aspiring revolutionary, he was ousted from the Communist ranks six times for his inability to play well with others.

The following appeared in one party circular:

He is extremely devious and sly, selfish and full of megalomania. To his comrades, he orders them around, frightens them with charges of crimes, and victimizes them ... His customary method regarding comrades ... is to use them as his personal tools.

Mao had the last word. Ultimately, he had his critics tortured to death.

His manipulative behavior continued as “Great Helmsman,” playing off members of his inner circle against each other and bringing aboard new sycophants. Li Zhisui, Mao’s doctor and longtime associate, described him as “devoid of human feeling, incapable of love, friendship, or warmth.”

Li recounts sitting next to Mao at a performance. A young acrobat slipped and was seriously injured. The crowd was aghast, but Mao continued talking and laughing with no show of concern. There were occasions when Mao expressed sympathy, but according to Oakley, he lacked true empathy, the ability to put himself in the shoes of others.

But to write off Mao as a garden variety sociopath is far too simplistic. Dr Oakley contends a lot of borderline stuff was going on, as well, including wild mood swings and lack of impulse control, not to mention lack of continuity with his own identity. In all probability, Mao did not even believe in Communism. As he said of himself: “My words and my deeds are inconsistent.” Speaking with a forked tongue is normal in politicians, but Dr Oakley maintains Mao took it to pathological levels, such as admiring America in private while vilifying it in public. Observes Oakley:

There is no evidence, for example, that British prime minister Winston Churchill secretly admired the Nazis or despised Roosevelt.

Complicating matters was a heavy addiction to barbiturates, which may have exacerbated his underlying pathologies. He was also addicted to sex, in any form, possibly as a comfort from psychic pain. Hypocritically, Mao required his own people to endure ultra-puritanical constraints.

Meanwhile, Mao launched his country on a ruinous course with one daft economic enterprise after another. Thirty million peasants died in the famine that followed his “Great Leap Forward” of 1958-60. Mao’s response was to pretend it never happened. This type of “magical thinking” was a trademark of Mao’s behavior. The trait is identified with schizotypal personality disorder (schizophrenia lite). Mao’s second son had full-blown schizophrenia.

Another schizophrenia connection was his paranoia, which most likely served him well in his rise to power. Those who found themselves on his wrong side were decidedly less lucky.

Mao’s “Cultural Revolution” beginning in 1966 resulted in at least three million dead and the persecution of another one hundred million. As opposed to Stalin, who conducted his crimes against humanity mostly in secret, Mao made a spectacle of his personal reign of terror, delighting in the public torture and execution of his victims.

For all this, Mao was a charmer, a trait he shared with Stalin and other dictators. Another trait in common was his own mystical notion of his role as leader and messiah, fed by a brand of narcissism that morally justified doing whatever he thought right, no matter how wrong.

Mao died in bed in 1976 at age 82, venerated as a God-figure while leaving his country destitute and in shambles. How, you may ask, could one man get away with wreaking such havoc? The only explanation that remotely makes sense is that time and place and circumstances created Mao, just as similar conditions had spawned Hitler, Stalin, Mussolini, and all the rest. Social, political, and economic chaos gave Mao his head start. And once he gained absolute power, no force remained to stop him.

One could argue that in a more stable society, Mao’s outrageous psychopathy would have taken him out of the game at a very early age, but Dr Oakley reminds us that Mao was the ultimate Machiavellian, one inclined toward success. Thus:

In a capitalistic economic structure, Mao might have made his way to the top of a business enterprise. There, like a surprising number of managers today, he would have run roughshod over colleagues and subordinates while devising unreasonable programs even as he took out anyone who objected.

In politics, says Dr Oakley, an American-born Mao might have become a populist demagogue in the 1930s Huey Long mold (I will leave the obvious contemporary examples to others), rising to a high level of electoral success, but saddled with the major inconveniences of a free press and checks and balances.

Lest we congratulate ourselves on how Mao-proof our democracy is, the mini-Maos in our midst did a splendid job in running amok through the first decade of this millennium, thereby bringing the entire world economy to the brink of collapse in 2008. Ironically, the US was bailed out - at least temporarily - by post-Mao China. Scary thought ...


This is the fifth in a series based on Barbara Oakley's book, "Evil Genes." Previous pieces:

Figuring Out Evil
Figuring Out Behavior
Brain Science and Recovery
Why Evil Works

Wednesday, December 7, 2011

Why Evil Works

What does Mao Zedong have to do with that jerk brother or sister who wrecks everyone’s Thanksgiving? Funny you should ask. Barbara Oakley’s 2007 “Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed and My Sister Stole My Mother's Boyfriend” masterfully connects the dots.

Her starting point is “the successfully sinister.” As I explained in an earlier piece, Figuring Out Evil, these are your classic Machiavellians - charismatic and ruthless - out for themselves at the expense of anyone unfortunate enough to happen to breathe the same air. The “high-Machs” have long been known to correspond to sociopathy, but not all sociopaths wind up in prison.

No, the successfully sinister have a way of ending up in far more desirable places. We are led to believe the cream finds its way to the surface, but at the end of the day we find the scum also rises. But don’t pin the rap on pure sociopathy. It seems high-Machs have an equally high correspondence to borderline personality disorder.

The two conditions overlap, but in key areas they are in diametric opposition. While sociopaths have no problems with their inflated sense of self (so much so that sociopathy is easy to confuse with narcissism), those with borderline tend to suffer from a breakdown in personal identity. But it’s not a case of one or the other. Dr Oakley steers clear of over-reliance on DSM labels, but is comfortable in their use as a rough guide.

Your boss from hell may lean toward sociopathy, for instance, with an assist from borderline, and a bit of paranoia thrown in. Someone else may major in borderline and minor in sociopathy, with some extra credits in narcissism. In previous pieces, Figuring Out Behavior and Brain Science and Recovery, I mentioned how Dr Oakley and I rely on the same brain science (in particular the work of Daniel Weinberger of the NIMH) in support of the proposition that our genes are not coded with the DSM in mind.

The same brain science also makes it abundantly clear that we are not as in control of our thinking and behavior as our over-inflated egos would have us believe. Someone who is genetically wired to over-react to stressful events, for instance, is prone to act a lot differently in social situations than those who are not. This is neither good nor bad in and of itself. Indeed, panic is often the appropriate response. But when we need to dismantle a bomb or talk our way through airport security, cool as a cucumber is desirable.

Cool as a cucumber also works when sticking a knife in your best friend’s back.

Now multiply all the complicating factors by infinity. Certain genetic tendencies and environmental influences may either mitigate or amplify other ones. Different parts of the brain may be over-communicating or under-communicating with each other. It may be this way in that individual or that way in this individual. We’re a long way from understanding evil, or for that matter just plain being an asshole, but we all know what being victimized is like.

According to Dr Oakley, it all may have started in earnest about 10,000 years ago with the introduction of agriculture. Prior to that, there was little advantage in gaming the system. But with the beginning of densely populated permanent settlements and sophisticated social structures came unlimited opportunities to climb to the top on the backs of others, with disproportionate benefits accruing to the successful.

The trusting and naive majority were no match for this new breed of Machiavellian. From their new positions of wealth and power, the successfully sinister acquired the opportunity to breed in vast numbers and thus pass on their genes.

But there was a major catch. As the ranks of the successfully sinister grew, there were fewer altruists left to prey upon. Moreover, the remaining altruists had picked up their own new set of coping mechanisms. Dr Oakley compares these back and forth shifts to a Darwinian arms race. Indeed, our higher cortical regions with their highly developed social software may have evolved eons earlier as protection against our scheming and opportunistic fellow humans.

Meanwhile, altruism carries its own selective advantages. Not everyone wants to mate with a Visigoth, for instance, much less have one in the neighborhood. The altruists (I’m using the term very loosely, here) may eventually gain the upper hand, but in the process they wind up sowing the seeds of a new generation of victims - ripe for plucking by a yet more sophisticated breed of the sinister. On and on it goes.

Fast forward to the chaos of the twentieth century and the type of environment the Emperor Caligula would have felt very much at home in. Hitler and Stalin certainly did. So did Mao.

More to come ...


This is the fourth in a series based on Barbara Oakley's book, "Evil Genes." Previous pieces:

Figuring Out Evil
Figuring Out Behavior
Brain Science and Recovery

Monday, December 5, 2011

Is There An RX for the Over-Prescription Epidemic?

Do we have an over-prescription epidemic? Here’s a snippet from a piece I wrote in 2002:

Paramijit Joshi MD, Chair of Psychiatry and Behavioral Sciences at the National Children's Medical Center in Washington DC, told the gathering she gets kids aged four and five on four or five medications. "I'm spending more time taking kids off medications than putting them on, as I don't know what I'm treating," she related.

And here’s an extract from something I wrote in 2003:

Gary Sachs MD of Harvard and principal investigator of STEP-BD reported on some early data, including the fact that patients entering the program were being treated with an average of 4.2 meds. Five percent were on eight meds or more and four percent were on 10 meds or more, leading him to comment on "exotic polypharmacy." Less than 20 percent were on just one drug.

Last month, a survey conducted by Medco reported that more than one in five adult Americans - one in four women - “took at least one medication commonly used to treat a psychiatric or behavioral disorder in 2010.”

Have our doctors gone insane? No, this isn’t an anti-meds post. I think just about all of us are for getting the right person on the right med to treat the right condition in the right situation, but we also know that the wisest call a doctor may make is to NOT prescribe a med.

This is a huge trust issue. The other day, “Pat” posted this on my mcmanweb site:

You have just confirmed every suspicion that I have ever had about the majority of psychiatrists being deluded, intellectually lazy and egomaniacal. My decision to get better without their help has just been reconfirmed, thank you. If they don't like that, maybe they can take my meds to help them feel better.

The piece she was commenting on was entitled, The Problem Clinician, the first of a three-part series describing my first (and last) grand rounds, delivered in 2008. It’s a story you have heard many times on this blog. The topic for my talk was meds compliance. Just sending patients out the door with a prescription is not treatment, I told the clinicians in attendance.

And - oh yes - when we tell you that we don’t enjoy being turned into fat stupid zombie eunuchs on the meds you prescribe and over-prescribe we’re not doing this to ruin your day.

I didn’t say it outright, but my intent was clear, namely: If you actually displayed a willingness to work with us in finding a smart meds strategy, there wouldn’t be major issues with compliance.

My audience showed their appreciation by clearing the room the second my lips stopped moving.

So here’s Pat, reading my article, making her own decision. I can only hope it’s the right one. It’s a shame, I thought, that her choice has to be all-or-nothing. Yes, we all want to be off of our meds. But why should our docs be out of the picture? Shouldn’t they be working with us to help us achieve our goal? Or at least come close to it?

Gianna Kali writing on her Beyond Meds blog was clearly thinking along similar lines. In her latest piece, A Plea to Prescribing Physicians and Psychiatrists: Please Help Us Heal, she reports on an article in the Irish Examiner in which a prominent psychiatrist disclosed that 60-80 percent of his work is helping people slowly get off drugs.

Taking her lead from this article, Gianna begins her plea:

The fact is there is a huge niche opening up for psychiatrists and other prescribing physicians who want to take the opportunity. People want and desperately need COMPETENT professional help in coming off of psychiatric drugs. We need prescribers to make the transition easier. ...

She goes on to say:

Many people come off meds with relative ease. Some of us, though, become crippled with iatrogenic illness. You will need to educate yourselves. Once you start making it be known that you can help — those of us who’ve been seriously and gravely harmed will start appearing on your doorstep. ...

She concludes with:

Please, it’s time that doctors learn how to help us. Some of you have unintentionally helped create the iatrogenesis that is now limiting our lives so much more than any “mental illness” ever did. Please start helping us heal now. We need you.

Yes, we do. But will they listen to us? I keep flashing back to all those clinicians bolting for the exits three-and-a-half years ago. A lot has to change. We have a long way to go.

Saturday, December 3, 2011

See the Man with the Stage Fright

The life of a writer is much like that of a weather observer at Advance Base in Antarctica in the winter of 1934, but without the snow and all the social distractions. The reference is to Admiral Richard Byrd, who spent five months completely on his own in the deepest of all deep souths. His chronicle of the experience, Alone, is a classic in the psychology of social isolation. I read it when I was about ten and even back then I could totally relate.

Ten-year-olds with books and real heroes - hold that thought.

Anyway, there are just enough of us loners in the world that we can be regarded as borderline normal. What is weird are the abrupt transitions into the rough and tumble of people contact.

Bang! Suddenly, on Wednesday, I needed to be on my game - not in sweat pants and a four-day beard - with a full array of social skills. The occasion was a NAMI San Diego board meeting, which I managed to get through in one piece. The next night was the real challenge. This was our annual business meeting and holiday potluck. It wasn’t just that I had to socialize. I was also responsible for entertaining a crowd of 50 people for 30 minutes.

But first let me tell you about my boneless lasagna. I like to cook, and the lasagna I brought to the potluck was a killer. As I was reheating it in the kitchen area at the event venue, I happened to joke that I had picked all the bones out of it. By the time I set it down on the buffet table, everyone knew about my boneless lasagna.

Fortunately, the running joke went over well. The lasagna was also a hit.

Social situations used to terrify me. Now they merely pose an extreme challenge. In a few minutes I would be joining Lisa, our board president, on stage. We were running a quiz night, and I would be taking the lead. I was working off of a script of sorts, but I was also flying by the seat of my pants. We had pulled off a similar event last year, and the people there had witnessed a miracle without even realizing it. If only they knew the real me.

I stepped onto the stage without stumbling. Seven or eight tables had their quiz folders. I introduced myself and explained the rules. This was the difficult part. The audience had yet to settle down and a lot of cross-talking was going on.

“The prize,” someone - maybe Lisa - whispered to me. That’s right, the prize. This was my cue to have Annie, our events and development manager, hold up Walter Isaacson’s new bio of Steve Jobs.

I launched into my first question, which had to do with the rabbit people in NAMI San Diego. Ah! A NAMI San Diego round. People were getting it. On the second question, I found my stride. “The ukulele is catching on worldwide as the ultimate cool instrument,” I opened. “Name our NAMI San Diego ukulele phenomenon.”

Most people knew who that was - Devin - a very likable and committed young man. “Devin embodies what I love about NAMI San Diego,” I went on to say. Suddenly, I had their attention. “We often come to NAMI in need,” I continued. “Then we give back and volunteer. And some of our volunteers wind up working on our staff.” I saw the nodding heads. “It’s a circle of life,” I blurted out. Somehow, the Lion King reference worked.

The crowd was into the game. One of my questions had blue moon as an answer. Instantly, one of the tables in front - Team Darwin, I think - burst into a spontaneous rendition of the classic Rodgers and Hart ditty. A ten-year-old girl on the team - the brains of the operation - burbled merrily along. The crowd cheered. I looked to Lisa. “Score an extra point for this table,” I commanded.

That extra point figures mightily in this story.

Call me butter, ‘cuz I was on a roll. I pointed to Team Einstein, with two MDs and a lawyer at the table. “Don’t worry about these guys,” I advised the others. “They over-think everything.”

All too soon, it was time to wrap it up. Lisa tabulated the scores. We had a tie between the two front tables. Team Darwin (the Blue Moon people with the ten-year-old brains of the operation) - and the other guys. There was the girl. There was the Steve Jobs book on display right next to me.

Could we just declare Team Darwin the winner? I asked. No, the girl’s mom insisted we do it fair and square. Time for the tie-breaker question.  I flipped through my sheets of paper. “Movies and TV,” I announced. I scanned down the questions:

“Which actress starred in Breakfast at Tiffany’s?” No, I decided, not that one.

“What was Marilyn Monroe’s real first name?” Not that one, either.

Then my face lit up. Here we go, I announced. “Name Snow White’s Seven Dwarfs.”

For the rest of my life, I will never forget the girl’s reaction. Her face beamed a thousand watts as she levitated from her chair. Then she grabbed a pen and started scribbling furiously.

Then came the moment of truth. How many dwarfs did you get? I asked Team Darwin. Five, came the response.

Oh-oh. That didn’t sound like enough.

How many? I asked the other table. Five, they answered.

Whew! My team had dodged a bullet.

Did I dare risk another tie-breaker question? I hesitated, then I knew exactly what to do. “Team Darwin has fewer people,” I announced. “Therefore, they have more brain power per population.” Therefore - this time the girl's mom showed no resistance - “I declare Team Darwin the winner!”

I gestured toward the girl. “Come and get your prize,” I said holding up the book.

Really, I should get out among people more often.


Your brow is sweatin' and your mouth gets dry,

Fancy people go driftin' by.

The moment of truth is right at hand,

Just one more nightmare you can stand.

See the man with the stage fright

Just standin' up there to give it all his might.

And he got caught in the spotlight,

But when we get to the end

He wants to start all over again ...
- The Band

Friday, December 2, 2011

The Blood-Brain Barrier

I just finished rewriting my mcmanweb article on the blood-brain barrier, which I first published in 2003. Here is the new version in full ...

You may have heard of "antisense" therapy. The idea is to synthesize strands of RNA that bind to disease-causing strands of messenger RNA and thus stop them dead in their tracks. The technology is being researched for cancer and other diseases. Imagine being able to switch off depression before it happens.

The catch? There are many, but the big one is that any drug which targets our gray matter must first cross the blood-brain barrier (BBB). The name conjures up a kind of cross between the Berlin Wall and a coffee filter, but in fact refers to nearly 400 miles of narrow capillaries throughout the brain, all lined with tightly packed endothelial cells that are exceedingly selective about what gets through. Endothelial cells also line the capillaries elsewhere in the body, but there they are spaced loosely enough to pose no difficulty.

The BBB protects the brain internally much as the skull protects it externally. The problem is the BBB does not discriminate between harmful and helpful chemicals. Life-saving drugs, if they happen to be the wrong kind of molecule, simply won't get through. With very few exceptions, only small molecules soluble in fat clear the barrier - and only about two percent of small molecules at that. These include alcohol, caffeine, and nicotine.

Small-molecule compounds have been used to treat affective disorders, schizophrenia, chronic pain, and epilepsy, but they leave a lot to be desired. The problem, says William Pardridge MD of UCLA, writing in the Jan 2003 Archives of Neurology, is that "small molecules are largely palliative medicines with often unfavorable safety profiles."

There are no chronic diseases, other than infectious diseases, that are cured by small-molecule drug therapy.

Large-molecule drugs have the potential to cure patients with neurological disorders, he notes, but none of them can cross the BBB. The following paragraph is worth quoting in full:

Despite the importance of the BBB to neuropathic agents, this area is underdeveloped in the neurosciences. To my knowledge, no pharmaceutical company in the world has a BBB drug delivery program! It is not unusual for an entire conference to be convened on a given neurologic disorder (eg brain tumors), with no discussion of targeting drugs through the BBB.

Writing six years later, in the Sept 2009 Alzheimer's & Dementia, Dr Pardridge points out that "even if Big Pharma wanted to start a BBB drug targeting program, there would be few personnel trained in the BBB to hire, because no academic neuroscience program in the US emphasizes BBB transport biology, much less BBB drug targeting."

Moreover, Dr Pardridge notes that clinical trial failures are attributed to the test drug rather than to the obvious possibility that the drug might not even be getting through to the brain. Further development of the drug - and even all CNS drug development - is abandoned. We all suffer.

(The absurd alternative is "trans-cranial brain drug delivery," a euphemism for invasive brain surgery to deposit the wonder drug.)

Dopamine is a small molecule, but its chemical structure prevents it from crossing the BBB. Its precursor, L-DOPA, however, can hitch a ride on a certain type of amino acid transporter and sneak through the BBB, Trojan Horse style.

"Preparation of Trojan Horse liposomes for gene transfer across the blood-brain barrier," reads the title of a 2010 article authored by Dr Pardridge.

Dr Pardridge and his team have been working on encasing genes in fatty spheres called liposomes, which are coated with a special polymer, to which certain antibodies are attached. The antibodies trick the brain-capillary receptors into letting the liposomes pass, where they can deliver their payload to brain cells.

In one set of experiments, which induced Parkinson's symptoms in rats, Dr Pardridge's team injected the rats with liposomes containing a gene that boosts production of the enzyme tyrosine hydroxylase, which catalyzes a key step in dopamine synthesis. Three days later, the rats' abnormal movements were reduced by 70 percent.

Another set of experiments doubled the lifespans of rats with brain tumors. Weekly injections resulted in the successful delivery of antisense RNA, which blocked production of a malignant growth factor.

Ah! Right back where we started, with antisense therapy!

In a 2005 article in NeuroRx, Dr Pardridge discusses the possibilities of all manner of antisense and peptide molecular Trojan Horses aimed at various targets. But with mental illness there is a major catch, namely: where are those targets? Consider: we know, for instance, that amyloid plaque is associated with Alzheimer's, and that Alzheimer's research is directed at busting up these plaques. But where is the equivalent of amyloid plaque for depression or mania or anxiety?

Hopefully, Alzheimer's, brain tumors, and other conditions with clearly identifiable targets (and thus foreseeable treatments) will drive a vast increase in funding for research into BBB drug-delivery technology. That is certainly Dr Pardridge's wish. And dare we hope? Imagine, for instance, a ready-made molecular Trojan Horse that could be customized to deliver an antisense agent capable of shutting down an overreactive stress response before it happens.

Maybe we're just dreaming. But, oh, the possibilities ...

Sunday, November 27, 2011

Rerun: Mozart, Genius, and Practice-Practice-Practice

A piece in today's NY Times, "Sorry, Strivers: Talent Matters," takes issue with the conventional wisdom, endorsed by NY Times columnist David Brooks, that geniuses are made, not born. My 2009 piece below was faithful to Brooks' interpretation of genius (it's all about practice-practice-practice) while remaining skeptical of the idea that it's ONLY about practice-practice-practice.

Read on ...

Consider Mozart, who wrote his first symphony in utero and performed in his own rock opera at age five months, changing his own diapers (admittedly with mixed results) between acts. Clearly this is genius personified.

Not so fast, writes NY Times columnist David Brooks. Those early compositions of his were strictly kid stuff, and his performing skills as a child prodigy are highly overrated. The Mozart you encounter in concert and opera halls is the product of an adult mind honed to a fine creative edge through years and years of unstinting effort.

Writes Brooks:

“What Mozart had, we now believe, was the same thing Tiger Woods had - the ability to focus for long periods of time and a father intent on improving his skills.”

Rather than some mystical divine spark or high IQ, genius may be as mundane as practice-practice-practice. Citing two new books - “The Talent Code” by Daniel Coyle and “Talent is Overrated” by Geoff Colvin - Brooks says it helps to have some kind of adult role model as a kid, say a novelist living in your town. Then you might dare imagine yourself writing your own masterwork. Armed with this ambition, you would start reading novels and literary biographies and thus attain a core knowledge of the field.

Mind you, it doesn't hurt if you have a bit more going for you than Lennie in "Of Mice and Men."

Anyway, here you are - somewhere north of Lennie and south of Einstein - slowly building up your body of knowledge. Next thing, you're engaging in the intellectual equivalent of playing with your food, moving ideas around, divining patterns (excellent for the memory), and otherwise thinking like a novelist.

Then practice-practice-practice until your mind turns labored conscious skills into effortless unconscious ones. But the mind is sloppy, Brooks advises, and tends to settle for good enough. So, you practice your routines slowly. You break down your efforts into tiny parts and repeat-repeat-repeat until the brain internalizes a better pattern of performance.

At the right time, a mentor steps in who provides feedback, corrects your tiniest errors, and pushes you to tougher challenges. By now, your brain is programmed to understand and solve future problems.

According to Brooks, the primary trait is not genius. Rather, “it is the ability to develop a deliberate, strenuous and boring practice routine.” The hard wiring of our genes plays a part, but Brooks concludes, “the brain is also phenomenally plastic. We construct ourselves through behavior. As Coyle observes, it’s not who you are, it’s what you do.”

So back to Mozart. According to critics, as reported in Wikipedia, Mozart composed his "breakthrough work," his Ninth Piano Concerto, when he was 21. The concerto has been assigned a Köchel listing of 271, which implies a vast body of work that fell short before the composer hit his stride. Practice-practice-practice.

But for Mozart, good enough was not good enough. After forming a friendship with Franz Joseph Haydn and developing an appreciation for the Baroque masters, Mozart did the equivalent of changing his golf swing, which set the stage for the transcendent pieces by which we know him best.

"The Marriage of Figaro", "Jupiter Symphony", and his "Requiem" - among many others - are the work of a man in his thirties.

In short, geniuses are made, not born. Or are they? Certainly others have labored as long and hard as Mozart only to become industrious drudges lacking that - ahem - divine spark. Think Salieri.

So why don't we forget about outcome - we can't control whether we will end up geniuses or not. But we can control process - the art of constantly challenging and reinventing ourselves through practice-practice-practice. Do we have it in us to become Mozart? Who knows? Can we fashion our modest talents into something more formidable? Chances are you're doing it right now.

In today's NY Times piece, the authors cite a study that tracked intellectually gifted kids into adulthood. According to the authors:

The remarkable finding of their study is that, compared with the participants who were “only” in the 99.1 percentile for intellectual ability at age 12, those who were in the 99.9 percentile — the profoundly gifted — were between three and five times more likely to go on to earn a doctorate, secure a patent, publish an article in a scientific journal or publish a literary work. A high level of intellectual ability gives you an enormous real-world advantage.

Tuesday, November 22, 2011

Rerun: Creativity - We Are Killing It Off; Hopefully There Will Be Enough Creative Thinkers Left To Rescue Us From the Disaster We Are Headed Into

From August of last year ...

"I excelled at every subject just for the purpose of excelling, not learning. And quite frankly, now I'm scared."

High school valedictorian Erica Goldson had the guts to speak out. In an address to her graduating class, she spelled it out in a way that even the dullest teacher in the audience could comprehend, if not accept:

Between these cinderblock walls, we are all expected to be the same. We are trained to ace every standardized test, and those who deviate and see light through a different lens are worthless to the scheme of public education, and therefore viewed with contempt.

She went on to say:

And now here I am in a world guided by fear, a world suppressing the uniqueness that lies inside each of us, a world where we can either acquiesce to the inhuman nonsense of corporatism and materialism or insist on change. We are not enlivened by an educational system that clandestinely sets us up for jobs that could be automated, for work that need not be done, for enslavement without fervency for meaningful achievement. We have no choices in life when money is our motivational force. Our motivational force ought to be passion, but this is lost from the moment we step into a system that trains us, rather than inspires us.

Coincidentally, last month Newsweek ran a cover feature, The Creativity Crisis, that reported that for the first time, measures of creativity in US school kids are way down. The implications are enormous. As Newsweek points out:

The potential consequences are sweeping. The necessity of human ingenuity is undisputed. A recent IBM poll of 1,500 CEOs identified creativity as the No. 1 “leadership competency” of the future. Yet it’s not just about sustaining our nation’s economic growth. All around us are matters of national and international importance that are crying out for creative solutions, from saving the Gulf of Mexico to bringing peace to Afghanistan to delivering health care. Such solutions emerge from a healthy marketplace of ideas, sustained by a populace constantly contributing original ideas and receptive to the ideas of others.

Ironically, as the rest of the world moves beyond the old "drill and kill" method of learning, the US is heading in precisely the opposite direction, prepping students to ace standardized tests. Meanwhile, arts in the schools have been liquidated.

Creativity is not just about the arts. It's about generating original ideas, across all fields of endeavor. The creative process involves both "divergent" and "convergent" thinking. In the divergent phase, the brain is, in effect, roaming the library stacks, gathering up books by the armload. A strong body of research suggests that creative individuals may have brains that are less efficient at filtering out incoming information.

But we are easily overwhelmed by too much information, not to mention sensory input and emotion - which may explain much of mental illness. This is where the convergent phase comes in. The brain becomes ruthlessly efficient in weeding out irrelevancies and focusing only on the facts that matter. Finally, the brain needs to find associations between apparently unrelated facts and ideas to come up with an original solution.

In his outstanding 2009 book, "How We Decide," science writer Jonah Lehrer reports on what was going on in the minds of the flight crew of United Airlines Flight 232 from Denver to Chicago when one fine day over Iowa in 1989, the rear engine of their DC-10 exploded and took out all three hydraulic systems.

Without a functioning hydraulic system, Captain Al Haynes had no control of his plane. UA 232 was on the verge of flipping into a death spiral. Emergency procedures never anticipated a total loss of hydraulics. The manual had not provided for this contingency. The experts on the ground had no answers. Haynes and his crew were totally on their own.

As Lehrer reports, the first remarkable thing to happen was that Haynes and his crew fought back their panic. Haynes then did a mental scan of all the cockpit controls he could operate without hydraulic pressure. The list was a short one, and only one item on it was useful - the thrust levers. But you couldn't steer a plane with thrust levers.

Or could you?

Haynes' DC-10 had two working engines. If he idled one while boosting the other - what they call differential thrust - in theory he could steer the plane. It was a crazy idea. No one had ever thought of it before, much less tried it. Lehrer notes that the pilots dealt with potential information overload only by focusing on the most necessary bits of data:

For instance, once Haynes realized that he could control only the throttle levers - everything else in the cockpit was virtually useless - he immediately zeroed in on the possibility of steering with his engines. He stopped worrying about his ailerons, elevators, and wing flaps.

Meanwhile, inside the brain, the prefrontal cortex took an abstract principle - the physics of engine thrust - and applied it "in an unfamiliar context to come up with an entirely original solution."

The brain at this convergent stage of creative thinking was uncompromisingly disciplined and rational. Without a strong "I" in the cockpit, mentally we are nothing more than flakes and fruitcakes. Likewise, without wide horizons in the divergent stage, we are mere industrious drudges unable to think our way outside of a paper bag.

Haynes and his crew managed to get UA 232 to an emergency landing strip. But they couldn't control the speed of the landing. The plane skidded into a cornfield and shattered into several sections, leaving 112 passengers dead but sparing 184 lives.

Meanwhile, in the US, our school system is gearing us up for a crash landing. It is conditioning our young to become mere takers of tests and uncritical followers of received wisdom. Bound to the past, our next generation may lack the means to think our nation into the future, much less work its way out of the next round of social, political, environmental, and economic jams.

Thank heaven, then, for boat-rockers such as Erica Goldson. "We are the new future and we are not going to let tradition stand," she told her graduating class. "Once educated properly, we will have the power to do anything ... We will not accept anything at face value. We will ask questions, and we will demand truth."

Ah, there is hope.