Tuesday, February 28, 2012

Personal Note

I'm head down, ass up, putting together an ebook I plan to self-publish as a Kindle edition. I'm aiming to get the book out in two weeks. The book is based on a number of my pieces here at Knowledge is Necessity. It will be a humorous memoir with the title, "Raccoons Respect My Piss (But Watch Out For Skunks): My Funny Life on a Planet Not of My Choosing That I Eventually Came to Call Home."

I know - I need to make the title longer.

I will be posting reruns this week, and - of course - updates on my forthcoming book. Thanks for bearing with me, and stay tuned ...

Saturday, February 25, 2012

Rerun: McMan's Dispensable Rules and Observations for Right Living

I've been tied up with a lot of volunteer work and work on new projects. Looking forward to returning to live posting fairly soon. In the meantime, this from April last year ...

My grandson’s birth in Sept 2009 inspired me to come up with two posts along the lines of the clan elder (me) offering his sage advice to the newest member of the tribe. The piece below represents a reshuffle of my original two lists, plus some new stuff. I make no claim to originality (one of my aphorisms is a shameless rehash of Diogenes). My status as a dispenser of wisdom derives from an unparalleled lifetime streak of doing everything wrong. Enjoy ... 

Four Rules for Living with Perspective
  1. Remember, Hannibal never won a battle with his elephants.
  2. Caviar is fine, but peanut butter will always be your friend.
  3. We elude happiness far more than happiness eludes us.
  4. God has a sense of humor. Trust me, every day you will do something to make Him snort milk out His nose.
Four Rules for Making Wise Decisions
  1. The Wise Man knows when to quit while he’s behind.
  2. If you challenge Tiger Woods to a game - make sure it’s not golf.
  3. Ration your hate. Don’t indulge.
  4. When you reach into your pocket searching for a one dollar bill and all you can come up with is twenties - try not to express your disappointment.
Four Rules for Right Conduct
  1. There is no excuse for dancing like a white man.
  2. You are a book responsible for your own cover. Expect people to judge.
  3. We are who we pretend to be. You can’t go wrong pretending to be JFK or Martin Luther King.
  4. If you suck up to the rich and powerful, you won’t have to do your own laundry. If you do your own laundry, you won’t have to suck up to the rich and powerful.
Four Observations About Meaning
  1. Friends are a way better investment than money.
  2. A good poop is way better than mediocre sex.
  3. Our purpose here on earth is to laugh at farts.
  4. There is one constant in life: Ursula Andress will always be the all-time number one Bond Girl.
Four Observations About the Mysteries of Life
  1. Thoreau danced to a different drummer, but he also died a virgin.
  2. Napoleon lost an entire army in North Africa and an entire army in Russia. Still, he had no trouble recruiting volunteers for Waterloo. Go figure.
  3. God has a funny way of treating people He loves most. Just ask Joan of Arc.
  4. The oldest known redwood is 2,200 years old. An idiot with a chainsaw only needs one day.
Four Observations on Reality
  1. If you think you are experiencing God - it’s probably dopamine.
  2. If you think you are experiencing love - it’s probably dopamine.
  3. That doesn’t mean God or love is not real ...
  4. ... but we know dopamine is.

Thursday, February 23, 2012

Rerun: Is Republicanism the New Stupid?

From Sept 2010, still relevant ...

"Republicanism isn't a party. It's a diagnosis." A friend of mine happened to relate that to me in a conversation about a year ago, and I have no reason to dispute it. In fact, we actually have the brain science to lend credence to his statement. The same findings also indict Democrats, though I would contend there are mitigating circumstances. It breaks down like this:

It appears that nearly all of us are wired to register moral outrage, but we have very different on and off buttons. The same event can turn us all into avenging angels of God, but for entirely different reasons. A conservative, for instance, might want to kick a beggar. A liberal would kick the person who kicked the beggar.

Yes, environmental factors loom large, but a 2005 NY Times article brought attention to a Virginia Commonwealth University survey of a large sample of identical and fraternal twins on such divisive issues as taxes, labor unions, and x-rated movies. It turned out the identical twin pairs showed much greater concordance on political and social issues than did their more fractious (and apparently less identical) fraternal counterparts.

We have decades of research to back the proposition that our genetic makeup contributes mightily to our gut-level reactions to all manner of things that go off in the world around us. That same body of research also indicates that our pretenses at reasoned discourse are little more than elaborate justifications for our thoughtless emotional reactions.

In his excellent book, "How We Decide," science writer Jonah Lehrer cites an analysis that found that only 16 percent of voters with "strong party allegiances" during the 1976 US Presidential campaign were persuaded to vote for the other party. In a more recent study, political partisans had their brains scanned as they were read the on-the-record inconsistencies of George W Bush and John Kerry. Predictably, the prefrontal cortices - the seat of reason - were recruited, which should have been a good sign.

For instance, if exposed to the fact that on the same day George Bush promised "to provide the best care for all veterans" his administration cut medical benefits to 164,000 veterans, you might expect a Republican to seriously question his or her cherished beliefs. Or at least register some level of primal disgust.

Instead, the Republicans (and Democrats, too, when exposed to stupid Kerry tricks) felt a rush of pleasurable emotion. What seemed to be happening was that the thinking regions of the brain were activated - not to dispassionately weigh the facts and formulate some kind of rational response - but to fabricate a favorable interpretation of the facts, no matter how unpleasant those facts happened to be.

Thus, when the thinking brain had successfully arrived at "mission accomplished" - that is, a palpably absurd conclusion - the lower regions of the brain slobbered like a dog gorging on red meat.

As Lehrer contends, these and many more studies force us to rethink the long-held notion that reason, judiciously applied, overcomes ignorance and blind instinct. Adolf Hitler proved us all wrong on that count.

Now I know why I regard engaging in any kind of dialogue with a Republican as a total waste of time. I came to this unfortunate conclusion back in the nineties, but it wasn't always this way. Before that, I actually cultivated conservative friends. I also worked in a field (financial journalism) which involved total immersion in conservative opinion.

These individuals strongly influenced me to moderate many of my core beliefs and turned me around completely on my flakier ones. Likewise, I like to think that I exercised a similarly beneficial influence. But in today's highly divisive political climate - the worst in my estimation since the Vietnam era - that simply is not possible. Heaven help me if I were to point out to a Republican that Clinton actually turned federal deficits into federal surpluses.

I'm sure Republicans can make similar complaints, but how can I take them seriously when they cite Sarah Palin or Glenn Beck with approval? Hopefully, we can eventually restore reason to the dialogue. In the meantime - forgive me for my attitude - I have to go along with my friend: Republicanism is a diagnosis.


I've been very busy with other projects and volunteer work, so I beg your indulgence in going with reruns for the next little while.

Monday, February 20, 2012

Rerun - Presidents Day Special: Lincoln and His Depressions

I originally published this as a newsletter piece in 2005 and soon after on mcmanweb. Enjoy ...

The year is 1860. In a makeshift meeting hall, the Illinois delegation to the approaching Republican Convention is meeting to consider which of their own to back as a favorite son for the Presidential nomination. There is no clear-cut favorite. Moreover, it’s widely acknowledged the choice will be an empty gesture. The nomination is virtually a done deal. William Seward of New York, the party’s leading light, has nearly all the delegates he needs for a first ballot victory.

But then something completely unexpected happens. Abraham Lincoln is introduced. A distant relation enters carrying two split log rails. From them hangs a banner:

Abraham Lincoln
The Rail Candidate

The crowd goes wild. The hall shakes so much that the canvas roof flies off the building. The image of a humble rail-splitter is all this group of delegates needs to give Lincoln its enthusiastic backing. The dynamics of the nomination have completely changed. Illinois’ freshly-minted favorite son is on his way to becoming a serious contender.

The meeting breaks up the next day. In the nearly empty hall, a man sits alone, elbows bent, hands pressed to his face. He confides to someone who approaches him, "I’m not feeling too well." The man is Abraham Lincoln. He is battling a crushing depression.

The event is recounted in Joshua Shenk’s outstanding 2005 book, "Lincoln’s Melancholy: How Depression Challenged a President and Fueled His Greatness." Writes Mr Shenk:

Lincoln’s look at that moment – the classic image of gloom – was familiar to everyone who knew him well. … He often wept in public and cited maudlin poetry. He told jokes and stories at odd times – he needed the laughs, he said, for his survival. As a young man he talked of suicide, and as he grew older, he said he saw the world as hard and grim, made that way by fates and forces of God. ‘No element of Mr Lincoln’s character,’ declared his colleague Henry Whitney, ‘was so marked, obvious and ingrained as his mysterious and profound melancholy.’ His law partner, William Herndon, said, ‘His melancholy dripped from him as he walked.’

Mr Shenk relates that depression was a constant throughout Lincoln’s adult life. He never overcame it. He never rose above it. His life was one long unceasing litany of sorrow. At times, he completely gave in to his condition. He would fail to get out of bed. He would behave very strangely. He would alarm his friends and associates.

"I am now the most miserable man living," the 31-year-old Lincoln confessed. "Whether I shall ever be better I can not tell; I awfully forebode I shall not; To remain as I am is impossible; I must die or be better."

But other forces were also at work, Mr Shenk contends. Depression turned him into a hard-headed realist, untainted by the pitfalls of misguided optimism. His uncanny melancholic third eye allowed him to think like a visionary. And even though he was a religious skeptic, his tribulations would imbue him with a higher wisdom and deeper humanity, so much so that he occupies a unique place in history as an American saint.

It is easy to fall into the trap of romanticizing Lincoln, but the facts speak for themselves. As his life unfolds, one cannot help but have the impression of being in a higher presence. It’s almost a religious experience. Mr Shenk makes the experience all the more moving by allowing us to view the great man through the eyes of our illness. The result is both inspirational and heartbreaking. To begin ….

The Early Years

Abraham Lincoln was different from day one. A voracious reader, intellectually curious, and a sensitive individual in a rural environment that only saw merit in physical labor, the young Lincoln was regarded as lazy and in need of discipline.

There was much cause for sadness in Lincoln’s life. His only brother died in infancy. His mother and aunt and uncle succumbed to an epidemic when he was age nine. Ten years later his sister died giving birth to a still-born infant. His father and mother were disposed to melancholy, and one side of the family "was thick with mental disease."

Despite this, young Lincoln made it into adulthood showing few signs of depression. His first major episode coincided with the death of Ann Rutledge in 1835 when he was 26. Lincoln had long since left the family farm to seek his fortune in the one-horse town of New Salem, Illinois. Many historians contend that there must have been a love interest between Rutledge and Lincoln, but Mr Shenk says there is no evidence.

Depression is not as simple as cause and effect, Mr Shenk reminds us, citing a number of psychiatric sources, especially in someone predisposed to the illness. Any number of apparently innocuous occurrences can set off an episode, including several converging at once. According to one account, Lincoln bore up to Ann’s death fairly well. Then came heavy rains that seemed to unnerve him. He took to walking the woods alone with a gun and talking of suicide. Everyone in the village became aware of his strange behavior, and one concerned couple took him in for a week or two.

Finding His Way in the World

By Lincoln’s late twenties, friends and colleagues regarded him as "melancholic." The condition was virtually indistinguishable from the modern conception of depression, but did not carry the same stigma. Back in those days, despite an individual feeling "unmanned" by his affliction, there was considerable leeway for males to express their feelings in public, especially with the Romantic movement entering full flower.

In Lincoln’s case, his sorrowful demeanor induced people to come to his aid.

Nowhere was this more apparent than when the young man turned up to practice law in Springfield, Illinois, with all his worldly possessions in two saddlebags. A store proprietor, Joshua Speed, urged his forlorn customer to take the bags upstairs to his room, and the two became fast friends.

In an age when contact with the opposite sex was severely circumscribed, young men were encouraged "to pair off and form a special bond" as part of their grooming for greater responsibilities. Lincoln and Speed even shared the same bed for four years, but this was fairly common practice not to be mistaken for homosexuality. Nevertheless, gender roles were defined quite differently. It was acceptable for young men to display their affection for one another. This kind of intimacy encouraged the expression of one’s innermost thoughts and feelings, including depression.

Mr Shenk points to a number of forces at work when Lincoln was coming of age. On one hand, it was an age of hope. The new economy for the first time gave ambitious young white men like Lincoln the opportunity to realize the dreams of the Founding Fathers. Steam power and the telegraph effectively shrank the world and created a whole new mobile labor force. Advances in medical science instilled the belief that God was not punishing an individual, which effectively destigmatized illness. This spawned a whole new movement in self-improvement.

At the same time, thanks to a new religious revival, a loving redemptive God replaced the harsh vengeful God of John Calvin. Rather than predestination to hellfire and brimstone, men and women had the power to make moral choices and find their way to God’s favor.

For the first time in history, the individual did not have to subsume his needs to the needs of the tribe or community. But with this new freedom came new fears and anxieties. Gone was the communal security blanket. Ever present was the specter of failure, with full responsibility borne by the exposed individual. America, the land of opportunity, led the world in mental illness.

It was in this heady atmosphere of hope and insecurity that young Lincoln, now a hotshot lawyer and rising star in the state legislature, was to become badly unhinged.

Lincoln's Breakdown

Many historians attribute Lincoln’s depressive episode of the winter of 1840-41 to his breaking off his engagement with Mary Todd. But much more was happening in Lincoln’s life, Mr Shenk points out.

In the legislature, Lincoln had hitched his political wagon to ambitious public works projects designed to open up the hinterlands to economic development. This included an elaborate network of rails, canals, and roads.

Then came the economic depression of 1837. Revenues dried up and the debt exploded. Lincoln used up all his political capital urging the legislature to stay the course, which proved a disaster. By the end of 1840, the state was teetering on the brink of bankruptcy, forcing the abandonment of Lincoln’s beloved projects. The rival Democrats rode into power on the aftermath of the debacle, and Lincoln was cast as one of the scapegoats. He barely held onto his seat in the legislature, his political career virtually finished.

At the same time, he was laboring under a heavy workload as a lawyer, with nine cases before the state supreme court.

As for Mary Todd, the exact time of the break-up is unknown, ruling out a simple cause and effect. Another woman had turned him down, and he may have had an interest in yet another. On top of this, his dear friend Joshua Speed was making plans to move back to Kentucky. Then the weather turned bitterly cold.

In January 1841, Lincoln was confined to his bed, and his condition was the talk of the town. He put himself in the care of a physician, which likely made him much worse. Standard medical treatment involved purging the body by aggressively drawing blood, ingesting mercury and other poisons, inducing vomiting, starving the patient, and plunging him in cold water.

A concerned Joshua Speed told Lincoln that if he did not rally he would die. Lincoln replied he was not afraid to die. Yet, ironically, his perceived failures may have stoked his will to live. He confessed to his friend an "irrepressible desire" to accomplish something before he died that would "redound to the interest of his fellow man."

Some 20 years later, Lincoln would remind his friend of that conversation.

Finding Himself

In late 1842, Lincoln bit the bullet and married Mary Todd. His way of dealing with his depression was by throwing himself into his family and his work, but it wasn’t until he reached his mid-forties that he found a cause that animated him. The Missouri Compromise of 1820 had regulated the extension of slavery in the western territories. In practice, it operated as a containment policy that implicitly recognized slavery’s wrong. Lincoln foresaw slavery’s eventual end, but it was not a process, he believed, that could be speeded up.

That all changed in 1854 with the passage of the Kansas-Nebraska Act. Suddenly the northern territories were in play. Three years later, the Supreme Court’s infamous Dred Scott decision held out the prospect of legalized slavery in the northern states, as well. Slavery was no longer a wrong. It was about to become a universally recognized right. Passions on both sides were awakened, but the situation clearly favored the south.

Lincoln’s melancholia allowed him to see events with preternatural second sight. Southerners with a vested interest in the outcome stood a clear chance of having their way over largely indifferent northerners. It was the thin edge of the wedge that could put an end to free labor markets everywhere and dash the dreams of the Founding Fathers. The clock was being rewound back to the Dark Ages, and Lincoln was not confident of his ability to put a stop to it. Nevertheless, he felt compelled to speak out against the madness, even at the risk of his career.

Paradoxically, his political career took off, though true to melancholic form he saw every slight setback as a major failure. The new political reality spelled the end of Lincoln’s Whig party. In its place stood the newly-formed Republican party. In 1858, Lincoln found himself in the national spotlight in his series of debates with the author of the Kansas-Nebraska Act, Stephen Douglas. Both were contesting the same Senate seat.

The Senate was Lincoln’s lifelong dream. In an era of lackluster Presidents, this was the forum of his heroes such as Daniel Webster and Henry Clay. But Lincoln was prepared to sacrifice his ambitions for the cause. His antislavery position ran ahead of public opinion, but he strongly felt the greater interest was better served by enlightening the voters.

Lincoln also saw ahead to 1860, when Douglas was likely to be the Democratic standard-bearer in the Presidential election. In the debates, he forced his rival to expose himself as too moderate for his southern backers. The next Republican candidate for President, he knew – certainly not he – would benefit.

Lest we mistake Lincoln as morally flawless, he neither viewed African-Americans as biologically equal to whites nor did he envision the two races living together in harmony. The world was a stupid place back then, arguably only slightly more stupid than it is today.

In early 1860, Lincoln traveled to New York to deliver an address at the Cooper Institute. He brilliantly succeeded in linking the dreams of the Founding Fathers to the anti-slavery position, and threw down the gauntlet on right versus wrong. He brought down the house, and achieved rave notices everywhere. No one was quite ready to seriously consider him as Presidential timber. Yet …

Improbably, on the strength of his new-found image as the rail-splitter, Lincoln won his party’s nomination on the third ballot. The election was a shoo-in. Thanks in part to the Lincoln-Douglas debates from two years before, an irreparable schism had formed in the pro-slavery ranks. The Democratic party splintered three ways, allowing Lincoln to win with just 40 percent of the popular vote.

Even more improbably, Lincoln’s well-known melancholia was not seen as a character flaw. Today, the immensity of a Lincoln-sized depression would disqualify a candidate from virtually any elected office save dog-catcher. Back in Lincoln’s time, living successfully with a mental illness was viewed as a character virtue. Maybe they weren’t all that stupid back then, after all.

The Presidency

By the time Lincoln was sworn in, seven southern states had bolted from the Union. Facing the Republic’s gravest crisis, he assumed office with no executive experience, forced to govern from an untenable position. One slight overstep and the border states would join the South, ending all hope of reunification. When hostilities broke out, the North lost far more battles than it won, forcing all and sundry to second-guess his leadership. As the terrible carnage mounted, much of the population lost its resolve, leaving Lincoln with a very weak bargaining hand. When he pressed his position harder, rebellion threatened to erupt on the home front. Few believed there would be a successful conclusion to the war. No one thought he could be reelected.

Of all things, a lifetime of living with depression admirably prepared him for the task. He possessed both the intestinal fortitude and the moral will. And the insights he had acquired from a lifetime of sorrow seemed to connect him to a higher power. As Joshua Shenk explains, over the course of his adulthood, Lincoln passed from fear to engagement to transcendence.

In other words, having decided that he WOULD live, he then decided HOW to live. When faced with the challenge of a lifetime, he proved more than ready.

But first came more personal tragedy. During his term of office, his favorite son, Willie, died. Of his four sons, only one would live to adulthood. On learning of the death of Willie, he wept convulsively.

In 1862, Lincoln deviated from a previously-held position by proposing to his cabinet the emancipation of slaves from all Union-held southern territory. The move risked alienating the border states, but would serve to give the war a higher moral purpose. Nevertheless, Lincoln entertained no delusions about whose side God was on. Death had visited far too many northern households for him to believe that the Almighty was playing favorites. "My greatest concern is to be on God's side," he advised a colleague.

The Emancipation Proclamation would be the first step toward universal freedom and enfranchisement. Soon after, Joshua Speed would pay a visit, and Lincoln would remind him of their conversation some twenty years earlier, when only his desire to accomplish something great gave him the will to live.

"I believe in this measure my fondest hopes will be realized," he confided to his friend.


On assuming his second term of office, Lincoln spoke the finest words ever uttered in the English tongue:

With malice toward none, with charity for all, with firmness in the right, as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation's wounds.

He had six weeks to live, his last days filled with a transcendent lightness of being. It was as if, his mission on earth accomplished, he were ready to be taken up into heaven. On April 14, 1865, a man with a gun obliged. Now he belonged to the ages.

Final Words

In Lincoln’s depressions, we see the illness in its full destructive horror, one that nearly succeeded in cutting short the life of a promising young man and made the rest of his existence miserable. This is the side of depression with which we can all unfortunately identify. But we also see an aspect to his depressions that equally resonates with us – how our suffering can strengthen us, ennoble us, and embolden us, often to achieve the impossible.

Our sense of achievement need not be the same as Lincoln’s, nor for that matter what our families may expect of us. It is simply enough that we survive from day to day with the kind of grace that defines courage. Believe me, if Lincoln were to visit you right now, he might admonish you to make your bed, but he would do it in the way of a funny story. And he would let you know how proud he is of you - no doubt about it. Take heart. Lincoln lives in us all. Walk tall.

Friday, February 17, 2012

Is Bereavement Part of Depression? And What the Hell is Depression, Anyway?

Willa Goodfellow’s latest Prozac Monologues piece raises the very important discussion about how bereavement fits (or not) into depression. Ronald Pies, one of the two principal figures behind the proposed DSM-5 “bereavement exclusion” to the depression diagnosis, has left a comment.

The discussion is framed in such a way that the nominal topic - bereavement - holds the key to the real issue, namely whether any two people can actually agree on what depression is all about. What about depression-like behavior?

Some background: The DSM-IV expressly rules out the depression diagnosis if the symptoms are attributable to bereavement for a period of two months or less. The DSM-5, due out in 2013, would drop this exclusion. This has created the mistaken notion that the DSM-5 is proposing to turn bereavement into a psychiatric illness. Allen Frances, who oversaw the DSM-IV, recently told the NY Times that “the revisions will medicalize normality.”

Let’s turn to what the DSM-5 is actually proposing. In the updated depression diagnosis, the symptom checklist would stay the same. In the fine print below, this gets the axe:

“The symptoms are not better accounted for by Bereavement ...”

Willa’s post sees this as the last piece in restoring the complete depression diagnosis. She points out that the DSMs I and II attempted to separate out depressions they saw as situational (exogenous) from those they saw as biological (endogenous). The DSM-III abolished this distinction, essentially viewing a depression as a depression, but left in bereavement as an exception. The DSM-IV continued with this.

Willa asks us to view depression as something that happens when life throws too much at us, a point of view backed by some very impressive brain science. Some of us may be genetically resilient, but others (namely, us) prove highly vulnerable, owing to a hyperactive stress response. Says Willa:

What difference does it make whether the one damn thing too many is loss of a job or loss of a loved one?  It's still one damn thing too many.  And doctors need to take time to figure out what is going on with the person sitting in the office on her last nerve, not say, “There, there. You'll feel better in a couple months.”

As Dr Pies’ says in his comment:

[If] it looks like a duck, walks like a duck, and quacks like a duck, it's likely to be a duck, until proved otherwise. That is: if a patient shows up in the doctor's office meeting the full symptom and duration criteria for Major Depressive Disorder (MDD); but happens to have lost a loved one within the past two months, we should not withhold the diagnosis of MDD, simply because it occurs in the context of bereavement.

Are we clear on this? Good. Now let’s muddy it up. In an article on my website, Placing Depression in Context, I too observe the old clinical vs situational distinction, with reference to the DSMs I and II, and like Willa I view the distinction as naive and unscientific. But, nevertheless, I also see merit in bringing back some of the old reasoning. As I put it:

The endogenous-exogenous distinction does encourage us to examine where our depression might be coming from. If your marriage is falling apart, for instance, or your situation at work is going badly, it is obviously worth exploring this association. Sort of like investigating whether a person with a pulmonary disorder is working in an asbestos mine. For some crazy reason, the "modern" DSM-III of 1980 and its successors didn't think this was important.

I also looked at normal vs abnormal. In other words, are some of our depressions a normal reaction to an abnormal situation? Aren’t we supposed to feel depressed when we have lost a loved one? Moreover, if life is getting to be too much for us, our depressions may be telling us that we may need to make an immediate course correction. From my article:

This is straight out of evolutionary psychology. Depression has been called the end of denial. The rose-colored glasses come off. Reality takes over. Maybe instead of banging your head against the same wall - again and again and again - you need to cut loose destructive friends, bail out of a bad relationship, rethink that toxic work environment.

Listen to your depression. It may be an unwelcome guest in your brain, but it is definitely telling you something.

But my article also describes a situational depression I found myself in back in 2004, one that very easily could have led to a clinical depression. I simply did not have the luxury of leaving things to chance, not with my vulnerable brain. I immediately changed my routines and found a new project to work on.

In other words, I was feeling depressed. I needed to act right now.

This is precisely Willa’s point. Prior to reading her piece, I was on the side of not changing the bereavement exclusion. Now I’m teetering the other way. But this is because Willa’s piece challenged me to rethink depression, not bereavement. Depression is never what we think we think it is. Something to think about ....

Thursday, February 16, 2012

Willa Goodfellow's Prozac Monologues: Still Going Strong

I’ve been telling people for years that my beat covers everything from God to neurons. A month or two ago, I incorporated “From God to Neurons” into the subtitle of Knowledge is Necessity. In mid-2009, I had the pleasure of discovering online the other person on the planet blogging from God to neurons, Willa Goodfellow.

Willa was only a few months into her vastly wise and funny and totally unique Prozac Monologues. My hypomanic delight over my find was muted by my ever-faithful depressive realism. As I put it in a review at the time:

"Promising bloggers have an unfortunate tendency to burn out, so I urge all of you to drop a comment on her blog site offering encouragement. To Willa: It's very easy for bloggers to get discouraged, particularly when dealing with depression. But clearly we need you. Stick with it."

Willa stuck with it, and we established a great online friendship. Last summer, I had the pleasure of meeting her face-to-face at the NAMI national convention in Chicago. Think of below as a Willa sampler from the past several months. Enjoy, then check her out for real ...

Yes, we ARE getting sicker.  We live in times that make us sick.  We struggle to pay bills while our bosses speed up the assembly line.  Those of us who don't get laid off can't quit, because we can't afford health insurance.  Our support systems, extended family, neighborhoods, religious communities, social organizations - the buffers of stress - have been ripped away, replaced by reality TV and Facebook hysteria.


Keep skunks and bankers at a distance.

Live a good and honorable life.  Then when you get older and think back, you'll enjoy it a second time.

Timing has a lot to do with the outcome of a rain dance.

If you get to thinking you're a person of some influence, try ordering somebody else's dog around.


... This is why, if your antidepressant works for you, you are just plain lucky.  It happens to treat the problem in your particular brain.  Most of the time, it treats somebody else's problem.


While God was blessing Tim Tebow's hard work on Sunday afternoon, 720 children around the world died of hunger.  270 people committed suicide.  Two of them, by the way, were veterans of the United States Armed Forces.

That was before overtime.  Good thing overtime was short, huh?

So on Monday morning, nearly 1000 mothers were asking, If God could help Tim complete that pass, couldn't he have paid some attention to my child?  Billions still listen for their answer.


For the Israelites, the Babylonian Exile resulted in an explosion of creativity, poetry, philosophy, history, new forms of worship, the legal code, and the development of a religion that was larger than their prior notions of land=success=God's favor.  They came up with a religion that could handle exile, handle loss.  It could travel and face the future.

Their brains found new patterns. ...


I'm into changing my brain.  In that mass of electrical wiring, some potentially healthy pathways are blocked by the detritus of dead dendrites.  Other destructive pathways are carved into canyons of well-worn automatic responses.

Changing my brain will take time.  It is taking decades.  It will take at least another blogpost.


From the Damned-If-You-Do-And-Damned-If-You-Don't Department, the medications for schizophrenia and bipolar mostly reduce the positive symptoms (delusions in the case of schizophrenia, high energy in bipolar - the symptoms that scare your families and your care providers who write the prescriptions).  They tend to increase the negative symptoms (thereby relieving the anxieties of your families and your care providers who write the prescriptions), providing that synergistic effect that nails you to the sofa.


Evidently inspired by Fox News, Merry Christmas is no longer an expression of joy and good cheer, but a battle cry against the First Amendment and the great American experiment of freedom and tolerance of difference.


Inevitably, certain symptoms get more attention than others.  Psychiatrists are not concerned when patients sleep too much, do an astounding amount of work in three days or die twenty-five years before our natural lifespan due to complications of obesity, as long as we don't have hallucinations or delusions or try to end our misery by self-harm.

It's all about the descriptors, and how nervous they make people. ...

It's like, the DSM tells you what color the car is and how many cup holders it has.  Big Pharma has made a lot of money tinkering with the placement of the cup holders.  Meanwhile, what patients want to know and what scientists actually are working on nowadays is, what's under the hood?


Mahatma Gandhi was not the first freedom fighter.  But he is the great theoretician.  He gave us the map.

First they ignore you.
Then they laugh at you.
Then they fight you.
Then you win.

Four simple steps.  The good news -- we have already taken the first.  Got that one down pat.

Go to Prozac Monologues ...

Wednesday, February 15, 2012

Revisiting the Normal vs Crazy Thing

Last night, I had a nightmare that I danced like a white man. This was way worse than my recurring dream where I’m married to Sarah Palin. Naturally, it was a huge relief to wake up and - oh crap! - well, at least I’m bipolar.

Most of you know what I’m talking about. We have a different way of perceiving reality, which of course affects our behavior. Too often, the result is outsider status. No one wants that. On the other hand, I bet no one ever told you this: “You’ll really love So-and-So! He’s so normal!”

Funny thing about our doctors. They may inform us that they will have us back to normal in no time, but they never actually say, “We’ll have you normal again.”

“Normal” is a reference to the status quo, how things are going “out there.” This is the world we need to learn to function in. But we don’t necessarily have to be “normal” to function in “normal.” This is hardly a condition we would aspire to. I always sort of knew this, but the light bulb went off last year when I read Nassir Ghaemi’s 2011 “A First-Rate Madness.” Dr Ghaemi pointed out that normal merely represents a statistical average and hardly an ideal.

How about crazy, then? I love that 1997 Apple ad. “Here’s to the crazy ones,” it starts out. "The misfits. The rebels. The troublemakers. The round pegs in the square holes.”

We see short clips of Einstein, Edison, Amelia Earhart, and others. These are “the ones who see things differently. ... They change things. They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do."

I keep coming back to crazy vs normal again and again. What prompted today’s piece is a comment from Liz in response to my recent repost on Darwin and evolutionary psychiatry:

I have been struggling for a long time to try and figure out how it is that bipolar disorder was somehow an evolutionary advantage. This comment of yours really hit home and brought tears to my eyes:

"I like to contend that it took a crazy person to run into a burning forest and enthusiastically bring a flaming souvenir back to the cave."

I know that as a bipolar person I am able to experience a different reality and range of emotions than people who have chemically "balanced" brains. It's helpful to hear an anecdote about how this difference in reality perception can actually make being bipolar useful rather than a burden.

Liz, I hear you. We are a minority surrounded by the chronically normal. It’s not easy living in a world where everyone dances like a white man. The only thing worse is actually dancing like a white man.

Further reading from mcmanweb

Normal - Highly Over-Rated
Psychic Perception
You See Four; I See 28

Tuesday, February 14, 2012

In Memoriam: Charles Sakai

I just learned that someone dear to me, Charles Sakai, passed away two or three weeks ago. Charles was a comrade-in-arms - mental health advocate, history buff, and Mahler fan. We’d been in each other’s lives for at least ten years, the last three or so as Facebook friends.

We met online sometime in the very early days of my writing about my illness, around 2000, and exchanged emails sporadically.

We met face-to-face in 2003 at a DBSA conference in Long Beach. Both of us had signed up for the talent show. I did a tap-dance number. That’s right, I really tap-danced. I can’t even begin to describe my inner wrestling match with my social anxiety as I got up on that stage. Charles, by contrast, was a natural. He sang karaoke in an unforgettable off-key voice, but the crazy thing was he seemed to be channeling the entire positive life force of the galaxy as he was doing it. It was an amazing performance. Needless to say, he drew the loudest applause of the evening.

We were also participants in a talent show together the next time DBSA came to California in 2006. Again, he brought down the house. The last time we saw each other was the last DBSA conference I attended, in Orlando in 2007. I was one of the break-out speakers, and for my talk I unpacked my didgeridoo.

The purpose of the didgeridoo in my talk was to illustrate the benefits of settling into the kind of stop-and-smell-the-roses state that is vital to managing stress and maintaining wellness. I had only just taken up the didgeridoo three or four months earlier, and about all I could do with it was drone a single monotonous tone. But I thought this would be good enough to lead my audience through a guided meditation.

I think I succeeded, instead, in mystifying just about everyone in the room. My audience was quiet when I stopped, the response I anticipated with a meditative exercise. Suddenly, there was Charles, bursting into loud applause, carrying on as if he had just heard Louis Armstrong reincarnated belting out West End Blues. Charles came up to me right after my talk, enthusiastically snapping photos of me with my didge and posing for shots with me.

Gotta love this guy.

I last saw him as the conference was breaking up. We would, of course, see each other at the next conference.

I finally figured out how to use Facebook sometime in 2008. Charles was there, constantly encouraging me, giving me a reason to keep going through all my constant down periods when I would have gladly pulled the plug on writing about mental health in exchange for spending the rest of my life rolling pizza dough. I would post a link to my latest blog piece, he would reply with a “like” or a comment.

Okay, not when my posts revealed my politically liberal tendencies. But I was delighted to discover we both resonated to a lot of the same off-beat stuff, such as military history and Mahler.

I can’t remember precisely when I shared on Facebook a YouTube video of a snippet of a Mahler symphony, but there was Charles, enthusiastically responding. Holy cow! both of us seemed to be saying at once. You’re a Mahler fan, too?

To the uninitiated, Mahler is a cult. Mahler fans worldwide share a special bond. Not any old special bond. A special special bond.

Charles, of course, would keep asking if I would be attending this DBSA conference or that DBSA conference. I would keep informing him, no, not this year, maybe next year. We would also talk about my visiting Colorado Springs. Charles was very involved with DBSA there, which has a very active chapter with lots of great people I have bonded with over the years.

I kept thinking that one of these days I would get to Colorado Springs. Alas ...

But we still had Facebook. I recall enthusiastically informing him late last year that I had purchased a ticket to see Dudamel conduct the LA Phil in Mahler’s Sixth Symphony. He responded with equal enthusiasm.

Then, soon after, his “likes” and “comments” stopped. I didn’t think too much of it at first. People go into hibernation. Maybe my liberal politics was getting too much for him. The Presidential primary season was in full swing, and I was making no secret of the fact that I considered Republicanism a diagnosable illness. Surely, he would be back.

Two weeks ago, I attended my Mahler concert. It was the most profound musical experience of my life. As soon as I got in the door, I was posting on Facebook. Not just one post. A status post, two YouTube snippets, a link to a just uploaded rerun of an old Mahler blog. Charles will love this, I thought.

No “like.” No comments.

A week went by. Two. I was going through one of my rolling pizza dough moments. Where was Charles? This morning I went to the wall of his Facebook page. Someone had posted:

For everyone that was unaware of what happened to Charles, he got sick, so they took him to the ER and then he was put in ICU in Denver. One week later he died. He had a very advanced form of cancer. By the time he found out, it was too late ...

Now I know. Charles, you were there at the Mahler with me. You had to be. I was thinking of you the whole time. It was one hell of a concert, wasn’t it?

Monday, February 13, 2012

Can Integrative Psychiatry Save Psychiatry?

Integrative psychiatry involves incorporating complementary and alternative medicine (CAM) into clinical practice. I first came across the term in mid-2003 at a two-day conference, “Non-Pharmaceutical Approaches to Mental Disorders” staged in Pasadena by the nonprofit organization, Safe Harbor.

Two weeks earlier, I had attended the six-day American Psychiatric Association annual meeting in San Francisco. The main event there was Pharma-driven psychiatry, but the brain science on display signaled a new paradigm in the making. Only one session (to my knowledge) involved vitamins and supplements. This wasn’t going to change.

So, here we are, nearly nine years later. Psychiatry is facing a major identity crisis, and Safe Harbor keeps soldiering on. Safe Harbor was founded by businessman Dan Stradford, who lost the dad he knew to mental illness. Medical treatment failed to bring that father back. There had to be a better way.

Several months after the conference, I ran into Dan at a NAMI national convention in Cincinnati. I was outside, taking a break, enjoying the sun. I greeted Dan walking past, and he sat down beside me. It looked like he needed a break, as well. One of the loneliest feelings in the world is being outside of your time zone when exhaustion overtakes you. I think Dan was having one of those moments.

In vulnerability lies strength.

Safe Harbor just released the 108-page “The Flying Publisher Guide to Complementary and Alternative Treatments in Psychiatry,” available as a free download. Dan is one of the four authors. The document gives a good run-down on lifestyle, nutrition, mindfulness, and other things we need to consider incorporating into our recovery. Of particular interest is a chapter co-authored by Hyla Cass (pictured here) entitled, “The Integrative Psychiatrist.”

Dr Cass is another person I met at the Safe Harbor conference. A former assistant professor at UCLA, she now runs her own private practice, appears regularly in the media, and markets her own line of supplements. Yes, this raises the same kind of concerns as MDs financially linked to Pharma, but let’s focus on what she says:

“During my residency at Cedars-Sinai/UCLA Medical Center,” she writes, “I eventually found that the standard ‘couch and Prozac’ combination of psychoanalytic and pharmacological treatments had their limitations.” This would lead her down the path of nutrition and lifestyle. Depressive or anxious or other symptoms, she found, could be related to such things as low blood sugar, viral and fungal infections, hormonal imbalances, allergies, toxins, and specific nutrient deficiencies.

The no-brainer approach to this is a generous order of lab tests. Dr Cass cites one instance of a 55-year-old woman who arrived at her practice being treated by her internist for numerous physical complaints and by her psychiatrist for depression, anxiety, and insomnia. A lab test revealed a magnesium deficiency.

Dr Cass’ exams include a standard range of screenings that measure for anemia, thyroid, cholesterol, and so on. In addition, depending on the patient’s symptoms and information she has gathered, she will order additional labs that measure for nutritional deficiencies, toxic minerals, and allergies.

Every patient fills out an inventory checking for stress, depression, anxiety, and sleep, plus a far more involved questionnaire that screens for symptoms that suggest issues with lifestyle, brain chemistry, thyroid function, adrenal function, blood sugar, digestive imbalances, toxin overload, headaches, arthritis, and osteoporosis, plus men-only and women-only problems.

In addition, Dr Cass screens for personal and family histories. According to Dr Cass:

From the patient information, physical assessment, and labs, a picture begins to emerge. While the client could be primarily suffering from stress, where lifestyle changes or counseling would be in order, more often I find physical issues - commonly a number of them - impacting behavior, emotions, and cognitive function.

When these issues can be pinned down (such as hormonal imbalances), the solutions may present themselves. Various specific nutritional remedies tend to be her first choice, her credo being: “Apply a continuum of treatments, always beginning with the safest, most natural, and most benign.” Medications as needed are also part of her toolbox. Hence the “integrative” in integrative psychiatry.

In essence, what do we make of a depressed individual with a vitamin B deficiency? Is it depression? Or is the vitamin B deficiency the real issue? These are the very same types of questions brain scientists are asking, namely: what is really going on? Sensitivity to certain substances? A glitch in the wiring? How about what is going on in the person’s life right now? How about family? On and on.

Our new understanding is screaming for new diagnostics. The kind of lab screens and surveys that Dr Cass employs, but also the type of gene scans and qEEGs that are coming on the scene. And - always, always, always - sitting down and listening to the patient. High tech with human touch. Integrative psychiatry - we are long overdue.

Sunday, February 12, 2012

Charles Darwin and Evolutionary Psychiatry

In honor of Darwin's 203rd birthday, from mcmanweb ...

Here's an interesting fact: Peacock tails drove Darwin crazy. The sight of one "makes me sick," he wrote. These feathered accessories played havoc with his work-in-progress theory of natural selection. Surely, any bird stupid enough to flaunt its colors in the wild wouldn't live long enough to mate.

Darwin's solution seems obvious enough today, but back in the nineteenth century it was a scientific breakthrough, a work of genius. The showy tails, he figured out, were chick magnets. The flashier, the better. The well-endowed cock, so to speak, won the right to make a deposit. The bird's genes would live on, even if its owner's days were numbered.

Evolutionary biologists refer to this as a trade-off. A high fever, for instance, may aid in the destruction of deadly pathogens, and without the inconvenience of coughing we would all likely die from pneumonia. Take away our ability to experience pain and we would never know our appendix has burst. The sickle cell gene, in turn, is protection against malaria.

A Darwinian Explanation for Mental Illness

Fine. But how does Darwin apply to mental illness? According to evolutionary biologist Randolph Nesse MD of the University of Michigan:

Psychiatrists still act as if all anxiety, sadness, and jealousy is abnormal and they don't yet look for the selective advantages of genes that predispose to schizophrenia and bipolar disorder.

At the 2005 American Psychiatric Association annual meeting, I heard Dr Nesse talk about the selective advantage in anxiety. Obviously, sufficiently anxious cave men and women were able to steer clear of saber-toothed tigers long enough to find an opportunity to pass on their genes to the next generation.

Dr Nesse asks us to imagine a distant ancestor of ours at an ancient watering hole. The poor guy hears a sound behind him. A lion? A monkey? Even if it’s just a mouse, panicking first and thinking later is not such a half-bad idea.

Anxiety traits are no mere artifacts of an earlier age. Anxiety is crucial to marshaling our wits. We could never survive one day in traffic without it, let alone the full range of personal interactions.

Dr Nesse compared the brain's limbic system to a smoke detector that is programmed to deliver 1000 false alarms for every genuine alert. The false alarms are the price of survival. Better to be too anxious.

Now imagine modern man in the supermarket having a panic attack while reaching for a bottle of water. The seriously anxious, it turns out, have hyper-sensitive smoke detectors. The false alarms and the hyper-sensitive in our midst tend to blind us to the fact that a certain degree of anxiety is good, that we would fail to exist as a species without it.
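Dr Nesse's smoke-detector logic boils down to simple expected-cost arithmetic: panic is warranted whenever a threat's probability times its cost outweighs the small fixed cost of a false alarm. Here is a minimal back-of-envelope sketch, with purely illustrative numbers of my own choosing (none of them come from Dr Nesse):

```python
# Back-of-envelope version of Nesse's "smoke detector principle":
# an alarm (panic) is worth triggering whenever the expected cost of
# ignoring a possible threat exceeds the small fixed cost of panicking.
# All numbers below are illustrative, not from Nesse.

def alarm_pays_off(p_threat, cost_of_threat, cost_of_alarm):
    """Return True if sounding the alarm has the lower expected cost."""
    return p_threat * cost_of_threat > cost_of_alarm

# A lion behind you is catastrophic (say 10,000 units); panicking is
# cheap (say 10 units). Even a 1-in-500 chance justifies the panic:
print(alarm_pays_off(1 / 500, 10_000, 10))  # True -> panic first, think later

# With those costs, panic pays off at any threat probability above
# 10 / 10,000 = 0.001 -- roughly a thousand false alarms per real lion.
threshold = 10 / 10_000
print(threshold)  # 0.001
```

With a catastrophe a thousand times costlier than a panic, the break-even probability works out to 1 in 1,000 - which is exactly why a well-calibrated alarm fires about a thousand times for every real lion.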

An application of evolutionary biology is Darwinian medicine. A medical doctor, for instance, might want to think twice before prescribing something to lower a patient’s temperature. With panic attacks, Dr Nesse has had success helping patients realize that their response is not necessarily abnormal. Once that happens, the power of the panic attack often dissipates.

The Darwinian Bipolar Advantage

Our behaviors and emotions, according to evolutionary psychiatry, are adaptations the mind has made to recurring situations. In making a Darwinian case for bipolar, it’s easy to imagine highly energetic and productive and creative types having a selective advantage over their more mundane kinfolk. Think of mania lite. Passing on the risk of more serious manifestations was an acceptable trade-off.

I like to contend that it took a crazy person to run into a burning forest and enthusiastically bring a flaming souvenir back to the cave, raving on about the glories of barbeque. I’m sure this individual's reward was summary eviction by an enraged spouse. Ah, the price we have to pay. It’s never been easy being bipolar.

In my version of the story, the two made up and lived long enough to pass on their traits to the next generation, but only after one of them arrived at the concept of putting the meat on a spit rather than holding it bare-handed over the open flame.

Or bipolar could be a lot more elemental. The illness could be an adaptation to changes in the seasons. Think seasonal affective disorder. Think of a very long cycle. (Goodwin and Jamison refer to this in the second edition of “Manic-Depressive Illness.”)

The Darwinian Depression Advantage

But what about depression? Surely, there can be no selective advantage here. Think again. For one, depression may amount to a failure of denial. Depression is when the rose-colored glasses come off, when reality sets in. It opens the way to acceptance, to setting new goals and moving on with our lives.

Also, sometimes it’s helpful to be too depressed to press our luck. If mania is all about daring, depression is about caution. The daring have an advantage in life's ultimate prize, the opportunity to mate. So do the cautious.

Depression also provides an opportunity for regrouping and recouping, not to mention a time of introspection and reflection. Think of depression as an enforced time-out. In its own perverse way, depression may set the stage for needed psychic healing.

As with anxiety and mania, we are talking more benign manifestations. The more virulent versions of depression, it seems, are part of the price we have to pay.


The Schizophrenia Question

Schizophrenia is far too horrific an illness for us to see any obvious selective advantage. Yet, the culprit genes have been transmitted from generation to generation, even in Einstein's family. What gives?

First, it is not helpful to look upon schizophrenia as a simple disease. About a hundred suspect genes have been fingered. One of these genes - COMT - has a variation that enhances thought processing in one context but disrupts it in another. Another gene - DISC1 - helps integrate neurons into the mature brain.

In this context, schizophrenia can be seen as the breakdown in the processes responsible for building and maintaining a complex brain.

Schizophrenia may also be seen as part of a spectrum. At the schizophrenia extreme, the brain is far too active for its own good, characterized by runaway thoughts such as psychotic delusions. A lighter version may well be schizotypal personality disorder, marked by various oddball behaviors and "magical thinking." Tone this down a bit more and we may be talking about eccentrics who think outside the box.

Nancy Andreasen MD, PhD of the University of Iowa describes Einstein as having schizotypal traits; he also had a son with schizophrenia. Her original enquiry into creativity involved looking for a schizophrenia connection (she also cites Newton and James Watson), but her focus very quickly shifted to bipolar.

There may be another aspect to "schizophrenia lite." The book, "A Beautiful Mind," chronicles the life of Nobel Laureate John Nash. His breakthrough accomplishments occurred as a young adult, before his outbreak of schizophrenia. But as the book makes clear, there is no way we can describe an apparently healthy John Nash as "normal." Even in a profession notorious for its eccentrics, Nash was very much an outsider.

We tend to think of mental illness as a complete break with reality or rationality, but these breaks don't just happen overnight. Subtle symptoms may manifest many years earlier, what the experts describe as "prodromal" states. Could Nash's "beautiful mind" be attributed to such a state? Who knows?

Working With What We're Stuck With

"Human biology," says Dr Nesse, "is designed for stone age conditions." Or, as Leda Cosmides and John Tooby of the University of California at Santa Barbara put it, "our modern skulls house a stone age mind."

In other words, we are the beneficiaries of a group of genes that did not anticipate the demands of modern living. Were we mere machines with replaceable parts, we could simply send our brains back to the manufacturer for a retooling. Instead, we are forced to work with what we're stuck with.

Dr Nesse cites the example of the eye. Those who champion intelligent design point to the wonders of the eye in support of their theory that creation is way too complicated to be left to chance.

But look closely at the eye, Dr Nesse advises. We have wires running between the lens and where the image is processed. No camera manufacturer would be dumb enough to do that. Plus the eye has a blind spot where the retina meets the optic nerve.

The eye of the octopus, Dr Nesse points out, has a far better "design." Through pure chance, he says, we and practically all the rest of the animal kingdom got stuck with the inferior version.

Scientists are in virtually unanimous agreement on evolution's main points, but evolutionary psychiatry is a speculative enterprise, not capable of definitive proofs. Indeed, a legitimate argument can be made that we are retrofitting psychiatry to conform to evolutionary precepts.

Then again, a much stronger case can be made that our behavior makes no sense without taking evolution into account. Instead of viewing all mental illness as solely destructive, we are forced to consider its advantages. And in looking at the advantages, we find potential in our own worth.

Call it the twenty-first century Darwinian challenge. Our ability to feel on levels deeper and higher than the rest of the population, crippling as it may be, has also given wings to our thoughts, ones that motivated our distant ancestors to climb out of their cozy rock condos in the first place and now seem destined to have us reach for the stars.

Saturday, February 11, 2012

Rerun: My Visit to the Local Creationist Museum (Seriously, I'm Not Making This Up)

In honor of Charles Darwin's birthday tomorrow, this piece from Dec 2010 ...

Believe it or not, this museum is only 10 or 12 miles from my home, outside San Diego.

This journey through time will be a very short one, as the entire universe, earth included, according to creationist belief, is only 6,000 years old.

This works way better than carbon-14 dating.

I missed whether it was a standard day or a metric day.

In support of a worldwide catastrophe, creationism cites the same geological evidence as science, though with some rather significant differences in interpretation.

Noah's sons went their separate ways, assisted by land bridges spanning the oceans, thanks to a Flood-induced ice age. The animals from the Ark dispersed along these same land bridges - though perhaps not whales and other sea creatures.

And I thought Neanderthals survived in the form of Tea Party followers.

I wish I had our high school class valedictorian, Karl Van Bibber, to explain this to me.

If I can follow the logic, mutations (which are all bad) get filtered out of the gene pool, keeping creation constant. There is, however, the mother-in-law exception.

That's right, evolution is just a religion, which makes creationism the true science. Why aren't our kids being taught this in school?

The "bad fruits" of evolution. No good can come from allowing people to think for themselves. That's why we need knowledgeable people in authority to do our thinking for us.

Evolution apparently played a part in the Final Solution. Actually, murderous bigots were killing Jews en masse long before Darwin. The Catholic Church even made saints out of some of these medieval pre-Hitlers. (Sorry, I was trying really hard to keep this objective.)

A browse through the museum's book store. No, I didn't Photoshop the book title.

Friday, February 10, 2012

Shedding Light on Brain Research: A Scientist Responds to Whitaker

Last week, I wrote a piece highly critical of a post Robert Whitaker published on his blog, Mad in America. His post attacked a very recent Scripps Research Institute study, which became the basis of his own editorializing on the research agenda of the NIMH, namely that “decades of such brain research has not produced any notable therapeutic payoff.”

My post noted that Whitaker had a point concerning one issue (namely, the need to control for the effects of psychiatric meds in genes-brain research), but that he had left out some critical information about the study and that his editorializing was way off-base.

You can check out Whitaker's post: Rethinking Brain Research in Psychiatry.

And my post: Robert Whitaker: Dangerous in America

Several days ago, I emailed Elizabeth Thomas PhD (pictured here), lead author of the study in question. I did not ask her to take sides. I simply directed her to both blogs (if she were morbidly curious about our food fight) and asked for points of clarification. Following is her response in full, published here with her permission ...

Hi, John

Thanks for your email. Yes, I was morbidly curious about your blogging battle with Mr. Whitaker and I did want to respond. Sorry this is long-winded ...

First to defend our work a bit. Like most researchers working with post-mortem brain tissue, we are aware that a confounding factor in post-mortem research on schizophrenia is the unknown effect of antipsychotic drugs, which are known to alter gene expression. (I have actually published two reviews on the topic of antipsychotic drugs and regulation of gene expression [1, 2]). It is an issue that cannot be avoided, as most if not all collected brains deemed “psychiatric” are due to information from a psychiatrist’s report and, hence, that patient would be receiving some type of medication.

Just FYI, to address this in our research, we typically do two things: 1) treat rodents with the drug in question, to look for effects on gene expression in the brain; and 2) perform correlation analysis between expression values in each human subject and the recorded drug dose of each subject. In our recent paper, we did provide drug information in Suppl. Table 1 for a portion of the subjects we studied; unfortunately, drug information was not available for the Harvard subjects.

Nonetheless, Mr. Whitaker is correct in that we should have discussed the potential effects of antipsychotic drug exposure in that paper, as we have in previous studies using post-mortem brains from some of these same subjects ([4, 5]). As it turns out, previous studies have looked at the effects of antipsychotic drugs on histone acetylation in rodent brain [5, 6], as Mr. Whitaker suggested should be done.

It was found that haloperidol, one of the most commonly used drugs, did not alter global histone acetylation in the brain, but could elevate a phospho-acetyl mark on histone H3 at a particular residue [5]. Another study found that clozapine and sulpiride could elicit small increases in acetylation of histone H3 [6]. Hence, the findings in our paper showing lower histone acetylation in patients, who in fact were treated with haloperidol or other D2 receptor antagonists, are unlikely due to drug treatment (if you want to use the rodent argument). I regret that we did not mention these studies in the current paper.

The finding that histone acetylation is lower at certain gene promoters is consistent with the lowered gene expression profiles observed for these genes in subjects with schizophrenia. On the whole, dozens of papers have shown that brains from patients with schizophrenia show substantial deficits in gene expression; this was the impetus for our studies investigating whether epigenetic mechanisms of gene regulation could be responsible for, or contributing to, this phenomenon.

Certainly, we cannot rule out that antipsychotic drugs could have an effect on gene expression in these subjects, but drug exposure is unlikely to explain the wide range of gene expression deficits detected. In any case, I do think Mr. Whitaker is correct about the importance of studies that would address the question of how psychotropic drugs may be affecting the developing brain, as many of these drugs are now given to younger patients.

Despite Mr. Whitaker’s claim that my response to our work was “the usual concluding pronouncement from such studies”, the reality is, in my opinion, that our findings provide a starting point to consider the only real new drug development for psychiatric disorders the field has seen in 50 years. Because we have shown that histone acetylation is lower in young subjects with schizophrenia, and that the acetylation marks we studied are known to govern gene regulation, the use of compounds that elevate histone acetylation (i.e. histone deacetylase [HDAC] inhibitors) could be useful for restoring abnormal histone acetylation patterns and accompanying gene expression deficits in schizophrenia, leading to improved clinical outcomes.

The possibility that these compounds could improve symptoms is supported by a recent study by Engmann et al., 2011 [7], who showed that the HDAC inhibitor MS-275 could rescue cognitive deficits in a mouse model of schizophrenia, and further, that the mechanism for these beneficial effects was restoration of histone H3K18 acetylation deficits in the mouse brain. Other ongoing studies are testing additional HDAC inhibitors in different rodent models of psychiatric disorders as well. (If my recent NIH application is funded, we will be testing novel HDAC inhibitors in a prenatal immune activation model of psychiatric disease.)

As for your question about testing whether psychotropic drugs alter epigenetic pathways: There are two studies, as I mentioned above, that have been published, although the drawbacks of these studies were that only short-term treatments were used and epigenetic changes at specific genomic loci were not tested (only global levels measured by Western blotting).

A more important issue, in my opinion, is whether epigenetic drugs, such as HDAC inhibitors, will truly represent a novel therapeutic avenue for psychiatric disorders. I would argue YES. New and improved HDAC inhibitors are currently being developed for various CNS disorders, and my prediction is that they will also prove to be beneficial in treating patients with psychiatric disorders.

While it is expected that such compounds will have some “to be determined” side effects, they are unlikely to cause the detrimental Parkinsonian symptoms and metabolic syndrome associated with the currently used antipsychotic medications. One recently completed clinical trial has shown improvement with valproate (an HDAC inhibitor) in schizophrenia, and several other trials are underway; for more information, see the clinicaltrials.gov website. (It must be noted that the currently FDA-approved HDAC inhibitors, such as valproate, are broadly acting compounds, unlike those in development, which would be more specific and hence less likely to cause unwanted side effects.)

1. Thomas, E.A. Molecular Profiling of Antipsychotic Drug Function: Convergent Mechanisms in the Pathology and Treatment of Psychiatric Disorders. Molecular Neurobiology 34:109-128 (2006).
2. Thomas, E.A. Transcriptomics of antipsychotic drug function: What have we learned from rodent studies? Current Psychopharmacology, In Press.
3. Narayan, S., Tang, B., Head, S.R., Gilmartin, T.J., Sutcliffe, J.G., Dean, B. and Thomas, E.A. Molecular Profiles of Schizophrenia in the CNS at Different Stages of Illness. Brain Research 1239:235-248 (2008).
4. Narayan, S., Head, S.R., Gilmartin, T.J., Dean, B. and Thomas, E.A. Evidence for Disruption of Sphingolipid Metabolism in Schizophrenia. Journal of Neuroscience Research 87:278-288 (2009).
5. Li, J., Guo, Y., Schroeder, F.A., Youngs, R.M., Schmidt, T.W., Ferris, C., Konradi, C. and Akbarian, S. Dopamine D2-like antagonists induce chromatin remodeling in striatal neurons through cyclic AMP-protein kinase A and NMDA receptor signaling. Journal of Neurochemistry 90:1117-1131 (2004).
6. Dong, E., Nelson, M., Grayson, D.R., Costa, E. and Guidotti, A. Clozapine and sulpiride but not haloperidol or olanzapine activate brain DNA demethylation. Proceedings of the National Academy of Sciences USA 105:13614-13619 (2008).
7. Engmann, O., Hortobágyi, T., Pidsley, R., Troakes, C., Bernstein, H.G., Kreutz, M.R., Mill, J., Nikolic, M. and Giese, K.P. Schizophrenia is associated with dysregulation of a Cdk5 activator that regulates synaptic protein expression and cognition. Brain 134:2408-2421 (2011).

Finally, thank you for supporting NIH funding for basic and medical research in your blog. We are definitely in dire need of it; without continued funding, we will not be able to address these critical questions that could help patients with psychiatric disorders.

Best wishes,

Elizabeth A. Thomas, Ph.D.
Associate Professor
Department of Molecular Biology
The Scripps Research Institute
3030 Science Park Rd, SP2030
La Jolla, CA  92037

Wednesday, February 8, 2012

The Cortical Factor: What Is Going On In Our Brains With Gay Marriage and All That?

Picking up from where we left off:

According to Robert Sapolsky of Stanford, the amygdala and the frontal cortex essentially regulate each other. The projections from the frontal cortex are inhibitory, as are the projections from the amygdala. In Sapolsky's words: “The frontal cortex is trying to get the amygdala to restrain itself. The amygdala is trying to get the frontal cortex to stop sermonizing at it.”

The two parts of the brain essentially work in opposition, and when the amygdala succeeds in silencing the frontal cortex, “that’s the world in which you are making astonishingly bad decisions. ... That’s the world of the amygdala getting very inaccurate rapid-fire information.” Out comes behavior that is seriously unregulated.

Eventually, in certain situations, the amygdala will habituate and learn not to be afraid, but only if the frontal cortex is healthy enough to convey the lesson.

Enter the anterior cingulate cortex (ACC), part of the cingulate cortex snaking beneath the outer cortices. This is a region of the brain implicated in empathy. Typically, in hypothetical exercises involving agonizing moral choices (such as: do you strangle a crying baby to save the lives of a group of people hiding from the Nazis?), those who make the cold-blooded utilitarian decision show the least activation in the ACC.

But life is more complicated than simple thought vs emotion, especially when the brain goes metaphorical on us. We are called upon to make judgments concerning abstract moral concepts. The problem is our brains did not evolve for doing symbols and metaphors. We are stuck with the old circuitry.

Thus, a test subject handed a warm drink in an elevator by a stranger will rate that individual as warm. Cold for a cold cup. This really happens. The brain mixes metaphor with reality.

Wait, it gets even weirder. Registering moral disgust? The insular cortex activates. This is the part of the brain that processes foul stimuli such as rotting fish. Contemplate the etymology of the word disgust. Further contemplate that every language on earth employs similar terms for the same phenomenon. Says Dr Sapolsky: “When humans came up with something as fancy as moral transgression, where are you going to stick the sense of outrage you feel? I know - let’s hijack the part of the brain that tells you you’re eating rotten food.”

Dr Sapolsky cites a number of studies, one of which includes people wanting to wash their hands a la Pontius Pilate after recounting some moral failing in their lives.

As you will recall from the previous piece, Dr Sapolsky mentioned that by age 5 there is already a correlation between socio-economic status and the thickness of the frontal cortex and its resting metabolic rate. No question about it - this stinks. Dr Sapolsky agrees. As he observed: “That is one of those factoids that should have people rioting at the barricades.”

But Presidential candidate Mitt Romney would disagree. As he recently declared: “I'm not concerned about the very poor.” (And no, this is not out of context.)

This is a guy who obviously has no problem digesting unpleasant factoids and who is not losing any sleep over it, but that’s mixing yet another metaphor. Could it be that Mitt is one of those low-activating anterior cingulate types?

Dr Sapolsky does not mention Republicans, but that shouldn’t stop us from making our own connections. Says Sapolsky, citing Nobel Laureate Elie Wiesel: “The opposite of love is not hate. The opposite of love is indifference.”

Dr Sapolsky points out that love and hate are physiologically very similar. From a strictly biological perspective (such as brain activation, heart rate, and so on) it is very difficult to tell the two apart. Indifference is the real evil, which is another good reason to register our disgust.

But hold on. We may be disgusted at people who are indifferent to injustice (who obviously lack the capacity to register a mental gustatory response), but then we have a whole class of liberal-haters who define themselves as being disgusted with the disgusted. Recall this from yet another Presidential candidate, Newt Gingrich, in reference to Occupy Wall Street: “Go get a job, right after you take a bath.”

Ah, another stinking Republican (using a bath metaphor at that) who makes liberals want to throw up. Confused? So, apparently, is everyone. Dr Sapolsky cites the work of Jonathan Haidt of the University of Virginia in support of the proposition that affective response drives moral decision-making rather than the other way around.

Say, for instance, a brother and a post-reproductive sister want to have a sexual relationship. Is it okay for them to have one in private? How about burning a flag and stomping on it? Or cutting up your dead pet and eating it?

I don’t know about you, but I definitely registered a high-level gut reaction to that last proposition. Fine, but can I come up with a rational answer in response to the question: “What’s wrong with that?”

This is essentially the same question Dr Haidt asked his test subjects. They, too, had trouble framing rational responses. Basically, on an emotional level, something doesn’t “feel right.” The frontal cortex is spinning its wheels, the gut makes the call. Eventually, the thinking parts of the brain lock in, but too often in a rationalizing display of post hoc rubber-stamping.

As Sapolsky explains: “We are dealing with a very ancient brain, one that is not very good yet at separating the limbic world from the cortical one.”

Maybe that’s why we need a judiciary to protect ourselves from our own inept decision-making. Yesterday, a federal appeals court struck down California’s infamous Proposition 8, a voter referendum passed in 2008 that banned gay marriage in the state.

Is there anything wrong with disgust? Of course not. Of all things, disgust is morally neutral. Whether we’re dealing with moral issues or food, this is basically a rotten fish reaction we are talking about. We need our rotten fish reactions. They drove the civil rights movement, which is why we need to be suspicious of low-activation anterior cingulate guys who lack the kind of empathy that makes us capable of moral outrage in the first place.

The problem is we often fail to couple thinking to our disgust. “What is wrong with that?” It’s a question we need to be asking - over and over and over.


This concludes my three-part series on The Cortical Factor, based on Lecture 18 from Robert Sapolsky’s 25-part video series. See Part I, Part II. Stay tuned for more of Sapolsky, probably in another week or two ...

Tuesday, February 7, 2012

The Cortical Factor: Developing the Optimum Brain

Take the teen-age brain - please!

First, a quick review: In our previous post, Robert Sapolsky of Stanford explained how the frontal cortex is about doing the harder thing, if it is the right thing to do. Essentially, the more developed cortical areas modulate our more primitive limbic impulses, including learned (and virtually automatic) behaviors that are no longer stored in the frontal cortex.

This tends to involve the frontal cortex, boosted by dopamine, amping up weaker neural circuits and inhibiting stronger ones. Those with cortical damage or dementia experience major system failures. Their brains default to the stronger circuits, even if these represent the wrong thing to do in the particular situation. They fail to stay on task. They give in to temptation. They fail to delay gratification in pursuit of the long-term reward.

REM sleep offers a spectacular example of the frontal cortex going off-line. We dream all kinds of crazy stuff. We do things in our dreams we would never contemplate doing awake, that is assuming we are adults. But there is a strange phenomenon called teen-agers.

The frontal cortex is the last part of the brain to fully develop - to fully form all the myelin on its axons, to get its full complement of synapses. It fully comes online for the first time at around age 25. Younger than that and we’re dealing with limbic systems with feet.

Interestingly enough, says Dr Sapolsky, because the frontal cortex is the last part of the brain to develop it is the part of the brain least constrained by genes and most sculpted by environment and experience.

So - take an adult and take a teen-ager. Assign each one a task and give a greater reward than anticipated. Dopamine levels go up, driving frontal metabolism, but a lot higher in the teen-ager. Then change things around. Do not give out the reward. Dopamine levels go down, only much lower in the teen-ager. Says Sapolsky:

The gyrations are much more extreme. The dopamine-driven metabolic changes in the frontal cortex are more dramatically large for reward, are more dramatically having the floor fall out from under it for lack of reward, for disappointment. It is a system that is essentially less regulated.

Major bummer: The frontal cortex loses neurons as it ages, which explains your grandmother telling you that your new hairdo looks rotten.

Can elevated resting metabolism in the frontal cortex be too much of a good thing? Dr Sapolsky gives the example of repressive personalities, individuals who are highly regimented, highly disciplined, not depressed or anxious, who do not express emotions very readily and are very bad at reading emotions in other people. “This is the roommate who always has all the work done three weeks before the due date.”

But low resting metabolism in the frontal cortex is not such a good thing, either. Think sociopath. Give a sociopath a routine brain task and they have to activate more of the frontal cortex than other people do.

Meanwhile, see how long a marshmallow on the table lasts with a four-year-old kid in the vicinity. Can the kid rein in his impulse with the promise there will be more marshmallows in 20 minutes if he leaves that one alone? In other words, how many frontal synapses do you have? Funny you should ask. The kids who held out the longest scored much higher on their SATs and other success measures years later.

Depressing fact: Already by age five there is a relationship between your socio-economic status and the thickness of your frontal cortex and its resting metabolic rate. What is that all about? This is the part of the brain that has one of the highest rates of receptors for glucocorticoids. Glucocorticoids are released in response to stress, and when too many of them are on the rampage neurons atrophy. Get born into the wrong family, be raised with the stress of poverty, and already by age five the size and activity of this part of the brain have taken a major hit.

Says Sapolsky: “That is one of those factoids that should have people rioting at the barricades.”

Instead, we have the phenomenon of Presidential candidate Mitt Romney, who recently declared: “I'm not concerned about the very poor.”

Have no fear, there is a neurobiological explanation for that. There is also an explanation for your moral outrage. More to come ...

Highly recommended: Dr Sapolsky's 25-part video lecture series, Human Behavioral Biology.

Monday, February 6, 2012

The Cortical Factor

I just finished viewing Lecture 18 in a 25-part video series by Robert Sapolsky of Stanford. Dr Sapolsky is to human behavior what Carl Sagan was to astronomy. There is no one better at explaining the topic to the general public than this man. It’s not even close. I first stumbled into Sapolsky in early 2003 and I’ve been something of a groupie ever since.

The video series I am watching is an actual undergraduate course in human behavior. I was looking forward to blogging on various highlights once I’d completed the series, but Lecture 18 represents the best exposition I’ve ever come across concerning how different parts of the brain talk to each other, so let’s get straight into it ...

Lower mammals have an olfactory hotline to the amygdala, the part of the brain in the limbic system that mediates fear and arousal and kickstarts fight or flight. Furry creatures are literally primed to react when they smell something funny. Mammalian limbic systems are standard issue for humans, except ours are wired to respond mainly to visual stimuli.

Ordinarily, the cortical areas of the brain are responsible for visual processing and sending info to the amygdala, but we don’t always have time to wait. There is a short-cut in the brain through the lateral geniculate (LG). The trade-off is that this information is less accurate. We’re more likely to make mistakes. As Dr Sapolsky explains, there is now strong evidence that this pathway is hyper-excitable in those with PTSD.

According to Dr Sapolsky, the frontal cortex should be regarded as part of the limbic system. Its role, essentially, “is getting you to do the harder thing when it’s the right thing to do,” as in behaving appropriately. Below is a screenshot of Dr Sapolsky illustrating in a very simple way two contrasting neural circuits. The circuit on the left has more axonal inputs going into its target neuron than the one on the right. This makes it easier to activate that particular neural pathway.

But what if this pathway represents doing the wrong thing? To offset this, the frontal cortex essentially massages both circuits, inhibiting the left and biasing (rather than causing) excitation in the right. To accomplish enhanced excitation, the frontal cortex gets a boost from dopamine via projections shooting out of the ventral tegmental area and nucleus accumbens. Dopamine acts as the fuel in goal-directed behavior.

One illustration of doing the harder thing would involve reciting the months backward. The frontal cortex needs to be on its toes in processing the task, but those with damage in this area may have trouble over-riding the more habitual forward-recitation response.

Over time, learned behavior becomes automatic and gets stored elsewhere in the brain. Thus, someone with Alzheimer’s may not know what decade it is but still knows how to knit. This brings us to the famous example of Phineas Gage.

According to Dr Sapolsky, they take away your neurobiology license if you fail to mention this guy. In 1848, while Gage was tamping down blasting powder during the construction of a railroad line, the powder exploded, sending the tamping rod through the side of his skull and out the top, taking out his left eye and emptying out most of his frontal cortex. Amazingly, because the rod cauterized his blood vessels, Gage was able to get up and walk a mile-and-a-half to the nearest doctor.

Gage achieved a partial recovery, but experienced major problems controlling his behavior, leading his physician to conclude that this part of the brain “reins in our animal energies.” Interestingly enough, about a quarter of those on death row have a history of concussive trauma to the head.

Doing the harder and right thing tends to involve delaying gratification and not giving in to temptation. Consider the m&m test. You hold five morsels in one hand and one in the other. The rule: reach for the five and you get the one, and vice-versa. People with cognitive impairments, even knowing the rule, may still reach for the five - they just can’t help it.

Lack of cortical input explains why our dreams make no sense. In REM sleep our frontal cortices are at their least active. That’s why in our dreams we do all sorts of things we would never want to do in real life. Thank heaven we’re merely dreaming. Imagine if we did some of that stuff while awake. Oops - sometimes we mess up, and we find ourselves living with the consequences.

Much more to come ...