
Archive for the ‘Humanities’ Category

What was Gomillion v. Lightfoot?

In America, American History, Arts & Letters, History, Humanities, Law, Politics, Southern History, The South on August 21, 2013 at 8:45 am

Allen Mendenhall

This piece originally appeared here in the Encyclopedia of Alabama.

In Gomillion v. Lightfoot, the U.S. Supreme Court ruled in 1960 that Tuskegee city officials had redrawn the city’s boundaries unconstitutionally to ensure the election of white candidates in the city’s political races. The case was one of several events that laid the foundation for the 1965 Voting Rights Act, which prohibited discriminatory voting practices. The case was named for Tuskegee Normal and Industrial Institute (present-day Tuskegee University) professor Charles G. Gomillion, the lead plaintiff, and Tuskegee mayor Philip M. Lightfoot, the lead defendant among other city officials.

Gomillion, dean of students and chair of the social sciences division at Tuskegee, for years had facilitated voter registration movements for blacks in Tuskegee. He learned in 1957 that several white citizens were promoting a bill in the state legislature to redefine the boundaries of the city to ensure election victories by whites in 1960. Resisting these efforts and urging others to oppose any referenda meant to disfranchise black voters, Gomillion and other activists appealed to the City Council, wrote to the County Commission, lobbied the state legislature, and published an open letter in the Montgomery Advertiser. Despite these efforts, Local Act No. 140, introduced by Samuel M. Engelhardt Jr., passed in the state legislature in 1957. It reconfigured the boundaries of the city from a simple square shape to a figure with 28 sides, removing from the city Tuskegee Institute and all but four or five of the nearly 400 black voters, but none of more than 1,300 white residents. Gomillion and the Tuskegee Civic Association treated this initial setback as an opportunity to institute legal proceedings and thereby to mobilize concerted political action.

Gomillion and other petitioners, black citizens of Alabama and residents (or former residents) of Tuskegee, alleged that the act violated the “due process” and “equal protection” clauses of the Fourteenth Amendment to the Constitution. They claimed that the redrawn city boundaries disfranchised black voters; therefore, they alleged, the act had a discriminatory purpose. In fact, the act’s author, Engelhardt, was executive secretary of the White Citizens’ Council of Alabama.

Tuskegee’s white citizens sought to change the city’s boundaries to head off the rising number of African Americans registering to vote. After World War II, local African Americans wanted to play a more active role in the city’s civic life, and whites became more determined to deny them that right. Redrawing the city’s boundaries had the unintended effect of uniting Tuskegee Institute’s African American intellectuals with the less educated African Americans living outside the sphere of the school. Some members of the school’s faculty realized that possessing advanced degrees ultimately conferred no special status on them among the city’s white establishment.

Initially, the U.S. District Court for the Middle District of Alabama, in Montgomery, headed by Judge Frank M. Johnson, dismissed the case, ruling that the state had the right to draw boundaries, a ruling that was upheld by the Court of Appeals for the Fifth Circuit in New Orleans. The case was argued before the Supreme Court on October 18 and 19, 1960. Gomillion did not travel to Washington, D.C., with the lawyers handling his side of the case. Veteran Alabama civil rights attorney Fred Gray and Robert L. Carter, lead counsel for the National Association for the Advancement of Colored People (NAACP), argued the case, with assistance from Arthur D. Shores, who provided additional legal counsel. They claimed that the state’s intent in the redistricting had been to discriminate covertly against African Americans.

On November 14, 1960, the Supreme Court rendered a unanimous decision in favor of the petitioners. Justice Felix Frankfurter, writing for the majority, held that the act violated the Fifteenth Amendment, which prohibits states from passing laws depriving citizens of the right to vote, and thus reversed the lower courts’ rulings. Frankfurter likewise dismissed the city’s appeal to generalities about state authority. He conceded that states retain extensive powers but held that they may not do whatever they please with municipalities. The case showed that all state powers were subject to limitations imposed by the U.S. Constitution; therefore, states were not insulated from federal judicial review when they jeopardized federally protected rights. In 1961, the results of the decision went into effect; under the direction of Judge Johnson, the gerrymandering was reversed and the original map was reinstituted.

Additional Resources

Elwood, William A. “An Interview with Charles G. Gomillion.” Callaloo 40 (Summer 1989): 576-99.

Gomillion, C. G. “The Negro Voter in the South.” Journal of Negro Education 26 (Summer 1957): 281-86.

Gomillion v. Lightfoot, 364 U.S. 339 (1960).

Norrell, Robert J. Reaping the Whirlwind: The Civil Rights Movement in Tuskegee. New York: Alfred A. Knopf, 1985.

Taper, Bernard. Gomillion versus Lightfoot: The Tuskegee Gerrymander Case. New York: McGraw-Hill, 1962.

Unmasking

In Arts & Letters, Creative Writing, Essays, Humanities, The South, Writing on August 14, 2013 at 8:45 am

This essay first appeared here in Kestrel: A Journal of Literature and Art.

Allen Mendenhall

There is no remembrance of former things; neither shall there be any remembrance of things that are to come with those that shall come after.

                                                                                           Qoheleth 1:11

Southerners are particular about the way they preserve their loved ones; they encourage embalming, for instance, although at one time they shunned it as unconsented-to tampering with the body.  Eventually someone decided, rather wisely, that the deceased, had they a choice, would like a genteel display of their “shell.”  This meant more than sanitization: it meant dressing the dead like ladies or gentlemen on their way to church.  Which is precisely where they were going—just before they were buried in the ground.  For the most part, Southerners don’t cremate.  (A preacher once told me that the Bible discourages cremation.)

In the South—more than in other regions—funerals are hierarchical affairs: one’s nearness to the deceased signifies one’s importance to the family.  This holds for the church and burial service and is especially true if the departed was popular in life.  Being closest to the deceased, pallbearers shoulder the weightiest burden.

Nowhere is decorum more important than at a funeral procession.  It’s unseemly for one who’s not party to the procession to fail to bow his head and arrange a grave face as the procession passes.  If you’re in a vehicle, you pull over to the curb and, so long as it isn’t dangerous to do so, take up the sidewalk as if on foot.  Quitting the vehicle is, in general, inadvisable if by the time you encounter the procession the hearse is no longer in sight.  Or if, alternatively, the weather doesn’t permit.  If you’re in line, the modus operandi is ecclesiastic—ordered from clergy, to immediate kin, to next-of-kin, to distant family, to friends, to the rest.  Losing your place in line is, accordingly, like losing your intimacy with the family, for whom these rituals are carried out.

I was eight when Great-Granddaddy died.  Mom piloted me before his open-casket and whispered, “That’s not Great-Granddaddy.  That’s just a shell.  Great-Granddaddy’s gone to heaven.”

I looked down at the thing, the shell, the facsimile that seemed uncannily human, and said to myself—perhaps out loud—“That’s not Great-Granddaddy.  That’s something else.”  But the thing appeared real, strange, so nearly alive that it repulsed me.  Its eyes, thank God, were closed, but its mannequin face, vacant and plastic, nauseated me.

Mom prodded me away, hollering at my cousin to take me outside.  My first brush with death, while necessary, had not imparted a healthy understanding of mortality.

My grandmother, Nina, tried to familiarize me with the inescapable while I was still a boy.  Instead of taking me to playgrounds, she took me to cemeteries for what she called “Southern preparations.”  These outings usually occurred on warm spring afternoons, when azaleas bloomed bright white and pink, when yellow jessamine vines crawled up walls and fences, when dogwoods yawned inflorescent, and when tulips, still un-beheaded, stretched with impeccable posture.  When, in short, nature was doing anything but dying.

Nina shared facts about various grave plots, giving the lowdown on so-and-so’s passing—“he died in Korea,” “he of AIDS,” “she during pregnancy,” and so forth.  When she finished, we fed the swans.

Which attacked me once.  I was standing on the riverbank, feeding the once-ugly ducklings by hand just as Nina had taught me, when, like Leda, I was enveloped by a feathered glory of beating white wings.  Traumatized, I no longer stood on shore but sat on the roof of the car.  To make me feel less sissy, Nina sat on the hood and pretended that she, too, was afraid.  It wasn’t their size exactly.  Nor the way they tussled with graceful wrath.  Maybe it was the mask about their swan eyes.  I’m sure it was that: the concealment, secret identity, veiled feelings.

Just before I got married, my fiancée, Giuliana, flew in from São Paulo to meet my family.  After supper, Nina insisted that I drive her through the cemetery.  I hadn’t been in years but instantly recognized the wrought-iron gates that once seemed so colossal.  There was the river.  The ducks.  The swans.  In the distance, a family, their heads bowed, stood under a high green tent.

Giuliana was not disturbed by this detour.  Quite the contrary:  she felt in some way moved.  It was as if Nina had invited her into a private, intimate space: one that contradicted this modern world of medical science in which everyone tries to postpone or avert death.  In a cemetery one couldn’t help but think of decomposition, permanence, the soul.  One couldn’t help but track the beat of one’s heart, measure the inhales and exhales of one’s breathing.  One couldn’t help, that is, but cherish the fact that one’s alive.

My cell phone buzzed.  An unknown number flashed across the screen.  I answered, “Hello?”

“Mr. Mendenhall?”

“Yes.”

“Are you in the car?”

“No.”

“This is the cancer center at St. Joseph’s Hospital.  We need you to come in.”

I was twenty-four, and about to hear, “You have cancer.”

Nothing—not even a Southern upbringing—can prepare you for those three words.

The odd thing about preachers is that, depending on time and place, their company is either most welcome or most unwelcome.  When I got the call, the cancer call, my uncle, a preacher, was beside me, and for that I was glad.  He made me feel the power of presence, to say nothing of companionship:  I was not alone.

My uncle—Uncle Steve—preaches in the only Southern Baptist church in Chicago.  Unlike most Southern Baptist preachers down South, he eschews the noisy and spectacular, preferring, instead, politesse and restraint.  Bookish and professorial, his voice nasal, his nose suitably sloped to hold up his saucer-sized spectacles, he loves theology and will tell you as much at the drop of a hat.  What with his general softness, he might, with a bit more age, have been mistaken for Truman Capote, with whom, incidentally, his father—my grandfather—had grown up in Monroeville, Alabama.

A man of custom, a student of Latin and Greek, fluent in Russian and French, a former lawyer and journalist, Uncle Steve is uncommonly qualified to carry on the sanctifying traditions of Western Civilization.  He is, in short, a gentleman and a scholar.  And he was in Atlanta that day, standing in the Varsity parking lot, his belly stuffed full of chili dogs, his ketchup-smudged face like an advertisement for this, the world’s largest drive-in restaurant.

I could feel his gaze moving over me and spared him the discomfort of asking what was the matter.

“I have cancer,” I said.

As the words issued from my mouth, my chest felt as though someone were driving a stake into it.  Cancer.  That thing other people got.  Old people.  Not young and healthy people.  Not me.

I tried to act normal, but in doing so betrayed what I really felt—terror.

Uncle Steve put his arm around me.  “Come on.  Let’s get to the hospital.”

Every hour on the hour, the employees of St. Joseph’s Hospital pray together.  These moments, though heavily orchestrated, bring peace to the ill and dying, the sick and suffering.  The nurses and doctors who wander the hallways pause while a disembodied, female voice recites the Lord’s Prayer, first in English, then in Spanish.  “Our Father, who art in heaven…”—the words echo off the cold, linoleum tiles—“hallowed be thy name.”

This was happening when I walked into the waiting room.  A nurse, a heavyset black woman with the softest eyes I’d ever seen, was behind the counter, her necklace, weighed down by a tiny crucified Jesus, dangling at her pillow-like breasts.  She whispered, again and again, amen, amen, and then, looking up, took me in with those deep knowing eyes, spoke without speaking.  Sunlight streamed through the cool, trapezoid panes of glass in the ceiling, falling across her face and hair at a low angle.

At last the prayer ended.  She unfolded her hands and smiled formally.  “Good afternoon, how may I help you?”

Responding with “I have cancer” didn’t feel right, so I said, “I’m here to see Dr. Danaker.”

That was all she needed to know.

“Bless your heart, child,” she said.  And, for the first time, I got emotional.  She hugged me, calling me child again; then, right then, I wanted to be a child, wanted her to scoop me into her arms and cradle me, wanted her thick, strong body wrapped around me; but there, too, was Uncle Steve, dignified and collected.  I couldn’t break down in front of him.

The nurse ushered me into a white, windowless room with expansive tile walls and sat me on a tissue-papered chair, which swished and crackled whenever I readjusted my derrière.

There I was.  Conscious.  Being, yet trying to fathom not being.  I imagined myself in a coffin, like that horrid shell, Great-Granddaddy.  Which only made things worse, for I knew that, once in the coffin, I would have no notion of being there.  The problem was thinking itself.  I couldn’t imagine being dead because I couldn’t imagine not imagining.

On Sunday mornings, before church, Dad had always made my siblings and me read from the obituaries.  This, he said, would acquaint us with the fragility of life.  He also thought the best way to learn was from experience.  But he’d known only one person who’d experienced death and, almost impossibly, lived to tell about it—Martin, a friend of the family, who’d apparently died three times and, on the operating table, been revived.  Martin loved cigarettes, which he called the backbone of the Southern economy and which, he readily admitted, had brought about his three near-fatalities.

Except Martin didn’t put it in those terms.  To him, cigarettes had allowed him to float outside his body for a while, to see what death was like.  For better or worse, Martin didn’t see a tunnel of light, greet a golden angel, or feel a fluffy cloud:  he simply “left” himself and, in a state of utter weightlessness, peered down on his body as would an outside observer.  Maybe that’s why Dad didn’t like us talking to Martin about death: Dad wanted us to hear about St. Peter and heaven and departed relatives.

The trouble with Martin was that one never knew when to believe him.  Heck, we barely knew who he was.  Ephemerally at least, he’d been my aunt’s boyfriend; when she dumped him, he never went away: he moved in with my other aunt, a single mother, and helped care for my young cousin.  Martin was present every Thanksgiving and Christmas, but neither got nor gave gifts.  A transplant from North Carolina, he had daughters somewhere—either the Carolinas or Virginia—and had graduated from the University of North Carolina at Chapel Hill, an achievement he was quite proud of.  He didn’t work.  Didn’t own a car.  And didn’t seem to have money.  His singular ability to access death could’ve been, for all we knew, lifted from a sci-fi novel.  Nevertheless, I believed him.

Ten.  That’s how old I was when I saw a dead body I wasn’t supposed to see.  A right turn on I-85, heading north, highway stretching to where sky and land sandwiched together.  I was in my school outfit, backpack in my lap.  Mom was in her tennis getup, checking the rearview mirror.  Traffic was slowing and stopping.  To my left was a vast gray sheet held up by blank-faced men.  Behind it, a woman.  Or what was left of a woman.  Arms and legs bent at impossible angles; head sagging, possibly unattached; a bloodied skirt lifted by the breeze.  Someone’s mom.  Or sister.  Or wife.  Or girlfriend.  Or daughter.  Here one minute, gone the next.  This wasn’t dignity.  This was mean and messy.

Death, they say, is not only universal but also the great leveler: it befalls kings and paupers, rich and poor, wise and foolish.  Solomon, Caesar, Constantine, Charlemagne, Napoleon: all died despite their glory in life.  What I never understood, and, frankly, still don’t, is why folks pretend death doesn’t happen.  The person who ignores death is delusional at best, narcissistic at worst.  Death is our sole commonality, the thing in this world we all await, about which we may commiserate.  It’s what makes us human.  I daresay one can’t fully love a person without knowing that person is temporary.

Francis Bacon once declared, “The contemplation of death, as the wages of sin, and passage to another world, is holy and religious; but the fear of it, as a tribute due unto nature, is weak.”  Weak it may be to the healthy and fit, but to the ill and ailing it seems only natural.  The person who claims he doesn’t fear death is either a liar or an incorrigible maniac—or else a coward, too faint-of-heart to face the facts.  Bacon himself had the good fortune of dying in two to three days, having contracted pneumonia while conducting an experiment in the snow.  Willfully blind to his fate, lying on his deathbed, he penned a letter to his friend, Thomas Howard, expressing relief that he hadn’t suffered the fortune of Caius Plinius, “who lost his life by trying an experiment about the burning of Mount Vesuvius.”

After surgery, I, like Bacon, was bedridden.  Soon a phone call would tell me one of two things: that I was okay, my cancer hadn’t metastasized, or else I wasn’t okay, I needed chemotherapy and my chances of living another two years were below fifteen percent.  A glued-together wound, resembling fat, blue, puckered-up lips, took up the length of my chest.  Visitors asked to see it and then regretted their request when I rolled up my shirt, revealing a moon-shaped, smurfy smile.  When the visitors left, and I was alone again, alone and quiet, I imagined what the malignancy would look like as it spread through my body, which I conceived of as a miniature minefield: tunneled with small explosive cancer clusters about to be detonated.  How could this shell—which once ran a mile in under four-and-a-half minutes—expire?

I’m not in my brain but somewhere lower: near the chest, maybe, or the gut.  I couldn’t, for instance, stop a dream even if I wanted to.  Which is odd, because it’s my brain that’s dreaming—not someone else’s.  The brain works independently of me, or, to be precise, of what I perceive to be me: it’s like an unmanned motor boat zipping on the water.  Occasionally one of my siblings, or an old friend, will recall some long-ago event, which I’d otherwise forgotten, and then, suddenly, I’ll remember.  The brain has stored this memory somewhere—somewhere not readily accessible—but I, wherever I am in this shell, never felt compelled to find it.  The thought just exists up there, waiting.

It’s the soul, I suppose, that’s me.  When I lie awake at night and contemplate this interim body, which I inhabit the way a renter inhabits an apartment, I locate my self—that subjective knowing ego—whole and center, as though the brain, convenient as it is, has a mind of its own.  To be sure, I can borrow this organ when I study or otherwise require deep reflection; but when I tire of thinking, when I want a break, when I lean back from my desk, I’m very aware that I, my self, am moving from the head to just above the torso, where I belong.  And when I experience joy, compassion, anguish, despair—when, that is, I feel—it’s never with my head but with something deep within my bosom.  How does one explain this?  Perhaps we’re all antecedent to the body: little floating things confined to this definite, corporeal form we didn’t choose, waiting, like thoughts, to be accessed—or released.

Opossums, more commonly known, in the South, as “possums,” are, I’m told, a delicacy.  Nina’s got a cookbook that says so, though she claims she’s never cooked or eaten one.  I have my doubts, since my dad grew up eating squirrel, which, I think, is more revolting because squirrels are cute and handsome, whereas possums have that eerie look I associate with demons and devils—and masks.

At seven, I persuaded my brother to take a life.  A possum’s life.  It was a horrible affair, really.  One that, even today, is difficult to own up to.  Brett, being the gullible little brother he was—I convinced him once that the shadow-puppet giant who lived on the ceiling would kill him in his sleep—stomped on a squeaking pile of pine-straw while I looked on, presumably to punish him if he disobeyed.  Of course, the squeaking didn’t belong to the pine-straw, but to a tiny nest of baby possums underneath.

For some reason, I was initially proud of what I’d done, and, hours later, said as much to my mom.  Horrified, she made me show her the nest, since I’d “cried wolf” before.  Sure enough, there, in the pine-straw, lay a bloody baby possum, whimpering, dying.

My first defense was I hadn’t done anything.  Brett had.  I’d simply stood by and watched.  Mom was smarter than that.  I don’t remember what she said—only that, once she said it, I began to cry.  And couldn’t stop.

It was this event, this murder of an innocent, that brought about my general appreciation for original sin, or at least for the idea of innate human depravity.  Humans, you might say, are born rotten—so much so that most of us, in our youth, could stomp infant possums to death without understanding the wrongness of our action.  No doubt I regretted this behavior—this actus reus—but not because I felt guilty: it was, in effect, because I feared punishment—some combination of mom’s wrath and her spank-happiness.  A parent’s role is, among other things, to tame a child’s destructive impulses.  That’s what mom did—without succumbing to her own elemental aggressions.

She called the Chattahoochee Nature Center, a local environmental organization, and a worker there explained how to save the baby possum.  This, then, became my task, my agonizing punishment: to keep the possum alive.  Being intimate with death is one thing; being intimate with suffering quite another.  When I scooped the trembling creature up to my palm, it emitted a sad, pitiable squeak.  “Everything’s okay,” I whispered, “I’m not here to hurt you”—a funny assurance coming from the kid who’d just ordered its murder.

If truth be told, I wished I’d just destroyed the thing.  Better dead than in this wretched condition.  Still, the way it looked at me—its beady, searching eyes perusing my face—reminded me of how Ansley, my little sister, then only a year old, looked up at mom when she wanted to be fed.

I placed the creature in a shoe box, which I tucked beneath a shelf in my parents’ closet, the darkest place in the house.  More than anything, the possum needed darkness and silence.  I dug a hole in the backyard, tied two twigs together in the shape of a cross, and arranged a constellation of stones around what would’ve been a grave.  But the thing didn’t die.  It healed so well that, the next morning, it was squirming and scurrying and dad needed a net to contain it.  Even after the possum was free in the backyard, I left the grave untouched, a reminder that all things, even possums, eventually come to an end.

My Southern upbringing was all about learning how to die.  Like the Greek Stoics, Southerners believe in cultivating virtue, improving life, and, above all, accepting mortality.  Liberated from urban distractions, tied to land and home, they regard humans as custodians of the past; they keep gardens, preserve antiques, record lineage, mark battlefields, and salvage the efforts of planters, carpenters, raconteurs, and architects; they ensure, in short, the availability of history.  This can lead to nostalgia for times they never knew, bad times, ugly times, which is to say that this can cause Southerners to overlook—or, worse yet, revise—the inconveniences of history: slavery, for instance, or civil rights.  All the same, the Southern tradition, burdened as it is by various conflicts, retains virtues worth sustaining: community, family, religion, husbandry, stewardship.  These customs, however vulnerable, hardly need guardians.  They will, I suspect, persist, in some form or another, as long as humanity itself; for they are practical, permanent ideals—tested by generations—which people fall back on during disorienting times.  In a region haunted by racial brutality, these principles are, and have been, a unifying reference point, a contact zone where cultures—black, white, and Hispanic—share something spiritual despite their differences.

Living history, not just studying it, but consciously living it, is neither wicked nor wrong; the chronic, urgent awareness that everything you know and love will come undone, is not, I think, misguided, but utterly essential.  There’s something beautiful about facing the insurmountable.  When the world’s fleeting, death becomes a liberating, albeit terrifying, reality.  It throbs and pulsates and beats beneath the skin, inside of which we’re all raw skeleton.

For all this, however, I wasn’t ready.  Didn’t want to die.  Couldn’t even conceive of it.  The twenty-something years my family had been teaching me about death amounted to, not nothing, but not much, either.  Death, I suppose, is a hard thing to accept, and an even harder thing to fight, since fighting seems so pointless: deep down, you know you can’t win.  You might prevail once.  Maybe even twice.  But ultimately it’ll beat you.  It almost did me.

Friends ask how it feels to “beat” cancer.  I never can answer—not satisfactorily—for the experience is more like submission than competition: it’s a manifold process of coming to terms with the body, a thing doomed to decay.  When the doctor—Dr. Danaker—called to say the lymph nodes were benign, that the cancer hadn’t spread, I shocked him with a tired reply:  “Oh, good.”

“This is great news,” he assured me, as if I needed reminding, as if I hadn’t appreciated—indeed, hadn’t understood—how lucky I was.

“I know,” I said.

At this, the good doctor seemed annoyed.  “Ungrateful kid,” his tone implied.  But I wasn’t ungrateful.  Nor ecstatic.  I was, simply put, unbound—by life, by people, by things.  His take was that I had another chance, a fresh start, that I could put this nonsense behind me and move on.  My take was that, having embraced impermanence, I was done protecting myself from suffering, done seeking security through delusion, done dislocating from fate, destiny, providence, what have you.

Done: this, it is true, is weary resignation.  Yet it’s more than that: it’s a sweet but unhappy release, a deliverance, an unmasking.  Almost paradoxically, it’s freedom within—and despite—limitation.

What’s more exhilarating than that one should die?  What’s more mysterious, more horribly electrifying?  As one writer, Paul Theroux, has put it, “Death is an endless night so awful to contemplate that it can make us love life and value it with such passion that it may be the ultimate cause of all joy and all art.”  That is how you cope with this chilling, daunting, stupefying phenomenon: you do it every day until it’s serviceable and aesthetic, until at last you won’t know, can’t know, when it happens, until it’s pleasurable, a masterpiece, sublime in its regularity.  You keep it close, so close it becomes part of you, so close it’s at your disposal, so close that without it, you’re nothing, nothing if not boringly, thoughtlessly, mechanically alive, which is just another way of being dead.  You train and train and then it comes.

Law and Locality

In Arts & Letters, Humanities, Jurisprudence, Justice, Law, Libertarianism, Philosophy, Politics on August 7, 2013 at 8:45 am

Allen Mendenhall

On one common definition, law is a practice or set of rules based in custom and habit.  Law is not diktat.  It arises spontaneously through the interaction of human agents operating within and among social groups and precedes State promulgation.

Legislative enactment can reflect the law as it is constituted in the mores and traditions of groups, but it can also be the result of governmental usurpation.  The legislator does not embody the people he represents, and as society grows ever more complex and populations ever denser, as technologies link us more and more to one another in cyberspace and other virtual fora with disembodied communicants, the notion that the legislator speaks on behalf of his constituents becomes increasingly dubious if not downright absurd.

Local groups such as schools, clubs, community organizations, and churches have complex rules of exchange derived from shared mores and traditions.  They are more likely than distant legislators to speak accurately about the wants and needs of their community.  Their rules are not necessarily articulated, but tacitly understood.

These local groups recognize regulations not as monolithic, governmental impositions but as integrated schemes of social principles.  Group-members who fail or refuse to follow rules and regulations are punished.  On this local level, punishment can be simple: ostracism or public disapproval. A businessperson who violates another businessperson’s trust will lose business, just as he will lose clients by losing consumers’ trust; a church-member living in sin will likewise suffer from the judgment of his peers or, more appropriately, from the canon law pertaining to his sin.

In these examples, it is clear that the State should not intervene in punishing the wrongdoer; local custom and habit suffice to regulate conduct without resort to State violence or compulsion; therefore, private associations suffice to generate rules and their corresponding punishments.  Distant government bodies are not likely to conform to the intricate constitutions of local peoples and therefore are likely to exercise their disciplinary powers using punitive, exploitative, or arbitrary means.

William Lane Craig: Four Debates

In Arts & Letters, Christianity, Epistemology, Ethics, God, Humanities, Philosophy, Religion, Teaching on July 31, 2013 at 8:45 am

William Lane Craig

William Lane Craig, a philosopher and Christian apologist, is a member of Johnson Ferry Baptist Church, which my wife and I visited regularly when we lived in Atlanta and where my parents, siblings, grandmother, uncle, aunt, and cousins remain members.  Earlier this month, The Chronicle of Higher Education ran a profile piece on Dr. Craig.  Below are four high-profile debates in which Dr. Craig participated.  Enjoy.

1.  Dr. Craig debates Christopher Hitchens on the Existence of God.  The video has not been made available for embedding on external websites, so the best I can offer is a link.

2.  Dr. Craig debates Stephen Law on the Existence of God.

 

3.  Dr. Craig debates Peter Atkins on the existence of God.

 

4.  Dr. Craig debates Alex Rosenberg on the reasonableness of faith in God.

Pantry, 1982

In Arts & Letters, Creative Writing, Humanities, Poetry, Writing on July 24, 2013 at 8:45 am

Allen Mendenhall

 

A box of cereal, stale, ants running

Up the side, two brown bananas that

 

He says cleanse the pores

(If rubbed thoroughly),

 

An unwrapped chocolate bar

And a plethora of cans, unopened:

 

In a locked pantry, Little Maddy sits

Plucking the stems

 

Off Granny-Smiths.  Just ten more

Minutes.  Maddy, weary, wondering

 

Just when daddy would come home.

Time: the pantry is unlocked

 

And out comes light

And apples and, lastly, Maddy.

 

Daddy reaches

For the two rotting bananas,

 

Notes can upon unopened can,

Unwraps the chocolate bar,

 

Smears chocolate on his fingers,

Stops, thinks how unlikely it is

 

For apples to lose their stems.

Donna Meredith Reviews “Keep No Secrets,” by Julie Compton

In Arts & Letters, Books, Fiction, Humanities, Law, Law-and-Literature, Novels, Writing on July 17, 2013 at 8:45 am

Donna Meredith is a freelance writer living in Tallahassee, Florida. She taught English, journalism, and TV production in public high schools in West Virginia and Georgia for 29 years. Donna earned a BA in Education with a double major in English and Journalism from Fairmont State College, an MS in Journalism from West Virginia University, and an EdS in English from Nova Southeastern University. She has also participated in fiction writing workshops at Florida State University and served as a newsletter editor for the Florida State Attorney General’s Office. The Glass Madonna was her first novel. It won first place for unpublished women’s fiction in the Royal Palm Literary Awards, sponsored by the Florida Writers Association, and runner up in the Gulf Coast novel writing contest. Her second novel, The Color of Lies, won the gold medal for adult fiction in 2012 from the Florida Publishers Association and also first place in unpublished women’s fiction from the Florida Writers Association. Her latest book is nonfiction, Magic in the Mountains, the amazing story of how a determined and talented woman revived the ancient art of cameo glass in the twentieth century in West Virginia.  She is currently working on a series of environmental thrillers featuring a female hydrogeologist as the lead character.


Above: Julie Compton

The following review is appearing simultaneously in Southern Literary Review.

Keep No Secrets, Julie Compton’s powerful sequel to Tell No Lies, is guaranteed to keep readers turning pages into the wee hours of the morning. Both of Compton’s courtroom thrillers are set in St. Louis, Missouri, where she grew up.

Like Jodi Picoult’s best works, Compton’s novels sizzle with all the trust, betrayal, love, and forgiveness family relationships entail—especially when you expose their private conflicts in a public courtroom. Her books seem to pose this question: how well can you know even those people closest to you?

Read Tell No Lies first. Though the sequel provides enough backstory to be a great read on its own, without understanding the first book you’d miss the riveting psychological development of the primary characters, all of whom star in the sequel as well.

In Tell No Lies, idealistic lawyer Jack Hilliard leaves behind a lucrative private practice to run for district attorney. The plot centers on a high-profile murder case. Jack is easy to like because he tries so hard to do the right thing. But there wouldn’t be a story if he were perfect. He yields to one temptation, which sends his life into a downward spiral that nearly ends his marriage and his career.

The final plot twist leaves you wondering if Jack has been manipulated. Compton is that rare author who trusts her readers’ intelligence. She allows us to figure things out for ourselves, to experience the same doubts as Jack Hilliard. It makes the novel more like our own lives, where we can’t always tell what people’s motives are or know when they are lying.

Keep No Secrets begins four and a half years after the events of Tell No Lies. During that time, Jack Hilliard has worked arduously to repair the damage caused by his mistakes—and has largely succeeded. Until the night he finds his teenage son Michael having sex with his girlfriend. They are drunk. Being a white knight kind of guy, Jack gives the girl a ride home. In an effort to win back his son’s love and respect, Jack doesn’t tell his wife about Michael’s transgressions. That car ride sets off an unforeseeable chain of events that threaten to wreck Jack’s career and marriage once again.

Think that’s enough dirt to dump on a nice guy like Jack? Not a chance. The already untenable situation deteriorates further when Jenny Dodson, the woman involved in his earlier downfall, reappears after all these years, asking for his help. He can’t say no, but he vows to keep his wife truthfully informed of everything that happens. He does. Sort of. “The lies aren’t what he says; they’re what he doesn’t say”—this is a refrain Compton artfully employs several times.

This novel deals with social issues like the impact of adultery and sexual assault on families. Most readers will put themselves in the various characters’ situations and ask whether they would have behaved differently. Would we lie to protect a loved one? What if we knew something that would put the one we love in jail or in danger? Would we tell the truth? What if not telling keeps an innocent person imprisoned? How far should we trust the legal system? If a spouse gave us reason to doubt, could we forgive and trust again? When is it time to give a marriage another chance—and when is it time to walk away?

Compton’s novels are as fine as any courtroom thrillers out there. Though her use of present tense can be a bit distracting, the well-plotted series sparkles with psychologically complex characters.

For both undergraduate work and law school, Compton attended Washington University in Missouri. She began her legal career there, but last practiced in Wilmington, Delaware, as a trial attorney for the U.S. Department of Justice. She now lives near Orlando with her husband and two daughters and writes full-time. She is also the author of Rescuing Olivia, a novel of suspense, romance, and family drama.

Below: Donna Meredith


Abolish the Bar Exam

In America, American History, Arts & Letters, History, Humanities, Law, Legal Education & Pedagogy, Nineteenth-Century America on July 10, 2013 at 8:45 am

Allen Mendenhall

This article originally appeared here at LewRockwell.com.

Every year in July, thousands of anxious men and women, in different states across America, take a bar exam in hopes that they will become licensed attorneys. Having memorized hundreds if not thousands of rules and counter-rules — also known as black letter law — these men and women come to the exam equipped with their pens, laptops, and government-issued forms of identification. Nothing is more remote from their minds than that the ideological currents that brought about this horrifying ritual were fundamentally statist and unquestionably bad for the American economy.

The bar exam is a barrier to entry, as are all forms of professional licensure. Today the federal government regulates thousands of occupations and excludes millions of capable workers from the workforce by means of expensive tests and certifications; likewise various state governments restrict upward mobility and economic progress by mandating that workers obtain costly degrees and undergo routinized assessments that have little to do with the practical, everyday dealings of the professional world.

As a practicing attorney, I can say with confidence that many paralegals I know can do the job of an attorney better than some attorneys, and that is because the practice of law is perfected not by abstract education but by lived experience.

So why does our society require bar exams that bear little relation to a person’s ability to understand legal technicalities, manage caseloads, and satisfy clients? The answer harks back to the Progressive Era, when elites used government strings and influence to prevent hardworking and entrepreneurial individuals from climbing the social ladder.

Lawyers were part of two important groups that Murray Rothbard blamed for spreading statism during the Progressive Era: the first was “a growing legion of educated (and often overeducated) intellectuals, technocrats, and the ‘helping professions’ who sought power, prestige, subsidies, contracts, cushy jobs from the welfare state, and restrictions of entry into their field via forms of licensing,” and the second was “groups of businessmen who, after failing to achieve monopoly power on the free market, turned to government — local, state, and federal — to gain it for them.”

The bar exam was merely one aspect of the growth of the legal system and its concomitant centralization in the early twentieth century. Bar associations began cropping up in the 1870s, but they were, at first, more like professional societies than state-sponsored machines. By 1900, all of that had changed, and bar associations had become a fraternity of elites opposed to any economic development that might threaten their social status.

The elites who formed the American Bar Association (ABA), concerned that smart and savvy yet poor and entrepreneurial men might gain control of the legal system, sought to establish a monopoly on the field by forbidding advertising, regulating the “unauthorized” practice of law, restricting legal fees to a designated minimum or maximum, and scaling back contingency fees. The elitist progressives pushing these reforms also forbade qualified women from joining their ranks.

The American Bar Association was far from the only body of elites generating this trend. State bars began to rise and spread, but only small percentages of lawyers in any given state were members. The elites strained to squeeze some justification out of their blatant discrimination and to strike a delicate balance between exclusivity on the one hand and an appearance of propriety on the other. They gave short shrift to the American Dream and began to require expensive degrees and education as a prerequisite for bar admission. It was at this time that American law schools proliferated and the Association of American Law Schools (AALS) was created to evaluate the quality of new law schools and to hold them to uniform standards.

At one time lawyers had learned on the job; now law schools were tasked with training new lawyers, but the result was that lawyers’ real training was merely delayed until the date they could practice, and aspiring attorneys had to be wealthy enough to afford this delay if they wanted to practice at all.

Entrepreneurial forces attempted to fight back by establishing night schools to ensure a more competitive market, but the various bar associations, backed by the power of the government, simply dictated that law school was not enough: one had to first earn a college degree before entering law school if one were to be admitted to practice. Then two degrees were not enough: one had to pass a restructured, formalized bar exam as well.

Bar exams have existed in America since the eighteenth century, but before the twentieth century they were relaxed and informal and could be as simple as an interview with a judge. At the zenith of the Progressive Era, however, they had become an exclusive licensing mechanism for the government. It is not surprising that at this time bar associations became, in some respects, as powerful as the states themselves. That’s because bar associations were seen, as they are still seen today, as agents and instrumentalities of the state, even though their members were not, and are not, elected by the so-called public.

In our present era, hardly anyone thinks twice about the magnificent powers exercised and enjoyed by state bar associations, which are unquestionably the most unquestioned monopolies in American history. What profession other than law can claim to be entirely self-regulated? What profession other than law can go to such lengths to exclude new membership and to regulate the industry standards of other professions?

Bar associations remain, on the whole, as progressive today as they were at their inception. Their calls for pro bono work and their bias against creditors’ attorneys, to name just two examples, are wittingly or unwittingly part of a greater movement to consolidate state power and to spread ideologies that increase dependence upon the state and “the public welfare.” It is rare indeed to find the rhetoric of personal responsibility or accountability in a bar journal. Instead, lawyers are reminded of their privileged and dignified station in life, and of their unique position in relation to “members of the public.”

The thousands of men and women who will sit for the bar exam this month are no doubt wishing they didn’t have to take the test. I wish they didn’t have to either; there should be no bar exam because such a test presupposes the validity of an authoritative entity to administer it. There is nothing magical about the practice of law; all who are capable of doing it ought to have a chance to do it. That will never happen, of course, if bar associations continue to maintain total control of the legal profession. Perhaps it’s not just the exam that should go.

The Politics of Paternalism

In America, American History, Conservatism, Humanities, Jurisprudence, Law, News and Current Events, Politics, Southern History on July 3, 2013 at 8:45 am

Allen Mendenhall

This first appeared here at The American Spectator.

One of the Supreme Court opinions everyone is buzzing about — last Monday’s decision in Fisher v. University of Texas at Austin, a case involving that school’s affirmative action program — will not be monumental in our canons of jurisprudence.

The petitioner, Abigail Noel Fisher, a young white woman, applied to the university in 2008 and was denied admission. She challenged the decision, arguing that she would have been admitted under a colorblind system. The high court has now remanded the case to the Fifth Circuit, holding that the lower court failed to properly ascertain whether the affirmative action program was the most narrowly tailored means to achieve the university’s diversity goal. In legal terms, the Fifth Circuit had failed to subject the program to “strict scrutiny.” Thus, additional litigation lies ahead; the case is not even over.

What will be remembered from Monday’s proceedings, though, is Justice Thomas’ concurrence, which treats affirmative action as paternalism — a word he implies but doesn’t use explicitly, at least not here.

The dichotomies “liberal” versus “conservative,” “left” versus “right,” complicate rather than clarify issues such as affirmative action. A better choice of words, if a dichotomy must be maintained, is “paternalism” versus “non-paternalism.” Viewing diversity in this light, as Justice Thomas does, enables us to understand and appreciate the forms that racism and discrimination take.

Those forms often are paternalistic: Person A assumes to understand the plight of person X and undertakes to care for and control him as a father would his children. Even if X were one day to achieve relative equality with A in real terms — opportunity, education, earning capacity — this dominance would persist so long as A views X as a needy inferior, and so long as X allows that presumption to persist.

Thomas’s concurrence places such toxic ideas under a microscope, and exposes the ironic double standards of those who resort to paternalism. For instance, the bulk of his concurrence describes how the university’s arguments in favor of affirmative action are the same or substantially similar to those once used to justify racial segregation and even slavery. “There is no principled distinction,” Thomas writes, “between the University’s assertion that diversity yields educational benefits and the segregationists’ assertion that segregation yielded those same benefits.”

Likewise, he adds, “Slaveholders argued that slavery was a ‘positive good’ that civilized Blacks and elevated them in every dimension of life.” Advocates of slavery and segregationists both argued, in other words, that their policies bettered the conditions of Blacks and minimized racial hostility on the whole. The form of these racist arguments is now being used to justify state discrimination through affirmative action programs.

The segregationists argued that integrated public schools would suffer from white flight; proponents of affirmative action argue that universities will suffer from a lack of diversity if discrimination is not allowed.

The segregationists argued that blacks would become the victims of desegregation once white children withdrew from public schools en masse and that separate but equal schools improved interracial relations; proponents of affirmative action likewise argue that minorities will be the victims if affirmative action programs are deemed unconstitutional and that diversity on campus improves interracial relations.

The segregationists argued that separate but equal schools allowed blacks to enjoy more leadership opportunities; proponents of affirmative action likewise argue that affirmative action programs empower minorities to become leaders in a diverse society.

The segregationists argued that although separate but equal schools were not a perfect remedy for racial animosity, such schools were nevertheless a practical step in the right direction; proponents of affirmative action likewise argue that it, although not ideal, nevertheless generates race consciousness among students.

In the face of these surprising parallels, Justice Thomas maintains that “just as the alleged educational benefits of segregation were insufficient to justify racial discrimination” during the Civil Rights Era, so “the alleged educational benefits of diversity cannot justify racial discrimination today.”

He should not be misunderstood as equating affirmative action with the discrimination unleashed upon blacks and other minorities throughout American history. Although he acknowledges that affirmative action does harm whites and Asians, he is chiefly concerned with how such discrimination harms its intended beneficiaries: above all, blacks and Hispanics. “Although cloaked in good intentions,” Thomas submits, “the University’s racial tinkering harms the very people it claims to be helping.” He adds that “the University would have us believe that its discrimination is…benign. I think the lesson of history is clear enough: Racial discrimination is never benign.”

Why aren’t affirmative action programs — which Justice Thomas at one point refers to as “racial engineering” — benign? He gives several reasons: They admit blacks and Hispanics who aren’t as prepared for college as white and Asian students; they do not ensure that blacks and Hispanics close the learning gap during their time in college; they do not increase the overall number of blacks and Hispanics who attend college; and they encourage unqualified applicants to graduate from great schools as mediocre students instead of good schools as exceptional students. Moreover, Justice Thomas cites studies showing that minorities interested in science and engineering are more likely to choose different paths when they are forced to compete with other students in those disciplines at elite universities. What Justice Thomas considers most damning of all, however, is the “badge of inferiority” stamped on racial minorities as a result of affirmative action.

Just one small personal example: When I was in law school, a few of the guys in my study group began comparing professors, as students do regularly, and they were quite open in their opinion that our black professor could not have been as intelligent, because she had benefited from affirmative action programs.

The 13 Virtues of Benjamin Franklin

In America, American History, Arts & Letters, Books, Ethics, History, Humanities, Literature, Western Civilization on June 26, 2013 at 8:49 am

Benjamin Franklin

In his autobiography, Benjamin Franklin listed 13 virtues by which he sought to live.  Here they are:

1.  TEMPERANCE. Eat not to dullness; drink not to elevation.

2.  SILENCE. Speak not but what may benefit others or yourself; avoid trifling conversation.

3.  ORDER. Let all your things have their places; let each part of your business have its time.

4.  RESOLUTION. Resolve to perform what you ought; perform without fail what you resolve.

5.  FRUGALITY. Make no expense but to do good to others or yourself; i.e., waste nothing.

6.  INDUSTRY. Lose no time; be always employ’d in something useful; cut off all unnecessary actions.

7.  SINCERITY. Use no hurtful deceit; think innocently and justly, and, if you speak, speak accordingly.

8.  JUSTICE. Wrong none by doing injuries, or omitting the benefits that are your duty.

9.  MODERATION. Avoid extreams; forbear resenting injuries so much as you think they deserve.

10.  CLEANLINESS. Tolerate no uncleanliness in body, cloaths, or habitation.

11.  TRANQUILLITY. Be not disturbed at trifles, or at accidents common or unavoidable.

12.  CHASTITY. Rarely use venery but for health or offspring, never to dulness, weakness, or the injury of your own or another’s peace or reputation.

13.  HUMILITY. Imitate Jesus and Socrates.

Franklin was a great man, even if he fell far short of his own high standards.  Lists like these can, I think, help one to improve oneself.  See my reading list for this year.

Pragmatists Versus Agrarians?

In America, American History, Arts & Letters, Book Reviews, Books, Conservatism, Emerson, History, Humanities, Liberalism, Literary Theory & Criticism, Literature, Nineteenth-Century America, Philosophy, Politics, Pragmatism, Southern History, Southern Literature, Western Civilization, Western Philosophy, Writing on June 19, 2013 at 8:45 am

Allen Mendenhall

This review originally appeared here at The University Bookman.

John J. Langdale’s Superfluous Southerners paints a magnificent portrait of Southern conservatism and the Southern Agrarians, and it will become recognized as an outstanding contribution to the field of Southern Studies. It charts an accurate and compelling narrative regarding Southern, Agrarian conservatism during the twentieth century, but it erroneously conflates Northern liberalism with pragmatism, muddying an otherwise immaculate study.

Langdale sets up a false dichotomy as his foundational premise: progressive, Northern pragmatists versus traditionalist, Southern conservatives. From this premise, he draws several conclusions: that Southern conservatism offers a revealing context for examining the gradual demise of traditional humanism in America; that Northern pragmatism, which ushered in modernity in America, was an impediment to traditional humanism; that “pragmatic liberalism” (his term) was Gnostic insofar as it viewed humanity as perfectible; that the man of letters archetype finds support in Southern conservatism; that Southern conservatives eschewed ideology while Northern liberals used it to present society as constantly ameliorating; that Southern conservatives celebrated “superfluity” in order to preserve canons and traditions; that allegedly superfluous ways of living were, in the minds of Southern conservatives, essential to cultural stability; that Agrarianism arose as a response to the New Humanism; and that superfluous Southerners, so deemed, refined and revised established values for new generations.

In short, his argument is that Southern conservatives believed their errand was to defend and reanimate a disintegrating past. This belief is expressed in discussion of the work of six prominent Southern men of letters spanning two generations: John Crowe Ransom, Donald Davidson, Allen Tate, Cleanth Brooks, Richard Weaver, and M. E. Bradford.

Langdale ably demonstrates how the Southern Agrarians mounted an effective and tireless rhetorical battle against organized counterforces, worried that scientific and industrial progress would replace traditional faith in the unknown and mysterious, and fused poetry and politics to summon forth an ethos of Romanticism and chivalry. He sketches the lines of thought connecting the earliest Agrarians to such later Southerners as Weaver and Bradford. He is so meticulous in his treatment of Southern conservatives that it is surprising the degree to which he neglects the constructive and decent aspects of pragmatism.

Careful to show that “Agrarianism, far from a monolithic movement, had always been as varied as the men who devised it,” he does not exercise the same fastidiousness and impartiality towards the pragmatists, who are branded with derogatory labels throughout the book even though their ideas are never explained in detail. The result is a series of avoidable errors.

First, what Langdale treats as a monolithic antithesis to Southern conservatism is actually a multifaceted philosophy marked by only occasional agreement among its practitioners. C. S. Peirce was the founder of pragmatism, followed by William James, yet Peirce considered James’s pragmatism so distinct from his own that he renamed his philosophy “pragmaticism.” John Dewey reworked James’s pragmatism until his own version retained few similarities with James’s or Peirce’s. Oliver Wendell Holmes Jr. never identified himself as a pragmatist, and his jurisprudence is readily distinguishable from the philosophy of Peirce, James, and Dewey. Each of these men had nuanced interpretations of pragmatism that are difficult to harmonize with each other, let alone view as a bloc against Southern, traditionalist conservatism.

Second, the Southern Agrarians espoused ideas that were generally widespread among Southerners, embedded in Southern culture, and reflective of Southern attitudes. By contrast, pragmatism was an academic enterprise rejected by most Northern intellectuals and completely out of the purview of the average Northern citizen. Pragmatism was nowhere near representative of Northern thinking, especially not in the political or economic realm, and it is hyperbolic to suggest, as Langdale does, that pragmatism influenced the intellectual climate in the North to the extent that traditionalist conservatism influenced the intellectual climate in the South.

Third, the pragmatism of Peirce and James is not about sociopolitical or socioeconomic advancement. It is a methodology, a process of scientific inquiry. It does not address conservatism per se or liberalism per se. It can lead one to either conservative or liberal outcomes, although the earliest pragmatists rarely applied it to politics as such. It is, accordingly, a vehicle to an end, not an end itself. Peirce and James viewed it as a technique to ferret out the truth of an idea by subjecting concrete data to rigorous analysis based on statistical probability, sustained experimentation, and trial and error. Although James occasionally undertook to discuss political subjects, he did not treat pragmatism as the realization of political fantasy. Pragmatism, properly understood, can be used to validate a political idea, but does not comprise one.

The Southern Agrarians may have privileged poetic supernaturalism over scientific inquiry; it does not follow, however, that pragmatists like Peirce and James evinced theories with overt or intended political consequences aimed at Southerners or traditionalists or, for that matter, Northern liberals. Rather than regional conflict or identity, the pragmatists were concerned with fine-tuning what they believed to be loose methods of science and epistemology and metaphysics. They identified with epistemic traditions of Western philosophy but wanted to distill them to their core, knowing full well that humans could not perfect philosophy, only tweak it to become comprehensible and meaningful for a given moment. On the other hand, the Southern Agrarians were also concerned with epistemology and metaphysics, but their concern was invariably colored by regional associations, their rhetoric inflected with political overtones. Both Southern Agrarians and pragmatists attempted to conserve the most profitable and essential elements of Western philosophy; opinions about what those elements were differed from thinker to thinker.

Fourth, Langdale’s caricature (for that is what it is) of pragmatism at times resembles a mode of thought that is alien to pragmatism. For instance, he claims that “pragmatism is a distinctly American incarnation of the historical compulsion to the utopian and of what philosopher Eric Voegelin described as the ancient tradition of ‘gnosticism.’” Nothing, however, is more fundamental to pragmatism than the rejection of utopianism or Gnosticism. That rejection is so widely recognized that even Merriam-Webster lists “pragmatism” as an antonym for “utopian.”

Pragmatism is against teleology and dogma; it takes as its starting point observable realities rather than intangible, impractical abstractions and ideals. What Langdale describes is more like Marxism: a messianic ideology with a sprawling, utopian teleology regarding the supposedly inevitable progress of humankind.

Given that pragmatism is central to his thesis, it is telling that Langdale never takes the time to define it, explain the numerous differences between leading pragmatists, or analyze any landmark pragmatist texts. The effect is disappointing.

Langdale’s approach to “superfluity” makes Superfluous Southerners the inverse of Richard Poirier’s 1992 Poetry and Pragmatism: whereas Langdale relates “superfluity” to Southern men of letters who conserve what the modern era has ticketed as superfluous, Poirier relates “superfluity” to Emerson and his literary posterity in Robert Frost, Gertrude Stein, Wallace Stevens, T. S. Eliot, William Carlos Williams, and Ezra Pound. Both notions of superfluity contemplate the preservation of perennial virtues and literary forms; one, however, condemns pragmatism while the other applauds it.

For both Langdale and Poirier, “superfluity” is good. It is not a term of denunciation as it is usually taken to be. Langdale cites Hungarian sociologist Karl Mannheim to link “superfluity” to traditionalists who transform and adapt ideas to “the new stage of social and mental development,” thus keeping “alive a ‘strand’ of social development which would otherwise have become extinct.”

Poirier also links superfluity to an effort to maintain past ideas. His notion of “superfluity,” though, refers to the rhetorical excesses and exaggerated style that Emerson flaunted to draw attention to precedents that have proven wise and important. By reenergizing old ideas with creative and exhilarating language, Emerson secured their significance for a new era. In this respect, Emerson is, in Poirier’s words, “radically conservative.”

Who is right? Langdale or Poirier? Langdale seeks to reserve superfluity for the province of Southern, traditionalist conservatives. Does this mean that Poirier is wrong? And if Poirier is right, does not Langdale’s binary opposition collapse into itself?

These questions notwithstanding, it is strange that Langdale would accuse the Emersonian pragmatic tradition of opposing that which, according to Poirier, it represents. Although it would be wrong to call Emerson a political conservative, he cannot be said to lack a reverence for history. A better, more conservative criticism of Emerson—which Langdale mentions in his introduction—would involve Emerson’s transcendentalism that promoted a belief in innate human goodness. Such idealism flies in the face of Southern traditionalism, which generally abides by the Augustinian doctrine of innate human depravity and the political postures appertaining thereto.

What Langdale attributes to pragmatism is in fact a bane to most pragmatists. A basic tenet of pragmatism, for instance, is human fallibilism, which is in keeping with the doctrine of innate human depravity and which Peirce numbered among his reasons for supporting the scientific method. Peirce’s position is that one human mind is imperfect and cannot by itself reach trustworthy conclusions; therefore, all ideas must be filtered through the logic and experimentation of a community of thinkers, and a lasting and uniform consensus is necessary to verify the validity of any given hypothesis. This is, of course, anathema to the transcendentalist’s conviction that society corrupts the inherent power and goodness of the individual genius.

Langdale’s restricted view of pragmatism might have to do with unreliable secondary sources. He cites, of all people, Herbert Croly for the proposition that, in Croly’s words, “democracy cannot be disentangled from an aspiration toward human perfectibility.” The connection between Croly and pragmatism seems to be that Croly was a student of James, but so was the politically and methodologically conservative C. I. Lewis. And let us not forget that the inimitable Jacques Barzun, who excoriated James’s disciples for exploiting and misreading pragmatism, wrote an entire book—A Stroll with William James—which he tagged as “the record of an intellectual debt.”

Pragmatism is a chronic target for conservatives who haven’t read much pragmatism. Frank Purcell has written in Taki’s Magazine about “conservatives who break into hives at the mere mention of pragmatism.” Classical pragmatists are denominated as forerunners of progressivism despite having little in common with progressives. The chief reason for this is the legacy of John Dewey and Richard Rorty, both proud progressives and, nominally at least, pragmatists.

Dewey, behind James, is arguably the most recognizable pragmatist, and it is his reputation, as championed by Rorty, that has done the most to generate negative stereotypes and misplaced generalizations about pragmatism. Conservatives are right to disapprove of Dewey’s theories of educational reform and social democracy, yet he is just one pragmatist among many, and there are important differences between his ideas and the ideas of other pragmatists.

In fact, the classical pragmatists have much to offer conservatives, and conservatives—even the Southern Agrarians—have supported ideas that are compatible with pragmatism, if not outright pragmatic. Burkean instrumentalism, committed to gradualism and wary of ideological extremes, is itself a precursor to social forms of pragmatism, although it bears repeating that social theories do not necessarily entail political action.

Russell Kirk’s The Conservative Mind traces philosophical continuities and thus provides clarifying substance to the pragmatist notion that ideas evolve over time and in response to changing technologies and social circumstances, while always retaining what is focal or fundamental to their composition. The original subtitle of that book was “From Burke to Santayana,” and it is remarkable, is it not, that both Burke and Santayana are pragmatists in their own way? Santayana was plugged into the pragmatist network, having worked alongside James and Josiah Royce, and he authored one of the liveliest expressions of pragmatism ever written: The Life of Reason. Although Santayana snubbed the label, general consensus maintains that he was a pragmatist. It is also striking that Kirk places John Randolph of Roanoke and John C. Calhoun, both Southern conservatives, between these pragmatists on his map of conservative thought. There is, in that respect, an implication that pragmatism complements traditionalism.

Langdale relies on Menand’s outline of pragmatism and appears to mimic Menand’s approach to intellectual history. It is as though Langdale had hoped to write the conservative, Southern companion to The Metaphysical Club. He does not succeed because his representation of pragmatism is indelibly stamped by the ideas of Rorty, who repackaged pragmatism in postmodern lexica. Moreover, Langdale’s failure or refusal to describe standing differences between the classical pragmatists and neo-pragmatists means that his book is subject to the same critique that Susan Haack brought against Menand.

Haack lambasted Menand for sullying the reputation of the classical pragmatists by associating pragmatism with nascent Rortyianism—“vulgar Rortyianism,” in her words. Langdale seems guilty of this same supposition. By pitting pragmatism against Southern conservatism, he implies that Southern conservatism rejects, among other features, the application of mathematics to the scientific method, the analysis of probabilities derived from data sampling and experimentation, and the prediction of outcomes in light of statistical inferences. The problem is that the Agrarians did not oppose these things, although their focus on preserving the literary and cultural traditions of the South led them to express their views through poetry and story rather than through philosophy. There is nothing in these methods of pragmatism (as opposed to the uses to which some later pragmatists may have put them) that is antithetical to Southern Agrarianism.

Superfluous Southerners is at its best when it sticks to its Southern subjects and does not undertake comparative analyses of intellectual schools. It is at its worst when it resorts to incorrect and provocative phrases about “the gnostic hubris of pragmatists” or “the gnostic spirit of American pragmatic liberalism.” Most of its chapters do a remarkable job teasing out distinctions between its Southern conservative subjects and narrating history about the Southern Agrarians’ relationship to modernity, commitment to language and literature, and role as custodians of a fading heritage. Unfortunately, Langdale’s book confounds the already ramified philosophy known as pragmatism, and at the expense of the Southern traditionalism that he and I admire.