
Archive for the ‘Essays’ Category

Michael Blumenthal Publishes “Just Three Minutes, Please,” with West Virginia University Press

In America, American Literature, Arts & Letters, Books, Creative Writing, Essays, Humanities, Law-and-Literature, Literature, Michael Blumenthal, News and Current Events, News Release, Poetry, Politics, Writing on March 5, 2014 at 8:30 am

Just Three Minutes, Please

West Virginia University Press is pleased to announce the publication of Just Three Minutes, Please: Thinking out Loud on Public Radio, by Michael Blumenthal.

In these brief essays, Blumenthal provides unconventional insights into our contemporary political, educational, and social systems, challenging us to look beyond the headlines to the psychological and sociological realities that underlie our conventional thinking.

What’s wrong with the contemporary American medical system? What does it mean when a state’s democratic presidential primary casts 40% of its votes for a felon incarcerated in another state? What’s so bad about teaching by PowerPoint? What is truly the dirtiest word in America?

These are just a few of the engaging and controversial issues that Michael Blumenthal, poet, novelist, essayist, and law professor, tackles in this collection of poignant essays commissioned by West Virginia Public Radio.

C.K. Williams, the Pulitzer Prize-winning poet, proclaims: “The intellect of a scholar, the sensitivity of a poet, the objectivity of a professor of law: it hardly seems possible that so many virtues can be embodied in one book of short talks.”

Dalton Delan, Executive Producer of In Performance at the White House for PBS, declares: “David Sedaris and Ira Glass have a brother from another mother, and his name is Michael Blumenthal. His soulful NPR essays are profound thought-clouds from one of America’s finest poets.”

A widely published poet and novelist, Blumenthal pairs a lawyer’s analytical ability with a writer’s literary sensibility, effortlessly distinguishing the clichés of today’s pallid political discourse from the deeper realities that lie beneath them. This collection will captivate and provoke those with an interest in literature, politics, law, and the unwritten rules of our social and political engagements.

Michael Blumenthal is a Visiting Professor of Law and Co-Director of the Immigration Clinic at West Virginia University College of Law. A former Director of Creative Writing at Harvard University, he is the author of eight books of poetry, as well as All My Mothers and Fathers, a memoir; Weinstock Among The Dying, a novel; When History Enters the House, a collection of essays; and “Because They Needed Me”: The Incredible Struggle of Rita Miljo To Save The Baboons of South Africa, a book-length account of his work with orphaned infant chacma baboons in South Africa. His first collection of short stories, The Greatest Jewish-American Lover in Hungarian History, is forthcoming.

To order this book, visit wvupress.com, phone (800) 621-2736, or stop by a local bookstore.

Just Three Minutes, Please: Thinking out Loud on Public Radio by Michael Blumenthal
March 2014 / 120 pp. / PB 978-1-938228-77-3: $16.99 / ePub 978-1-938228-78-0: $16.99

Transcendental Liberty

In America, American History, Arts & Letters, Creativity, Emerson, Essays, Ethics, History, Humane Economy, Humanities, Libertarianism, Literary Theory & Criticism, Literature, Nineteenth-Century America, Philosophy, Poetry, Politics, Property, Rhetoric, Western Philosophy, Writing on January 15, 2014 at 8:45 am


This essay originally appeared here in The Freeman.

“The less government we have, the better.” So declared Ralph Waldo Emerson, a  man not usually treated as a classical liberal. Yet this man—the Sage of  Concord—held views that cannot be described as anything but classical liberal or  libertarian.

None other than Cornel West, no friend of the free market, has said that “Emerson is neither a liberal nor a conservative and certainly not a socialist or even a civic republican. Rather he is a petit bourgeois libertarian, with at times anarchist tendencies and limited yet genuine democratic sentiments.” An abundance of evidence supports this view. Emerson was, after all, the man who extolled the “infinitude of the private man.” One need only look at one of Emerson’s most famous essays, “Self-Reliance,” for evidence of his libertarianism.

“Self-Reliance” is perhaps the most exhilarating expression of individualism  ever written, premised as it is on the idea that each of us possesses a degree  of genius that can be realized through confidence, intuition, and nonconformity.  “To believe your own thought, to believe that what is true for you in your  private heart is true for all men,” Emerson proclaims, “that is genius.”

Genius, then, is a belief in the awesome power of the human mind and in its  ability to divine truths that, although comprehended differently by each  individual, are common to everyone. Not all genius, on this view, is necessarily  or universally right, since genius is, by definition, a belief only, not a  definite reality. Yet it is a belief that leads individuals to “trust thyself”  and thereby to realize their fullest potential and to energize their most  creative faculties. Such self-realization has a spiritual component insofar as  “nothing is at last sacred but the integrity of your own mind” and “no law can  be sacred to me but that of my nature.”

According to Emerson, genius precedes society and the State, which corrupt rather than clarify reasoning and which thwart rather than generate productivity. History shows that great minds have challenged the conventions and authority of society and the State and that “great works of art have no more affecting lesson for us than this. They teach us to abide by our spontaneous impression with good-humored inflexibility then most when the whole cry of voices is on the other side.” Accordingly, we ought to refuse to “capitulate to badges and names, to large societies and dead institutions.” We ought, that is, to be deliberate, nonconformist pursuers of truth rather than of mere apprehensions of truth prescribed for us by others. “Whoso would be a man,” Emerson says, “must be a nonconformist.”

Self-Interest and Conviction

For Emerson, as for Ayn Rand, rational agents act morally by pursuing their  self-interests, including self-interests in the well-being of family, friends,  and neighbors, who are known and tangible companions rather than abstract  political concepts. In Emerson’s words, “The only right is what is after my  constitution, the only wrong what is against it.” Or: “Few and mean as my gifts  may be, I actually am, and do not need for my own assurance or the assurance of  my fellows any secondary testimony.”

It is not that self-assurance equates with rightness, or that stubbornness  is a virtue; it is that confidence in what one knows and believes is a condition  precedent to achieving one’s goals. Failures are inevitable, as are setbacks;  only by exerting one’s will may one overcome the failures and setbacks that are  needed to achieve success.

If, as Emerson suggests, a “man is to carry himself in the presence of all  opposition, as if everything were titular and ephemeral but he,” how should he  treat the poor?  Emerson supplies this answer:

Do not tell me, as a good man did to-day, of my  obligation to put all poor men in good situations. Are they my poor? I tell  thee, thou foolish philanthropist, that I grudge the dollar, the dime, the cent,  I give to such men as do not belong to me and to whom I do not belong. There is  a class of persons to whom by all spiritual affinity I am bought and sold; for  them I will go to prison, if need be; but your miscellaneous popular charities;  the education at college of fools; the building of meeting-houses to the vain  end to which many now stand; alms to sots; and the thousandfold Relief  Societies;—though I confess with shame I sometimes succumb and give the dollar,  it is a wicked dollar which by and by I shall have the manhood to withhold.

These lines require qualification. Emerson is not damning philanthropy or  charity categorically or unconditionally; after all, he will, he says, go to  prison for certain individuals with whom he shares a special relationship. He  is, instead, pointing out, with much exhibition, that one does not act morally  simply by giving away money without conviction or to subsidize irresponsible,  unsustainable, or exploitative business activities. It is not moral to give away  a little money that you do not care to part with, or to fund an abstract cause  when you lack knowledge of, and have no stake in, its outcome. Only when you  give money to people or causes with which you are familiar, and with whom or  which you have something at stake, is your gift meaningful; and it is never  moral to give for show or merely to please society. To give morally, you must  mean to give morally—and have something to lose.

Dissent

Emerson famously remarks that a “foolish consistency is the hobgoblin of  little minds, adored by little statesmen and philosophers and divines.” Much ink  has been spilled to explain (or explain away) these lines. I take them to mean,  in context, that although servile flattery and showy sycophancy may gain a  person recognition and popularity, they will not make that person moral or great  but, instead, weak and dependent. There is no goodness or greatness in a  consistency imposed from the outside and against one’s better judgment; many  ideas and practices have been consistently bad and made worse by their very  consistency. “With consistency,” therefore, as Emerson warns, “a great soul has  simply nothing to do.”

Ludwig von Mises seems to have adopted the animating, affirming  individualism of Emerson, and even, perhaps, Emerson’s dictum of nonconformity.  Troping Emerson, Mises remarks that “literature is not conformism, but dissent.”  “Those authors,” he adds, “who merely repeat what everybody approves and wants  to hear are of no importance. What counts alone is the innovator, the dissenter,  the harbinger of things unheard of, the man who rejects the traditional  standards and aims at substituting new values and ideas for old ones.” This man  does not mindlessly stand for society and the State and their compulsive  institutions; he is “by necessity anti-authoritarian and anti-governmental,  irreconcilably opposed to the immense majority of his contemporaries. He is  precisely the author whose books the greater part of the public does not buy.”  He is, in short, an Emersonian, as Mises himself was.

The Marketplace of Ideas

To be truly Emersonian, you may not accept the endorsements and propositions  in this article as unconditional truth, but must, instead, read Emerson and  Mises and Rand for yourself to see whether their individualism is alike in its  affirmation of human agency resulting from inspirational nonconformity. If you  do so with an inquiring seriousness, while trusting the integrity of your own  impressions, you will, I suspect, arrive at the same conclusion I have  reached.

There is an understandable and powerful tendency among libertarians to  consider themselves part of a unit, a movement, a party, or a coalition, and of  course it is fine and necessary to celebrate the ways in which economic freedom  facilitates cooperation and harmony among groups or communities; nevertheless,  there is also a danger in shutting down debate and in eliminating competition  among different ideas, which is to say, a danger in groupthink or compromise, in  treating the market as an undifferentiated mass divorced from the innumerable  transactions of voluntarily acting agents. There is, too, the tendency to become  what Emerson called a “retained attorney” who is able to recite talking points  and to argue predictable “airs of opinion” without engaging the opposition in a  meaningful debate.

Emerson teaches not only to follow your convictions but to engage and  interact with others, lest your convictions be kept to yourself and deprived of  any utility. It is the free play of competing ideas that filters the good from  the bad; your ideas aren’t worth a lick until you’ve submitted them to the test  of the marketplace.

“It is easy in the world,” Emerson reminds us, “to live after the world’s  opinion; it is easy in solitude to live after our own; but the great man is he  who in the midst of the crowd keeps with perfect sweetness the independence of  solitude.” Let us stand together by standing alone.

Seven Points of Grammar

In Advocacy, Arts & Letters, Communication, Essays, Law, Legal Education & Pedagogy, Legal Research & Writing, Teaching, Writing on November 20, 2013 at 8:45 am


An earlier version of this piece appeared here in The Alabama Lawyer.

As a staff attorney to Chief Justice Roy S. Moore, I read several briefs and petitions each day.  I have noticed that certain grammatical errors are systemic among attorneys.  Some errors are excusable; others aren’t.  Here are seven errors that are inexcusable.

1.    “Whoever” and “Whomever”

Many attorneys do not know the difference between whoever and whomever.  Test your knowledge by answering these questions:

Which of the following sentences is correct?

A.  Give the document to whoever requests it.

B.  Give the document to whomever requests it.

Which of the following sentences is correct?

A.  Whoever arrives first will get a copy.

B.  Whomever arrives first will get a copy.

If you answered A to both questions, you were correct.  Here is a trick to help determine whether to use whoever or whomever:

STEP ONE:  Imagine a blank space where you wish to use whoever or whomever.

Example: Give the document to ______ requests it.

STEP TWO:  Split the blank space to create two sentences; then fill in the blanks with the pronouns he or him.

Example: Give the document to him.  He requests it.

STEP THREE:  Whenever you fill in the blank space with a him/he combination, use whoever.  As we have already seen, the previous sentence should read, “Give the document to whoever requests it.”  Whenever you fill in the blank space with a him/him combination, use whomever.

Him/He = whoever

Him/Him = whomever

Here are more examples:

STEP ONE:           You should hire ______ Pete recommends.

STEP TWO:          You should hire him.  Pete recommends him.

STEP THREE:      You should hire whomever Pete recommends.

 

STEP ONE:            This letter is to ______ wrote that brief.

STEP TWO:           This letter is to him.  He wrote that brief.

STEP THREE:       This letter is to whoever wrote that brief.

 

STEP ONE:           The prize is for _____ wins the contest.

STEP TWO:          The prize is for him.  He wins the contest.

STEP THREE:      The prize is for whoever wins the contest.

 

STEP ONE:            The lawyer made a good impression on ______ he met.

STEP TWO:           The lawyer made a good impression on him.  He met him.

STEP THREE:       The lawyer made a good impression on whomever he met.

 

STEP ONE:            The lawyer tried to make a good impression on ______ was there.

STEP TWO:           The lawyer tried to make a good impression on him.  He was there.

STEP THREE:       The lawyer tried to make a good impression on whoever was there.

2.    “Who” and “Whom”

The distinction between who and whom has fallen out of use in common speech, but it retains its importance in formal writing.  Use who if the pronoun is a subject or subject complement in a clause.  Use whom if the pronoun is an object in a clause.  A trick to help determine whether to use who or whom is to rephrase the sentence using a personal pronoun such as he or him.  Consider the following:

A.      Proper: Whom did you meet?  (Rephrase: I met him.)

           Him is objective, so whom is proper.

Improper:  Who did you meet?

B.       Proper: Who do you think murdered the victim?  (Rephrase: I think he murdered the victim.)

           He is subjective, so who is proper.

Improper: Whom do you think murdered the victim?

C.        Proper: Who was supposed to finish that brief last week?  (Rephrase: He was supposed to finish that brief last week.)

            He is subjective, so who is proper.

Improper: Whom was supposed to finish that brief last week?

D.        Proper:  Justice Brown is the man for whom I voted.  (Rephrase: I voted for him.)

            Him is objective, so whom is proper.

Improper:  Justice Brown is the man who I voted for.

3.    “As Such”

I used to practice at a mid-sized law firm in Atlanta.  Tasked with reviewing the writing of all associate attorneys at the firm, one partner became hardheaded about two words: “as such.”  He always struck through the word “therefore” and replaced it with the words “as such.”  He did this so often that I finally decided to correct him. I was tired of watching him substitute a grammatical error for a sound construction.

When I spoke up, he got defensive.  “As such means ‘therefore,’” he said.

He was wrong.

The Random House Dictionary (2013) describes “as such” as an “idiom” that means “as being what is indicated” or “in that capacity.”  In other words, after you have described something, you use the phrase “as such” to refer back to that something “as described.”  Here are examples:

  1. He is the president of the university; as such, he is responsible for allocating funds to each department.
  2. This is a matter of law; as such, it is subject to de novo review.
  3. Theft is a crime; treat it as such.

In these examples, “as such” properly refers back to a definite antecedent.

“As such” appears regularly in legal writing.  Whenever I see this construction misused, I think about that partner in Atlanta and become agitated.

“As such” is a simple construction; as such, it entails a simple application.  Don’t be shy about calling out your colleagues when you see them misuse this construction, even if you are a “lowly” associate.  You might just save them—and the partners—from embarrassment.

4.    The Colon

Although many rules govern the use of colons, I want to focus on this one: Never place a colon between a preposition and its object or between a verb and its complement.  Likewise, never place a colon after such words or phrases as especially, including, or such as.

These sentences violate this rule:

  1. He was convicted of several crimes, including: first-degree robbery, arson, third-degree burglary, and second-degree forgery.
  2. Some affirmative defenses are: statute of frauds, waiver, statute of limitations, and contributory negligence.
  3. Most restrictive covenants have provisions about the developer or declarant such as: “Property Subject to the Declaration,” “Easements,” “Assessments,” and “Membership.”
  4. She enjoys the sites, especially: the courthouse, the town square, and the memorial.

No colon is necessary in these sentences; simply delete it, and each sentence reads correctly.

5.    Subject-Verb Agreement: “Neither,” “Nor,” “Either,” “Each,” and “Number”

Attorneys generally understand subject-verb agreement.  A verb must agree with its subject in number.  That is, a singular subject must take a singular verb; a plural subject must take a plural verb.  The following words, however, give attorneys trouble: neither, nor, either, each, and number.  What follows should clarify how subjects containing these words agree with their verbs.

Neither Mel’s clients nor his associate ___ going to the meeting tomorrow.

When you pair neither and nor as conjunctions linking two nouns, choose the noun closest to the verb and let that noun determine whether you use is or are.  In the example above, associate is closest to the verb.  Associate is singular, so the proper verb is is.

Neither of the partners ___ attending the meeting.

Neither is singular and the subject of the sentence.  It requires a singular verb: is.  The verb is not are if the plural noun (partners) is not the subject.  Partners is not the subject; it is part of a prepositional phrase.

___ either of you available to take his deposition tomorrow?

Either is singular and the subject of the sentence.  It requires a singular verb: is.  The verb is not are if the pronoun (you) is not the subject.  You is not the subject; it is part of a prepositional phrase.

Each of you ___ contributed valuable insights to the case.

The pronoun each is the subject of the sentence.  Each is singular and requires a singular verb: has.  Many attorneys will write have because they think that each is plural or that the verb must agree with the pronoun you.  You is part of a prepositional phrase and cannot serve as the subject of the sentence.

The number of thefts ___ increasing.

Number can be singular or plural depending on the context.  Here, number is used with the definite article the.  Therefore, the singular verb (is) applies.  In most cases, if number is used with the indefinite article a, then the plural verb (are) applies, as in “A number of thefts are under investigation.”

6.    The Possessive Form of Nouns Ending in “S”

My sixth-grade teacher instructed me never to add ’s after a singular noun ending with an s or s sound.  She was wrong.  The trick to nouns ending with an s or an s sound is that no trick exists: the rule is the same for these nouns as for all other nouns (with a few notable exceptions, such as the possessive pronouns “its” and “yours,” which take no apostrophe).  To form a singular possessive, add ’s to the singular noun.  To form a plural possessive, add an apostrophe to the plural noun.  Here are some examples:

Singular Noun

Mr. Jones               Mr. Jones’s

Mrs. Burnes           Mrs. Burnes’s

The boss                The boss’s

Plural Noun

The Joneses           The Joneses’

The Burnses           The Burnses’

The bosses             The bosses’

7.    “Only”

Only is one of the most regularly used words in the English language.  It is also one of the most regularly misused modifiers.  Below are examples of how attorneys misuse only in petitions and briefs.  I have altered the language in these examples to conceal the identity of the authors.

A.  “The appellant only references the reason why the appellee did not seek counseling.”

This sentence implies that the appellant does nothing—nothing at all—but reference the reason why the appellee did not seek counseling.  The appellant does not eat, sleep, think, talk, love, feel, or breathe.  The only thing he does is reference the reason why the appellee did not seek counseling.  He must be a robot.  The author of this sentence intended to say the following: “The appellant references only the reason why the appellee did not seek counseling.”  This revised sentence means that, of all the reasons from which he could have chosen, the appellant referenced only one.  The appellant could have referenced other reasons, but did not.

B.  “He only robbed two people.”

This example suggests that “he” has never done anything—anything at all—but rob two people.  If all you have ever done is rob two people, your entire existence has been a crime.  The author of this sentence intended to say the following: “He robbed only two people.”  This revised statement should cause one to ask, “That’s it?  Just two people?”

C.  “The agency granted the application on the condition that the hospital only will move 300 beds.”

A hospital that does nothing but move 300 beds will not help sick patients.  The author of this sentence should have written, “The agency granted the application on the condition that the hospital will move only 300 beds.”  In this revised sentence, “only” modifies “300 beds” rather than the verb “will move.”

Attorneys are educated; we tend to avoid using language if we aren’t certain about its grammatical soundness.  But something about the foregoing rules baffles us.

The rules, though, are easy.  What’s difficult is overcoming habits and industry-wide error.  If you aren’t certain about a rule, don’t just ask your colleagues for the solution.  And don’t take your colleagues’ suggestions at face value.  Consult a good, reliable grammar book.  Doing so will improve your writing and possibly raise the quality of writing among the entire profession.

John William Corrington, A Literary Conservative

In American History, Arts & Letters, Conservatism, Creative Writing, Essays, Fiction, History, Humanities, John William Corrington, Joyce Corrington, Law, Literary Theory & Criticism, Literature, Modernism, Southern History, Southern Literature, Television, Television Writing, The Novel, The South, Western Philosophy, Writing on October 23, 2013 at 8:45 am

 


 

An earlier version of this essay appeared here at Front Porch Republic.

Remember the printed prose is always

half a lie: that fleas plagued patriots,

that greatness is an afterthought

affixed by gracious victors to their kin.

 

—John William Corrington

 

It was the spring of 2009.  I was in a class called Lawyers & Literature.  My professor, Jim Elkins, a short, thin man with long white hair, gained the podium.  Wearing what might be called a suit—with Elkins one never could tell—he recited lines from a novella, Decoration Day.  I had heard of the author, John William Corrington, but only in passing.

“Paneled walnut and thick carpets,” Elkins beamed, gesturing toward the blank-white wall behind him, “row after row of uniform tan volumes containing between their buckram covers a serial dumb show of human folly and greed and cruelty.”  The students, uncomfortable, began to look at each other, registering doubt.  In law school, professors didn’t wax poetic.  But this Elkins—he was different.  With swelling confidence, he pressed on: “The Federal Reporter, Federal Supplement, Supreme Court Reports.  Two hundred years of our collective disagreements and wranglings from Jay and Marshall through Taney and Holmes and Black and Frankfurter—the pathetic often ill-conceived attempts to resolve what we have done to one another.”

Elkins paused.  The room went still.  Awkwardly profound, or else profoundly awkward, the silence was like an uninvited guest at a dinner party—intrusive, unexpected, and there, all too there.  No one knew how to respond.  Law students, most of them, can rattle off fact patterns or black-letter law whenever they’re called on.  But this?  What were we to do with this?

What I did was find out more about John William Corrington.  Having studied literature for two years in graduate school, I was surprised to hear this name—Corrington—in law school.  I booted up my laptop, right where I was sitting, and, thanks to Google, found a few biographical sketches of this man, who, it turned out, was perplexing, riddled with contradictions: a Southerner from the North, a philosopher in cowboy boots, a conservative literature professor, a lawyer poet.  This introduction to Corrington led to more books, more articles, more research.  Before long, I’d spent over $300 on Amazon.com.  And I’m not done yet.

***

Born in Cleveland, Ohio, on October 28, 1932, Corrington—or Bill, as his friends and family called him—passed as a born-and-bred Southerner all of his life.  As well he might, for he lived most of his life below the Mason-Dixon line, and his parents were from Memphis and had moved north for work during the Depression.  He moved to the South (to Shreveport, Louisiana) at the age of 10, although his academic CV put out that he was, like his parents, born in Memphis, Tennessee.  Raised Catholic, he attended a Jesuit high school in Louisiana but was expelled for “having the wrong attitude.”  The Jesuit influence, however, would remain with him always.  At the beginning of his books, he wrote, “AMDG,” which stands for Ad Majorem Dei Gloriam—“for the greater glory of God.”  “It’s just something that I was taught when I was just learning to write,” he explained in an interview in 1985, “taught by the Jesuits to put at the head of all my papers.”

Bill was, like the late Mark Royden Winchell, a Copperhead at heart, and during his career he authored or edited, or in some cases co-edited, twenty books of varying genres.  He earned a B.A. from Centenary College and an M.A. in Renaissance literature from Rice University, where he met his wife, Joyce, whom he married on February 6, 1960.  In September of that year, he and Joyce moved to Baton Rouge, where Bill became an instructor in the Department of English at Louisiana State University (LSU).  At that time, LSU’s English department was known above all for The Southern Review (TSR), the brainchild of Cleanth Brooks and Robert Penn Warren, but also for such literary luminaries as Robert Heilman, who would become Bill’s friend.

In the early 1960s, Bill pushed for TSR to feature fiction and poetry and not just literary criticism.  He butted heads with then-editors Donald E. Stanford and Lewis P. Simpson, who thought of the journal as scholarly, not creative, as if journals couldn’t be both scholarly and creative.  A year after joining the LSU faculty, Bill published his first book of poetry, Where We Are.  With only 18 poems and a first printing of 225 copies, the book hardly established Bill’s reputation as a Southern man of letters.  But it invested his name with recognition and gave him the confidence to complete his first novel, And Wait for the Night (1964).

Bill and Joyce spent the 1963-64 academic year in Sussex, England, where Bill took the D.Phil. from the University of Sussex in 1965.  In the summer of 1966, at a conference at Northwestern State College, Mel Bradford, that Dean of Southern Letters, pulled Bill aside and told him, enthusiastically, that And Wait for the Night (1964) shared some of the themes and approaches of William Faulkner’s The Unvanquished.  Bill agreed.  And happily.

***

Of Bill and Miller Williams, Bill’s colleague at LSU, Jo LeCoeur, poet and literature professor, once submitted, “Both men had run into a Northern bias against what was perceived as the culturally backward South.  While at LSU they fought back against this snub, editing two anthologies of Southern writing and lecturing on ‘The Dominance of Southern Writers.’  Controversial as a refutation of the anti-intellectual Southern stereotype, their joint lecture was so popular [that] the two took it on the road to area colleges.”

In this respect, Bill was something of a latter-day Southern Fugitive—a thinker in the tradition of Donald Davidson, Allen Tate, Andrew Nelson Lytle, and John Crowe Ransom.  Bill, too, took his stand.  And his feelings about the South were strong and passionate, as evidenced by his essay in The Southern Partisan, “Are Southerners Different?” (1984).  Bill’s feelings about the South, however, often seemed mixed.  “[T]he South was an enigma,” Bill wrote to poet Charles Bukowski, “a race of giants, individualists, deists, brainy and gutsy:  Washington, Jefferson, Madison, Jackson (Andy), Davis, Calhoun, Lee, and on and on.  And yet the stain of human slavery on them.”  As the epigraph (above) suggests, Bill was not interested in hagiographic renderings of Southern figures.  He was interested in the complexities of Southern people and experience.  In the end, though, there was no doubt where his allegiances lay.  “You strike me as the most unreconstructed of all the Southern novelists I know anything about,” said one interviewer to Bill.  “I consider that just about the greatest compliment anyone could give,” Bill responded.

While on tour with Williams, Bill declared, “We are told that the Southerner lives in the past.  He does not.  The past lives in him, and there is a difference.”  The Southerner, for Bill, “knows where he came from, and who his fathers were.”  The Southerner “knows still that he came from the soil, and that the soil and its people once had a name.”  The Southerner “knows that is true, and he knows it is a myth.”  And the Southerner “knows the soil belonged to the black hands that turned it as well as it ever could belong to any hand.”  In short, the Southerner knows that his history is tainted but that it retains virtues worth sustaining—that a fraught past is not reducible to sound bites or political abstractions but is vast and contains multitudes.

***

In 1966, Bill and Joyce moved to New Orleans, where the English Department at Loyola University, housed in a grand Victorian mansion on St. Charles Avenue, offered him a chairmanship.  Joyce earned the M.S. in chemistry from LSU that same year.  By this time, Bill had written four additional books of poetry, the last of which, Lines to the South and Other Poems (1965), benefited from Bukowski’s influence.  Bill’s poetry earned a few favorable reviews but not as much attention as his novels—And Wait for the Night (1964), The Upper Hand (1967), and The Bombardier (1970).  Writing in The Massachusetts Review, Beat poet and critic Josephine Miles approvingly noted two of Bill’s poems from Lines, “Lucifer Means Light” and “Algerien Reveur,” alongside poetry by James Dickey, but her comments were more in passing than in depth.  Dickey himself, it should be noted, admired Bill’s writing, saying, “A more forthright, bold, adventurous writer than John William Corrington would be very hard to find.”

Joyce earned her PhD in chemistry from Tulane in 1968.  Her thesis, which she wrote under the direction of L. C. Cusachs, was titled, “Effects of Neighboring Atoms in Molecular Orbital Theory.”  She began teaching chemistry at Xavier University, and her knowledge of the hard sciences brought about engaging conversations between her and Bill about the New Physics.  “Even though Bill only passed high school algebra,” Joyce would later say, “his grounding in Platonic idealism made him more capable of understanding the implications of quantum theory than many with more adequate educations.”

By the mid-70s, Bill had become fascinated by Eric Voegelin.  A German historian, philosopher, and émigré who had fled the Third Reich, Voegelin taught in LSU’s history department and lectured for the Hoover Institution at Stanford University, where he was a Salvatori Fellow.  Voegelin’s philosophy, which drew from Friedrich von Hayek and other conservative thinkers, inspired Bill.  In fact, Voegelin made such a lasting impression that, at the time of Bill’s death, Bill was working on an edition of Voegelin’s The Nature of the Law and Related Legal Writings.  (After Bill’s death, two men—Robert Anthony Pascal and James Lee Babin—finished what Bill had begun.  The completed edition appeared in 1991.)

By 1975, the year he earned his law degree from Tulane, Bill had penned three novels, a short story collection, two editions (anthologies), and four books of poetry.  But his writings earned little money.  He also had become increasingly disenchanted with the political correctness on campus:

By 1972, though I’d become chair of an English department and offered a full professorship, I’d had enough of academia. You may remember that in the late sixties and early seventies, the academic world was hysterically attempting to respond to student thugs who, in their wisdom, claimed that serious subjects seriously taught were “irrelevant.” The Ivy League gutted its curriculum, deans and faculty engaged in “teach-ins,” spouting Marxist-Leninist slogans, and sat quietly watching while half-witted draft-dodgers and degenerates of various sorts held them captive in their offices. Oddly enough, even as this was going on, there was a concerted effort to crush the academic freedom of almost anyone whose opinions differed from that of the mob or their college-administrator accessories. It seemed a good time to get out and leave the classroom to idiots who couldn’t learn and didn’t know better, and imbeciles who couldn’t teach and should have known better.

Bill joined the law firm of Plotkin & Bradley, a small personal injury practice in New Orleans, and continued to publish in such journals as The Sewanee Review and The Southern Review, and in such conservative periodicals as The Intercollegiate Review and Modern Age.  His stories took on a legal bent, peopled as they were with judges and attorneys.  But neither law nor legal fiction brought him fame or fortune.

So he turned to screenplays—and, at last, earned the profits he desired.  Viewers of the recent film I Am Legend (2007), starring Will Smith, might be surprised to learn that Bill and Joyce wrote the screenplay for an earlier version, The Omega Man (1971), starring Charlton Heston.  And viewers of Battle for the Planet of the Apes (1973) might be surprised to learn that Bill wrote the film’s screenplay while still a law student.  All told, Bill and Joyce wrote five screenplays and one television movie.  Free from the constraints of university bureaucracy, Bill collaborated with Joyce on various television daytime dramas, including Search for Tomorrow, Another World, Texas, Capitol, One Life to Live, Superior Court, and, most notably, General Hospital.  These ventures gained the favor of Hollywood stars, and Bill and Joyce eventually moved to Malibu.

Bill constantly molded and remolded his image, embracing Southern signifiers while altering their various expressions.  His early photos suggest a pensive, put-together gentleman wearing ties and sport coats and smoking pipes.  Later photos depict a rugged man clad in western wear.  Still later photos conjure up the likes of Roy Orbison, what with Bill’s greased hair, cigarettes, and dark sunglasses.

Whatever his looks, Bill was a stark, provocative, and profoundly sensitive writer.  His impressive oeuvre has yet to receive the critical attention it deserves.  That scholars of conservatism, to say nothing of scholars of Southern literature, have ignored this man is almost inconceivable.  There are no doubt many aspects of Bill’s life and literature left to be discovered.  As Bill’s friend William Mills put it, “I believe there is a critique of modernity throughout [Bill’s] writing that will continue to deserve serious attentiveness and response.”

On Thanksgiving Day, November 24, 1988, Bill suffered a heart attack and died.  He was 56.  His last words, echoing Stonewall Jackson, were, “It’s all right.”

 

The Law Review Model as a Check against Bias?

In Academia, Arts & Letters, Essays, Humanities, Law, Scholarship, Writing on October 9, 2013 at 7:45 am


A version of this essay appeared in Academic Questions.

Could peer-reviewed humanities journals benefit by having student editors, as is the practice for law reviews? Are student editors valuable because they are less likely than peer reviewers to be biased against certain contributors and viewpoints?  I begin with a qualifier: What I am about to say is based on research, anecdotes, and experience rather than empirical data that I have compiled on my own. I do not know for sure whether student editors are more or less biased than professional academics, and I hesitate to displace concerns for expertise and experience with anxiety about editorial bias. There may be situations in which students can make meaningful contributions to reviewing and editing scholarship—and to scholarship itself—but to establish them as scholarly peers is, I think, a distortion and probably a disservice to them and their fields.

Student editors of and contributors to law reviews may seem to be the notable exception, but legal scholarship is different from humanities scholarship in ways I address below, and law reviews suffer from biases similar to those endemic to peer-reviewed journals. Nevertheless, law reviews probably exhibit less systemic bias than peer-reviewed journals, but not because students edit them. Rather, the law review submission and editing process makes it more difficult for bias to occur. The system, not the students, facilitates editorial neutrality.

Several features of this system discourage bias. Because editors are students in their second and third years of law school, editorial turnover is rapid. Every year a law review has a new editorial team composed of students with varied interests and priorities. What interests a journal this year will differ from what interested it last year. Therefore, law reviews are not likely to have uniform, long-lasting standards for what and whom to publish—at least not with regard to ideology, political persuasion, or worldview.

Law review editors are chosen based on grades and a write-on competition, not because they are likeminded or pursuing similar interests. Therefore, law reviews are bound to have more ideological and topical diversity than peer-reviewed journals, which are premised upon mutual interest, and many of which betray the academic side of cronyism: friends and friends of friends become editors of peer-reviewed journals regardless of their records of scholarship. The composition of law review editorial boards is, by contrast, based upon merit determined through heated competition.

Once on board, law review student editors continue to compete with one another, seeking higher ranks within editorial hierarchies.[1] Being the editor-in-chief or senior articles editor improves one’s résumé and looks better to potential employers than being, say, the notes editor. Voting or evaluations of academic performance establish the hierarchies. Moreover, each year only a few student articles are published, so editors are competing with one another to secure that special place for their writing.[2] Finally, student editors usually receive grades for their performance on law review. The result of all of this competition is that law review editors are less able than peer reviewers to facilitate ideological uniformity or to become complacent in their duties—and law reviews will exhibit greater ideological diversity and publish more quickly and efficiently than peer-reviewed journals.

Because of the ample funding available to law schools, scores of specialized journals have proliferated to rival the more traditional law reviews. Many specialized law reviews were designed to compensate for alleged bias. There are journals devoted to women’s issues, racial issues, law and literature, law and society, critical legal studies, and so on. There are also journals aimed principally at conservatives: Harvard Journal of Law and Public Policy, Texas Review of Law & Politics, and Georgetown Journal of Law & Public Policy, to name three. Specialized journals give students and scholars a forum for the likeminded. On the other hand, such journals call for specialization, which students are unlikely to possess.[3]

For these reasons, I believe that bias is less prevalent among law reviews than among peer-reviewed journals. Part of the difficulty in determining bias, however, is that data collection depends upon the compliance of law review editors, who receive and weed through thousands of submissions per submission period and have neither the time nor the energy to compile and report data about each submission. Moreover, these editors, perhaps in preparation for likely careers as attorneys, are often required to maintain strict confidentiality regarding authors and submissions, thereby making “outside” studies of law reviews extremely difficult to conduct.

And then there is the problem of writing about bias at all: everyone can find bias in the system. I suspect that institutionalized bias against conservative legal scholars exists, but nonconservatives also complain about bias. Minna J. Kotkin has suggested that law reviews are biased against female submitters.[4] Rachel J. Anderson has suggested that law reviews are biased against “dissent scholarship,” which, she says, includes “civil rights scholarship, critical legal studies, critical race theory, feminist theory, public choice theory, queer theory, various ‘law ands’ scholarship that employs quantitative or humanistic methodologies, and other scholarship that, at one point in time or another, is not aligned with ideologies or methodologies that the reader values or considers legitimate.”[5] Finally, Jordan Leibman and James White discovered bias favoring authors with credentials, publication records, or experience.[6]

Law student bias seems, from my perspective, weighted more toward credentials and reputation than toward political persuasion.[7] An established professor with an endowed chair is therefore more likely to receive a publication offer from a law review than an unknown, young, or adjunct professor; and the name recognition of an author—regardless of personal politics—is more likely to guarantee that author a publication slot in a law review. One downside to this is that student editors will accept half-written or ill-formed articles simply because the author is, for want of a better word, renowned. It is common in these situations for students to then ghostwrite vast portions of the article for the author. Another, more obvious downside is that professors from select institutions and with certain reputations will be published over authors who have submitted better scholarship. This is the primary reason why I advocate for a hybrid law review/peer review approach to editing.[8]

I’ve mentioned that legal scholarship differs from humanities scholarship. What makes it different is its attention to doctrinal matters, i.e., to the application of law to facts or the clarifying of legal principles and canons. After their first year of law school, students are equipped to study these sorts of matters. They are not unlike lawyers who approach a legal issue for the first time and must learn to analyze the applicable law in light of the given facts. Although the breadth and scope of legal scholarship have changed to reduce the amount of doctrinal scholarship produced and to incorporate interdisciplinary studies, doctrinal scholarship remains the traditional standard and the conventional norm.

Law students have the facility to edit doctrinal scholarship, but not to edit interdisciplinary articles.[9] This point is not necessarily to advance my argument about bias being less inherent in law review editing; rather, it is to circle back to my initial position that inexperienced and inexpert students should not be empowered to make major editorial decisions or to control the editing. As I have suggested, student editors are biased, just as professional peer reviewers are biased—the problem is that students are less prepared and qualified to make sound editorial judgments. If what is needed is an editorial system that diminishes bias, then student editors are not the solution. Law review editing, however, provides a clarifying model for offsetting widespread bias.

It would be difficult if not impossible to implement law review editing among humanities peer-reviewed journals for the disappointing reason that law reviews enjoy ample funding from institutions, alumni, and the legal profession whereas humanities journals struggle to budget and fight for funding. Therefore, I will not venture to say that peer-reviewed journals ought to do something about their bias problems by mimicking law review editing. Such a solution would not be practical. But by pointing out the benefits of law review editing—i.e., the result of less bias due to such factors as competition and turnover in editorial positions—I hope that more creative minds than mine will discover ways to reform peer-reviewed journals to minimize bias.

 


[1]I consider editor selection flawed for some of the reasons Christian C. Day describes in “The Case for Professionally-Edited Law Reviews,” Ohio Northern University Law Review 33 (2007): 570–74.

[2]How this competition works differs from journal to journal. In some cases, the students select which student articles to publish based on an elaborate voting process supposedly tied to blind review and authorial anonymity.  In other cases, faculty decide.

[3]“Many scholars feel that student editors of law review articles, while they were perhaps once competent to evaluate the merit of scholarly articles owing to the much narrower range of topics, have for the last few decades had great difficulty grappling with nondoctrinal scholarship (that is, scholarship dealing with the intersection of law and other disciplines). The authors of law journal articles now increasingly draw from areas such as economics, gender studies, literary theory, sociology, mathematics, philosophy, political theory, and so on, making the enterprise much too difficult for a group of generally young people, who are not only not specialists, but have barely entered the field of law.” Nancy McCormack, “Peer Review and Legal Publishing: What Law Librarians Need to Know about Open, Single-Blind, and Double-Blind Reviewing,” Law Library Journal 101, no. 1 (Winter 2009): 61–62.

[4]Minna J. Kotkin, “Of Authorship and Audacity: An Empirical Study of Gender Disparity and Privilege in the ‘Top Ten’ Law Reviews,” Women’s Rights Law Reporter 35 (Spring 2009).

[5]Rachel J. Anderson, “From Imperial Scholar to Imperial Student: Minimizing Bias in Article Evaluation by Law Reviews,” Hastings Women’s Law Journal 20, no. 2 (2009): 206.

[6]Jordan H. Leibman and James P. White, “How the Student-Edited Law Journals Make Their Publication Decisions,” Journal of Legal Education 39, no. 3 (September 1989): 396, 404.

[7]Many others share this view: “It appears to be generally assumed that, to a significant degree, Articles Editors use an author’s credentials as a proxy for the quality of her scholarship.” Jason P. Nance and Dylan J. Steinberg, “The Law Review Article Selection Process: Results from a National Study,” Albany Law Review 71, no. 2 (2008): 571.

[8]See my Spring 2013 Academic Questions article, “The Law Review Approach: What the Humanities Can Learn.” I am not alone on this score. Day suggests that “this bias can be defeated by blind submissions or having faculty members read the abstracts and articles of blind-submitted articles where the quality is unknown. The names and other identifying information should be obscured, which is common in other disciplines. This is easy to do with electronic submissions. It should be the rule in law reviews, at least at the initial stage of article selection.” “Case for Law Reviews,” 577.

[9]Hence Richard Posner’s suggestion that law reviews “should give serious consideration to having every plausible submission of a nondoctrinal piece refereed anonymously by one or preferably two scholars who specialize in the field to which the submission purports to contribute.” “The Future of the Student-Edited Law Review,” Stanford Law Review 47 (Summer 1995): 1136.

Thoughts on ‘The Road to Serfdom’: Chapter 7, “Economic Control and Totalitarianism”

In Arts & Letters, Austrian Economics, Book Reviews, Books, Conservatism, Economics, Epistemology, Essays, History, Humane Economy, Humanities, Justice, Law, Libertarianism, Literature, Philosophy, Western Civilization, Western Philosophy on October 2, 2013 at 8:45 am

Slade Mendenhall

Slade Mendenhall is an M.Sc. candidate in Comparative Politics at the London School of Economics, with specializations in conflict and Middle Eastern affairs. He holds degrees in Economics and Mass Media Arts from the University of Georgia and writes for The Objective Standard and themendenhall.com, where he is also editor.

The following is part of a series of chapter-by-chapter analyses of Friedrich Hayek’s The Road to Serfdom, conducted as part of The Mendenhall’s expanding Capitalist Reader’s Guide project. Previous entries can be found here: Introduction, Chapter 1, 2, 3, 4, 5, and 6.

In “Economic Control and Totalitarianism”, the subject of Hayek’s seventh chapter, we find him at his best, with a clarity and reason that we have not seen since chapter two, “The Great Utopia.” In chapter seven, Hayek expounds upon numerous themes within the titular subject: the inextricability of dictatorial control and economic planning, the fallacy of believing that economic controls can be separated from broader political controls, the inevitability in a planned economy of controls extending to individuals’ choice of profession, and the interrelation of economic and political freedom. What aspects of the chapter we might find to criticize arise either from a desire for him to take his line of thinking a step further than he does or from mistakes already established in previous chapters and carried over here. Despite a few minor missteps, however, Hayek’s chapter is, overall, an exceedingly positive contribution.

He begins by stating what is, to many self-deceiving advocates of socialism, a jarring observation: that planned economies, following their natural course, ultimately always require dictatorial rule. “Most planners who have seriously considered the practical aspects of their task,” Hayek writes, “have little doubt that a directed economy must be run on more or less dictatorial lines” (66). Without fully restating the argument here, Hayek implicitly rests upon the description of this tendency that he spelled out in chapter 5, “Planning and Democracy”: power in a planned system gradually consolidates into a central committee or single dictator as a matter of organizational efficiency, with a decisive central leadership winning out over the gridlock and inefficiencies of a democratic body. The point is as valid and well made here as it was then.

Where Hayek expounds upon this is in refuting one of the false promises often made by planners as they reach for the reins of a country’s economy: “the consolation… that this authoritarian direction will apply ‘only’ to economic matters” (66). Contrary to the suggestion that controls will be limited to economic affairs, Hayek asserts that economic controls in the absence of broader political controls are not simply unlikely, but impossible. Rather than simply detailing in a typical way the interrelationship of economic and other activities, Hayek acknowledges the inseparability of the two, writing, “It is largely a consequence of the erroneous belief that there are purely economic ends separate from the other ends of life” (66). He later elaborates:

“The authority directing all economic activity would control not merely the part of our lives which is concerned with inferior things; it would control the allocation of the limited means for all our ends. And whoever controls all economic activity controls the means for all our ends, and must therefore decide which are to be satisfied and which not. This is really the crux of the matter. Economic control is not merely control of a sector of human life which can be separated from the rest; it is the control of the means for all our ends” (68).

Hayek’s point is, in the context of modern economic education, a largely underappreciated and mishandled one. Economics instructors have, with time, lost the important skill of contextualizing economic interests within the broader scope of other human pursuits, instead either treating them as abstract ideas toyed with in a vacuum, without real-world ramifications, or preaching the ‘economics is everything’ doctrine to the exclusion of other analytical tools and frameworks.

Hayek, whether by virtue of writing at a time less bound by such false dichotomization of the field or simply due to his exceptional qualities as an economic thinker, successfully avoids both traps. “Strictly speaking,” he writes,

“there is no ‘economic motive’ but only economic factors conditioning our striving for other ends. What in ordinary language is misleadingly called the ‘economic motive’ means merely the desire for general opportunity, the desire for power to achieve unspecified ends. If we strive for money it is because it offers us the widest choice in enjoying the fruits of our efforts” (67).

Hayek rightly acknowledges money as a profoundly empowering economic good, calling it “one of the greatest instruments of freedom ever invented by man” that “opens an astounding range of choice to the poor man, a range greater than that which not many generations ago was open to the wealthy” (67).

Chapter seven goes on to briefly characterize the pervasiveness of central planning, and its propensity to spread to all areas of a society. Hayek recognizes that the much-eluded question of socialism-versus-capitalism is not simply one of which decisions individuals are to make for their lives, but whether the decision is to be theirs at all:

“The question raised by economic planning is, therefore, not merely whether we shall be able to satisfy what we regard as our more or less important needs in the way we prefer. It is whether it shall be we who decide what is more, and what is less, important for us, or whether this is to be decided by the planner” (68).

Those on both sides of the aisle in the United States today, who fail in so many matters to appreciate the distinction between individuals choosing the right thing for their lives and a government official imposing their choice (be it right or wrong) upon them, would do well to heed Hayek’s warning. Modern American political thinking, caught between an increasingly authoritarian left (taken directly from Marx and Rousseau, or updated via modern incarnations like Krugman, Sunstein, and Stiglitz) and a right that has yet to extend its limited government spirit to all areas of economics—much less censorship and social issues—has a great deal to learn from an Austrian economist’s words written some seventy years ago.

One element of central planning that utopian-minded young socialist idealists evade is that labor, being an input, must, in a controlled economy, be as controlled as any other good—if not more so. This does not mean simply the control of wages or the maintenance of unions. Ultimately, it means government control over the number of individuals in a given profession, conducted in the interest of keeping wages in a given field high and ensuring that there is an adequate supply of expertise to meet all of the economy’s needs. It means, at some point, dictating who can and cannot enter a given field of work.

Hayek writes,

“Most planners, it is true, promise that in the new planned world free choice of occupation will be scrupulously preserved or even increased. But there they promise more than they can possibly fulfill. If they want to plan they must control the entry into the different trades and occupations, or the terms of remuneration, or both” (71).

How many young socialists on college campuses across the country would not object to being torn from their chosen course of study and compelled to study for degrees in which they had no interest, to spend their lives in careers they did not love? That is the fate that they ask for, whether they recognize it as such or not. Would they accept it willingly? Would they “become a mere means, to be used by the authority in the service of such abstractions as the ‘social welfare’ or the ‘good of the community’” (72), bowing their heads subserviently to spend a life on a path that was chosen for them, for the good of society? Perhaps some. And perhaps others would recognize the nature of what they profess to believe in and renounce it. Either way, it is a reality that should be presented to them in those terms by those who see socialism for what it is.

Towards the end of the chapter, Hayek makes several key observations that would prove all the more true in the decades after his writing. He notes the declining tendency of socialism’s advocates to invoke its functional superiority. Gradually witnessing their system being discredited, but doubling down on their dogma, the socialists of the mid-twentieth century came to look less and less like those of the early twentieth century, who believed in the system as a technically superior model for society. Instead, their arguments turned egalitarian in nature, “advocat[ing] planning no longer because of its superior productivity but because it will enable us to secure a more just and equitable distribution of wealth” (74). Little did Hayek know how far that trend would go with the rise of the New Left and its legacies, stretching up to the present and the current American administration.

Finally, in another point that has proven all the more true since the time of his writing, Hayek recognizes that the extent of planning proposed by socialism, empowered by modern modes of control, is that much greater than the control and subjugation that occurred in the days of monarchy and feudalism. In reading it, one is brought to wonder how much greater that mechanism of control is today, with NSA surveillance, a growing regulatory state, and ever more executive agencies maintaining armed units to impose their rules, than it was at Hayek’s writing in 1943.

Hayek’s seventh chapter is a valuable and, for the same reasons, saddening one for the way that it makes us reflect upon the applicability of his words and ideas to our current political environment. Though our current condition is far from totalitarian in nature, the same principles apply, to a lesser extent, in all areas where government intrudes to control markets, alter incentives, or provide special advantages to some at the expense of others.

Human beings are rational animals. We respond to the incentives around us. In the presence of a government that seems increasingly, explicitly willing to toy with those incentives to alter our behavior to suit models and ideals for our lives that are not our own, how much do we lose that we never knew we had? In what ways are our options limited? Need it be by a government edict that tells a young man who would study to be a doctor that doctors are no longer needed, and he should apply to be an engineer instead? No. It may be as subtle as inflating the price of his education through government loan programs, regulating the field he seeks to enter, and subjecting him to entitlement programs that tell him that his life’s work is not his own; that he works and exists in the service of society as a whole. And at that point, the difference between our condition and the ill fate that Hayek describes becomes one not of kind, but of degree.

Thoughts on ‘The Road to Serfdom’: Chapter 1, “The Abandoned Road”

In Arts & Letters, Austrian Economics, Book Reviews, Books, Britain, Economics, Epistemology, Essays, Ethics, Historicism, History, Humane Economy, Humanities, Liberalism, Libertarianism, Literary Theory & Criticism, Literature, Modernism, Philosophy, Politics, Pragmatism, Western Civilization, Western Philosophy on September 11, 2013 at 7:45 am

Slade Mendenhall

Slade Mendenhall is an M.Sc. candidate in Comparative Politics at the London School of Economics, with specializations in conflict and Middle Eastern affairs. He holds degrees in Economics and Mass Media Arts from the University of Georgia and writes for The Objective Standard and themendenhall.com, where he is also editor.

This analysis is the second installment in a series of chapter analyses of Friedrich Hayek’s The Road to Serfdom. The previous analysis of Hayek’s introduction can be found here.

If Hayek’s introduction gave us a brief summary of the ideas and practices he is setting out to oppose and contextualized the progression toward a socialist political culture over the last half century of Europe’s history, his first chapter, “The Abandoned Road”, firmly roots his grievances in the present, in the problems facing England at the time of his writing, and seeks to explain how England (and the West more generally) arrived there. He describes the intellectual evasions, distortions, and faulty epistemology—often consisting of poorly defined key concepts—that led to and are, in his time, perpetuating the state of affairs he observes. He then proceeds to the subject of liberalism, arguing that socialists misconceive of it at least as much as they misconceive of their own system. In the process, Hayek makes many excellent observations, but also succumbs to several dangerous philosophical errors and unsubstantiated claims against laissez-faire capitalism that tarnish what might otherwise be an outstanding defense against government controls.

Hayek begins the chapter with one of the most argumentatively powerful, poignant approaches that one can take in opposing socialist ideas: illustrating to those who support more moderate, tempered versions of statist controls that though they may differ in degree from those statists they oppose, the philosophical fundamentals they advocate are the same. “We all are, or at least were until recently, certain of one thing,” he writes,

“that the leading ideas which during the last generation have become common to most people of goodwill and have determined the major changes in our social life cannot have been wrong. We are ready to accept almost any explanation of the present crisis of our civilisation except one:  that the present state of the world may be the result of genuine error on our own part, and that the pursuit of some of our most cherished ideals have apparently produced utterly different results from those which we expected” (8).

Hayek’s point is well made and much needed at a time when such widespread, utter contradictions were even more severe than they are today. Writing to Britons in the 1940s, but with as much truth to offer the Americans who stumbled over the same contradictions in the 1960s and 1970s, when the platitude “we are all socialists now” manifested on Nixon’s lips as “we are all Keynesians now” (and with less fundamental difference between the two than Keynesians would have you believe), he asks us to recognize that “the tendencies which have culminated in the creation of the totalitarian systems were not confined to the countries which have succumbed to them” (8-9). Nor, for that matter, are they confined to those times, and Hayek’s message to this effect—the importance of recognizing the same fundamental ideas across contexts—is as much needed today as it was then.

He goes on to recognize that the conflict between the Axis and Allied powers in World War II is fundamentally a conflict of ideas: “The external conflict is a result of a transformation of European thought in which others have moved so much faster as to bring them into irreconcilable conflict with our ideals, but which has not left us unaffected.” He is quick to point out, though, that “the history of these countries in the years before the rise of the totalitarian system showed few features with which we are not familiar” (9).

Such an appreciation for the motive power of ideas in human conflict was not so unique in Hayek’s time. In fact, the Allied leaders superlatively acknowledged the enemy they faced as “fascism” and condemned it explicitly (though the economic and social policies of FDR, along with his earlier overt flirtations with such ideas, may have made the condemnation somewhat ironic). If Hayek has a lesson to teach to this effect, it is most needed in today’s world, when the significance of philosophy is so frequently cast aside by the influences of multiculturalist nihilism and the failure, even in academia, to appreciate the role of broadly held cultural ideas in deciding man’s fate. At a time when the mention of a “clash of civilizations” invites accusations of oppressive Western chauvinism, Hayek’s acknowledgement that conflicting fundamental ideas may lead to actual conflict is a welcome reminder.

Much of the chapter appropriately looks to fundamental ideology as the cause of the rise of Nazism, seeing the rejection of individualism in favor of collectivism as a necessary prerequisite to the “National-Socialist revolution” and a “decisive step in the destruction of that civilisation which modern man had built up from the age of the Renaissance.” The spirit of this argument is undoubtedly sound. However, the method by which he argues it leaves much to be desired. Hayek proceeds down a path of questionable historical interpretations, a half-cocked swipe at moral philosophy (that, as we shall see, is flawed but not unfamiliar to readers of this site), and ultimately an incomplete case for the liberal policies he hopes to defend—showing the consequences of that brief glimpse of skepticism we witnessed in the introduction.

In his historical contextualization of the trends he observes, Hayek writes,

“How sharp a break not only with the recent past but with the whole evolution of Western civilisation the modern trend towards socialism means, becomes clear if we consider it not merely against the background of the nineteenth century, but in a longer historical perspective. We are rapidly abandoning not the views merely of Cobden and Bright, of Adam Smith and Hume, or even of Locke and Milton… Not merely the nineteenth- and eighteenth-century liberalism, but the basic individualism inherited by us from Erasmus and Montaigne, from Cicero and Tacitus, Pericles and Thucydides is progressively relinquished” (10).

Hayek’s invocation of these great names in the history of liberal thought is, in most instances, not misplaced. It is true that all emerged from Western civilization and that to varying extents they all fit well into the liberal, individualist tradition he means to illustrate. One would be wise to regard the inclusion of Hume and Montaigne, paragons of skepticism, as only conditional points on such a list, though Hayek’s own skepticism and that of many libertarians in his tradition would certainly allow them.

More broadly, however, it must be said that the individuals mentioned, no matter how great their contributions to political and social thought, were not often the rule in their place and time, but the exception. One can admire the works of Pericles, but should bear in mind the fickle reception he received among the Athenians. Likewise, Cicero may deserve praise above any in his time, but for those very virtues he was slaughtered without trial by a dictator who faced no consequences.

Thus, as admirable as Hayek’s examples may be, to suggest that they were the norm throughout most of Western civilization is unsubstantiated. They may have embodied those qualities that most distinguished Western civilization and have been most responsible for its progress, but it was a progress often achieved by much-abused minorities. The Renaissance, Enlightenment, and nineteenth century were the high-points of individualism and Western ideals, and Hayek is right in singling them out. However, he also runs the risk of obscuring the philosophical roots of National Socialism, itself the product of contrary trends in Western thought, by engaging in careless generalization from those high-points and distinguished individuals to Western history in general.

Departing from this somewhat problematic historical interpretation, Hayek moves through a favorable discussion of the benefits of economic and political freedom on scientific innovation. His recognition and argument that “[w]herever the barriers to the free exercise of human ingenuity were removed man became rapidly able to satisfy ever-widening ranges of desire” is incontestable (12). He also anticipates the common objections by socialist apologists today who characterize the Industrial Revolution as a period of oppression by citing the difficult living conditions of the urban poor. He rightly rejects this by contextualizing the period in the experiences and expectations of those who lived through it, writing that

“[w]e cannot do justice to this astonishing growth if we measure it by our present standards, which themselves result from this growth and now make many defects obvious. To appreciate what it meant to those who took part in it we must measure it by the hopes and wishes men held when it began… that by the beginning of the twentieth century the working man in the Western world had reached a degree of material comfort, security, and personal independence which a hundred years before had seemed scarcely possible” (12-13).

What follows is where Hayek finds himself on unsteady footing, as he briefly undertakes to explain what ideas diverted man from the individualist course set from the Renaissance to the nineteenth century. Inexplicably, Hayek credits an excess of ambition as responsible for the turn toward socialism. He writes,

“What in the future will probably appear the most significant and far-reaching effect of this success is the new sense of power over their own fate, the belief in the unbounded possibilities of improving their own lot, which the success already achieved created among men. With success grew ambition—and man had every right to be ambitious” (13).

He returns to the idea again later, writing that,

“Because of the growing impatience with the slow advance of liberal policy, the just irritation with those who used liberal phraseology in defence of anti-social privileges, and the boundless ambition seemingly justified by the material improvements already achieved, it came to pass that toward the turn of the century the belief in the basic tenets of liberalism was more and more relinquished” (14-15).

It is here that Hayek’s inadequacy in analyzing philosophical ideas, and perhaps an economic bias toward looking at matters purely as a function of supply and demand, begins to show. The notion that an inadequate or insufficiently rapid provision of living standards by capitalism is to blame for the introduction and spread of socialism is baseless, as it not only commits the philosophical error of attributing a total change in fundamental beliefs to external conditions, but also ignores the fact that the introduction of socialist policies preceded the slowdown in quality of living improvements in the Western world—and, furthermore, that the slowdown still wasn’t all that slow, as anyone who looks at world history from 1870 to 1928 will readily observe.

Thus, Hayek’s notion that “ambition” is somehow to blame is irrational. If we accept the notion that capitalism was responsible for man’s improved quality of living, then the only function that ambition should serve in this context is to drive men back toward capitalism and its fundamental values—not toward socialism. To the contrary, it is not an excess of ambition that drove men away from capitalism, but the fact that the philosophical principles that underlie and empower capitalism were not consistently established in the minds of its practitioners in the first place. That is: those who lived under capitalism had not explicitly embraced reason as man’s means of acquiring knowledge, nor rational egoism as his proper ethical system, and thus lacked the fundamentals on which individualism rests. Thus, ultimately, the individualism that Hayek admires was present in the West, but not firmly rooted enough to survive the philosophical revival of Plato in the forms of Kant and Hegel. Undercut by their philosophies, in the face of Marx and Engels the West was a pushover.

Hayek’s invocation of excess ambition as an explanation for socialism shows that though he understands the role of political ideology in man’s fate, his ability to explain how that ideology stems from deeper levels of philosophy is severely lacking. Unfortunately, he does not allow this lack of expertise to stop him from making such baseless speculations as to the roots of socialism being in man’s ambition, nor from making a similarly arbitrary and more dangerous conjecture: that the essential quality that animated the Renaissance and Western civilization’s embrace of individual man was “tolerance.”

“Tolerance,” he writes, “is, perhaps, the only word which still preserves the full meaning of the principle which during the whole of this period was in the ascendant and which only in recent times has again been in decline, to disappear completely with the rise of the totalitarian state” (3). Hayek offers no further explanation to support this statement or the implication that tolerance was the animating virtue of these times, or at the very least played some crucial role in it. Nor does he illustrate the point with citations or examples. The claim stands alone.

We are thus left to speculate as to his actual beliefs on this point. However, a look at a somewhat younger contemporary libertarian economist who dabbled in political writings such as this and who shares certain philosophical fundamentals—namely, a skepticist epistemology—may shed some light on the claim. Milton Friedman similarly cited ‘tolerance’ and, more specifically in Friedman’s case, “tolerance based on humility” as the fundamental basis of his libertarianism. That is: the rejection of statism based not on the rights of individuals but on the fact that no one can rightly initiate force against another, since the initiator has no basis by which to know whether the cause in whose name he would initiate that force is right or wrong. Put simply, it establishes a social system in which peaceable relations between men depend upon the impossibility of establishing objective principles. In which ignorance, not knowledge, is man’s saving grace. In which moral certainty is perceived to be the root of all tyranny.

(I will not go further into Friedman’s confused moral philosophy here, though the reader is encouraged to reference my article “The Failures of Milton Friedman” for a fuller explanation of his views and the dangers they entail.)

Whether Hayek’s implication in citing “tolerance” as the great virtue lost by the rise of collectivism is in line with Milton Friedman’s connections of “tolerance” and libertarianism is unknown, but the fact that the two men share a skepticist epistemology and both ultimately land at the same word to describe the virtue that they see to be animating their ideals cannot be ignored and provides a possible explanation for Hayek’s unsupported statement.

Where skepticist epistemology and haphazard forays into moral philosophy are found, an incomplete defense of freedom usually follows. So it is here with Hayek, who shows us precisely his conception of freedom and how it should be fought for, writing, “There is nothing in the basic principles of liberalism to make it a stationary creed, there are no hard and fast rules fixed once and for all. The fundamental principle that in the ordering of our affairs we should make as much use as possible of the spontaneous forces of society, and resort as little as possible to coercion, is capable of an infinite variety of applications” (13).

I will not engage with this statement directly, as the case against it has been soundly argued elsewhere, in essays from this publication such as “The Philosophy of Capitalism” and Brian Underwood’s “Political Capitalism”, as well as in Ayn Rand’s essays “Man’s Rights”, “The Objectivist Ethics”, and “The Nature of Government.” I will observe simply that for a man accepted by many as symbolic of twentieth-century liberalism to take such a pragmatic, unprincipled approach to the defense of freedom stands as much as a symbol of the unsteadiness and missing moral basis of that movement as it does a condemnation of the man himself. What’s more, it shows that no sound defense of liberty can be based on a skepticist epistemology. A defense of man begins with an admiration for man and his nature as a rational, efficacious being. Whoever hopes to undertake a task so daunting and so crucial as a defense of man’s rights against oppression cannot enter the fray with a puttering “Who knows?!” as his battle cry.

It is the inevitable fate of such pragmatists that they should ultimately abandon a strict conception of liberty and that they should shrink principles down to the level of momentarily expedient guidelines to be cast aside at the first sign of opposition. We must be immensely grateful that the Founding Fathers of the United States had the moral basis to recognize and firmly assert the rights of “life, liberty, and the pursuit of happiness”, yoking future statesmen to these principles rather than settling for such a shrugging recommendation that they “make as much use as possible of the spontaneous forces of society.” We must be proud that Jefferson swore “an oath upon the altar of God eternal hostility against every form of tyranny over the mind of man”, and not merely an oath to “resort as little as possible to coercion.”

The distortions, sadly, do not end there. Hayek confounds our expectations further by seeking to balance his critique of socialism with a contrary charge against advocates of full individual rights, writing that “[p]robably nothing has done so much harm to the liberal cause as the wooden insistence of some liberals on certain rough rules of thumb, above all the principle of laissez faire” [emphasis mine] (13).

Hayek’s ambiguous accusation against advocates of laissez-faire, that they are somehow partly responsible for the rise of socialist policies, apparently rests on the capitalists having viewed the principle as a “hard and fast… rule which knew no exceptions” (13). He goes on to attribute the downfall of liberalism to the liberals’ strict adherence to the laissez-faire principle, finding it “inevitable that, once their position was penetrated at some points, it should soon collapse as a whole” (13).

At this point, Hayek quickly reveals several key implications: that advocates of laissez-faire are partly responsible for the rise of socialism, that laissez-faire is a flawed system, and that its legitimacy has indeed “collapse[d]” through being disproven. He continues, “No sensible person should have doubted that the crude rules in which the principles of economic policy of the nineteenth century were expressed were only a beginning, that we had yet much to learn, and that there were still immense possibilities of advancement on the lines on which we had moved” (14).

To be clear: Hayek is not referring to changes in the application or translation of the existing principles, but to a shift in principles as such. ‘What,’ one must ask, ‘could have changed so drastically in the period in question as to make the basic principles of economic freedom no longer relevant or applicable when they had been so in the period before?’ According to Hayek, it was the inevitable result of having

“gained increasing intellectual mastery of the forces of which we had to make use. There were many obvious tasks, such as our handling of the monetary system, and the prevention or control of monopoly, and an even greater number of less obvious but hardly less important tasks to be undertaken in other fields, where there could be no doubt that the governments possessed enormous powers for good and evil;” (14)

Thus, Hayek posits that our “increasing intellectual mastery” (though I can think of a century of economic instability, brought about primarily by government controls, that would refute this alleged “mastery”) is to be credited for government intervention in the economy. He implies that the belief that governments could regulate the economy by force somehow translates to the presumption that they should do so—a significant leap that Hayek does not and cannot, without reference to philosophy, explain. Not only does this misconceive of the problem; it carelessly implies that those statesmen of earlier times did not intervene in the economy because they could not conceive of how to do so. To the contrary: earlier liberal thinkers did not plead ignorance in the face of proposed interventionism—they opposed it on principle, and suggesting otherwise is a discredit to their defenses of liberty.

Hayek’s passing statements apparently endorsing the “control of monopoly” and his suggestion that “the governments possessed enormous powers for good and evil”—that is, that good could be achieved by force just as surely as evil—only add layers to the disappointing picture established thus far. He goes on to make an unconvincing argument that the slow pace of economic progress under liberalism was to blame for people having turned away from it—a confounding claim to make about a century that witnessed the most rapid and dramatic rise in quality of life in the history of humankind, and one that even Marx himself would likely have disputed as unsubstantiated.

Finally, he ends the chapter on an agreeable note with a brief description of how the geographical flow of ideas—from Britain and the US east to continental Europe—reversed at this period in history and the prevailing current turned westward, exporting German socialist ideas to the Atlantic. He astutely summarizes how the ideas of Marx, Hegel, List, Schmoller, Sombart, and Mannheim overtook the intellectual tone set by the English after 1870. He ends on the essential point that it was ultimately the lack of confidence in their own convictions by Western thinkers that made this shift possible. In this effort—narrating the history of philosophical and cultural trade balances—Hayek is excellent and displays the power of which he is capable when he remains in his purview, capitalizing on his unique perspective.

After a promising introduction, the first chapter of Hayek’s book has proven shaky at best. The flaws are numerous and fatal: a questionable interpretation of the histories of both liberalism’s origins and socialism’s ascendance, a dangerously inadequate grasp of the role of moral philosophy in the histories he details, a desire to blame liberalism for its own destruction with insufficient substantiation, a skepticist rejection of principles that leads to a pragmatist’s approach to policy, and, finally, a rejection of laissez-faire capitalism.

To his credit, Hayek handles matters of economic history well overall, arguing effectively for the role of capitalism in promoting scientific progress and advances in standards of living. However, his suggestion that advancement in the nineteenth and early twentieth centuries was slow, and that this slowness of progress is to blame for the West’s acceptance of socialism, is largely without a supporting argument, is contrary to the unrivaled history of economic progress that we know to have characterized that period, and, incidentally, indulges a determinist philosophy that, judging from the introduction, he had seemed likely to avoid—a serious point of inconsistency.

Overall, Hayek’s first chapter is a dramatic step down from the introduction and a disappointment considering the reputation of the book. It is, in its own way, an abandonment of the road, if in a slightly different direction from that taken by those whom Hayek criticizes. Though future chapters may redeem the work to some extent, the fact that so much ground is lost in the first few pages is a severe blow, but one in keeping with the suspicions we noted in assessing the introduction and warned readers to be on the lookout for. It illustrates well the consequences of even small cracks in one’s intellectual foundation and confirms the value of applying careful philosophical detective work in reading works such as this, no matter their reputation.

Unmasking

In Arts & Letters, Creative Writing, Essays, Humanities, The South, Writing on August 14, 2013 at 8:45 am

This essay first appeared here in Kestrel: A Journal of Literature and Art.

Allen Mendenhall

There is no remembrance of former things; neither shall there be any remembrance of things that are to come with those that shall come after.

                                                                                           Qoheleth 1:11

Southerners are particular about the way they preserve their loved ones; they encourage embalming, for instance, although at one time they shunned it as unconsented-to tampering with the body.  Eventually someone decided, rather wisely, that the deceased, had they a choice, would like a genteel display of their “shell.”  This meant more than sanitization: it meant dressing the dead like ladies or gentlemen on their way to church.  Which is precisely where they were going—just before they were buried in the ground.  For the most part, Southerners don’t cremate.  (A preacher once told me that the Bible discourages cremation.)

In the South—more than in other regions—funerals are hierarchical affairs: one’s nearness to the deceased signifies one’s importance to the family.  This holds for the church and burial service and is especially true if the departed was popular in life.  Being closest to the deceased, pallbearers shoulder the weightiest burden.

Nowhere is decorum more important than at a funeral procession.  It’s unseemly for one who’s not party to the procession to fail to bow his head and arrange a grave face as the procession passes.  If you’re in a vehicle, you pull over to the curb and, so long as it isn’t dangerous to do so, take up the sidewalk as if on foot.  Quitting the vehicle is, in general, inadvisable if by the time you encounter the procession the hearse is no longer in sight.  Or if, alternatively, the weather doesn’t permit.  If you’re in line, the modus operandi is ecclesiastic—ordered from clergy, to immediate kin, to next-of-kin, to distant family, to friends, to the rest.  Losing your place in line is, accordingly, like losing your intimacy with the family, for whom these rituals are carried out.

I was eight when Great-Granddaddy died.  Mom piloted me before his open-casket and whispered, “That’s not Great-Granddaddy.  That’s just a shell.  Great-Granddaddy’s gone to heaven.”

I looked down at the thing, the shell, the facsimile that seemed uncannily human, and said to myself—perhaps out loud—“That’s not Great-Granddaddy.  That’s something else.”  But the thing appeared real, strange, so nearly alive that it repulsed me.  Its eyes, thank God, were closed, but its mannequin face, vacant and plastic, nauseated me.

Mom prodded me away, hollering at my cousin to take me outside.  My first brush with death, while necessary, had not imparted a healthy understanding of mortality.

My grandmother, Nina, tried to familiarize me with the inescapable while I was still a boy.  Instead of taking me to playgrounds, she took me to cemeteries for what she called “Southern preparations.”  These outings usually occurred on warm spring afternoons, when azaleas bloomed bright white and pink, when yellow Jessamine vines crawled up walls and fences, when dogwoods yawned inflorescent, and when tulips, still un-beheaded, stretched with impeccable posture.  When, in short, nature was doing anything but dying.

Nina shared facts about various grave plots, giving the lowdown on so-and-so’s passing—“he died in Korea,” “he of AIDS,” “she during pregnancy,” and so forth.  When she finished, we fed the swans.

Which attacked me once.  I was standing on the riverbank, feeding the once-ugly ducklings by hand just as Nina had taught me, when, like Leda, I was enveloped by a feathered glory of beating white wings.  Traumatized, I no longer stood on shore but sat on the roof of the car.  To make me feel less sissy, Nina sat on the hood and pretended that she, too, was afraid.  It wasn’t their size exactly.  Nor the way they tussled with graceful wrath.  Maybe it was the mask about their swan eyes.  I’m sure it was that: the concealment, secret identity, veiled feelings.

Just before I got married, my fiancée, Giuliana, flew in from São Paulo to meet my family.  After supper, Nina insisted that I drive her through the cemetery.  I hadn’t been in years but instantly recognized the wrought-iron gates that once seemed so colossal.  There was the river.  The ducks.  The swans.  In the distance, a family, their heads bowed, stood under a high green tent.

Giuliana was not disturbed by this detour.  Quite the contrary:  she felt in some way moved.  It was as if Nina had invited her into a private, intimate space: one that contradicted this modern world of medical science in which everyone tries to postpone or avert death.  In a cemetery one couldn’t help but think of decomposition, permanence, the soul.  One couldn’t help but track the beat of one’s heart, measure the inhales and exhales of one’s breathing.  One couldn’t help, that is, but cherish the fact that one’s alive.

My cell phone buzzed.  An unknown number flashed across the screen.  I answered, “Hello?”

“Mr. Mendenhall?”

“Yes.”

“Are you in the car?”

“No.”

“This is the cancer center at St. Joseph’s Hospital.  We need you to come in.”

I was twenty-four, and about to hear, “You have cancer.”

Nothing—not even a Southern upbringing—can prepare you for those three words.

The odd thing about preachers is that, depending on time and place, their company is either most welcome or most unwelcome.  When I got the call, the cancer call, my uncle, a preacher, was beside me, and for that I was glad.  He made me feel the power of presence, to say nothing of companionship:  I was not alone.

My uncle—Uncle Steve—preaches in the only Southern Baptist church in Chicago.  Unlike most Southern Baptist preachers down South, he eschews the noisy and spectacular, preferring, instead, politesse and restraint.  Bookish and professorial, his voice nasal, his nose suitably sloped to hold up his saucer-sized spectacles, he loves theology and will tell you as much at the drop of a hat.  What with his general softness, he might, with a bit more age, have been mistaken for Truman Capote, with whom, incidentally, his father—my grandfather—had grown up in Monroeville, Alabama.

A man of custom, a student of Latin and Greek, fluent in Russian and French, a former lawyer and journalist, Uncle Steve is uncommonly qualified to carry on the sanctifying traditions of Western Civilization.  He is, in short, a gentleman and a scholar.  And he was in Atlanta that day, standing in the Varsity parking lot, his belly stuffed full of chili dogs, his ketchup-smudged face like an advertisement for this, the world’s largest drive-in restaurant.

I could feel his gaze moving over me and spared him the discomfort of asking what was the matter.

“I have cancer,” I said.

As the words issued from my mouth, my chest felt as though someone were driving a stake into it.  Cancer.  That thing other people got.  Old people.  Not young and healthy people.  Not me.

I tried to act normal, but in doing so betrayed what I really felt—terror.

Uncle Steve put his arm around me.  “Come on.  Let’s get to the hospital.”

Every hour on the hour, the employees of St. Joseph’s Hospital pray together.  These moments, though heavily orchestrated, bring peace to the ill and dying, the sick and suffering.  The nurses and doctors who wander the hallways pause while a disembodied, female voice recites the Lord’s Prayer, first in English, then in Spanish.  “Our Father, who art in heaven…”—the words echo off the cold, linoleum tiles—“hallowed be thy name.”

This was happening when I walked into the waiting room.  A nurse, a heavyset black woman with the softest eyes I’d ever seen, was behind the counter, her necklace, weighed down by a tiny crucified Jesus, dangling at her pillow-like breasts.  She whispered, again and again, amen, amen, and then, looking up, took me in with those deep knowing eyes, spoke without speaking.  Sunlight streamed through the cool, trapezoid panes of glass in the ceiling, falling across her face and hair at a low angle.

At last the prayer ended.  She unfolded her hands and smiled formally.  “Good afternoon, how may I help you?”

Responding with “I have cancer” didn’t feel right, so I said, “I’m here to see Dr. Danaker.”

That was all she needed to know.

“Bless your heart, child,” she said.  And, for the first time, I got emotional.  She hugged me, calling me child again; then, right then, I wanted to be a child, wanted her to scoop me into her arms and cradle me, wanted her thick, strong body wrapped around me; but there, too, was Uncle Steve, dignified and collected.  I couldn’t break down in front of him.

The nurse ushered me into a white, windowless room with expansive tile walls and sat me on a tissue-papered chair, which swished and crackled whenever I readjusted my derrière.

There I was.  Conscious.  Being, yet trying to fathom not being.  I imagined myself in a coffin, like that horrid shell, Great-Granddaddy.  Which only made things worse, for I knew that, once in the coffin, I would have no notion of being there.  The problem was thinking itself.  I couldn’t imagine being dead because I couldn’t imagine not imagining.

On Sunday mornings, before church, dad had always made my siblings and me read from the obituaries.  This, he said, would acquaint us with the fragility of life.  He also thought the best way to learn was from experience.  But he’d known only one person who’d experienced death and, almost impossibly, lived to tell about it—Martin, a friend of the family, who’d apparently died three times and, on the operating table, been revived.  Martin loved cigarettes, which he called the backbone of the Southern economy and which, he readily admitted, had brought about his three near-fatalities.

Except Martin didn’t put it in those terms.  To him, cigarettes had allowed him to float outside his body for a while, to see what death was like.  For better or worse, Martin didn’t tease a tunnel of light, greet a golden angel, or feel a fluffy cloud:  he simply “left” himself and, in a state of utter weightlessness, peered down on his body as would an outside observer.  Maybe that’s why dad didn’t like us talking to Martin about death: Dad wanted us to hear about St. Peter and heaven and departed relatives.

The trouble with Martin was that one never knew when to believe him.  Heck, we barely knew who he was.  Ephemerally at least, he’d been my aunt’s boyfriend; then, after she dumped him, he never went away: he moved in with my other aunt, a single mother, and helped care for my young cousin.  Martin was present every Thanksgiving and Christmas, but neither got nor gave gifts.  A transplant from North Carolina, he had daughters somewhere—in either the Carolinas or Virginia—and had graduated from the University of North Carolina at Chapel Hill, an achievement he was quite proud of.  He didn’t work.  Didn’t own a car.  And didn’t seem to have money.  His singular ability to access death could’ve been, for all we knew, lifted from a sci-fi novel.  Nevertheless, I believed him.

Ten.  That’s how old I was when I saw a dead body I wasn’t supposed to see.  A right turn on I-85, heading north, highway stretching to where sky and land sandwiched together.  I was in my school outfit, backpack in my lap.  Mom was in her tennis getup, checking the rearview mirror.  Traffic was slowing and stopping.  To my left was a vast gray sheet held up by blank-faced men.  Behind it, a woman.  Or what was left of a woman.  Arms and legs bent at impossible angles; head sagging, possibly unattached; a bloodied skirt lifted by the breeze.  Someone’s mom.  Or sister.  Or wife.  Or girlfriend.  Or daughter.  Here one minute, gone the next.  This wasn’t dignity.  This was mean and messy.

Death, they say, is not only universal but also the great leveler: it befalls kings and paupers, rich and poor, wise and foolish.  Solomon, Caesar, Constantine, Charlemagne, Napoleon: all died despite their glory in life.  What I never understood, and, frankly, still don’t, is why folks pretend death doesn’t happen.  The person who ignores death is delusional at best, narcissistic at worst.  Death is our sole commonality, the thing in this world we all await, about which we may commiserate.  It’s what makes us human.  I daresay one can’t fully love a person without knowing that person is temporary.

Francis Bacon once declared, “The contemplation of death, as the wages of sin, and passage to another world, is holy and religious; but the fear of it, as a tribute due unto nature, is weak.”  Weak it may be to the healthy and fit, but to the ill and ailing it seems only natural.  The person who claims he doesn’t fear death is either a liar or an incorrigible maniac—or else a coward, too faint-of-heart to face the facts.  Bacon himself had the good fortune of dying in two to three days, having contracted pneumonia while conducting an experiment in the snow.  Willfully blind to his fate, lying on his deathbed, he penned a letter to his friend, Thomas Howard, expressing relief that he hadn’t suffered the fortune of Caius Plinius, “who lost his life by trying an experiment about the burning of Mount Vesuvius.”

After surgery, I, like Bacon, was bedridden.  Soon a phone call would tell me one of two things: that I was okay, my cancer hadn’t metastasized, or else I wasn’t okay, I needed chemotherapy and my chances of living another two years were below fifteen percent.  A glued-together wound, resembling fat, blue, puckered-up lips, took up the length of my chest.  Visitors asked to see it and then regretted their request when I rolled up my shirt, revealing a moon-shaped, smurfy smile.  When the visitors left, and I was alone again, alone and quiet, I imagined what the malignancy would look like as it spread through my body, which I conceived of as a mini minefield: tunneled with small explosive cancer clusters about to be detonated.  How could this shell—which once ran a mile in under four-and-a-half minutes—expire?

I’m not in my brain but somewhere lower: near the chest, maybe, or the gut.  I couldn’t, for instance, stop a dream even if I wanted to.  Which is odd, because it’s my brain that’s dreaming—not someone else’s.  The brain works independently of me, or, to be precise, of what I perceive to be me: it’s like an unmanned motor boat zipping on the water.  Occasionally one of my siblings, or an old friend, will recall some long-ago event, which I’d otherwise forgotten, and then, suddenly, I’ll remember.  The brain has stored this memory somewhere—somewhere not readily accessible—but I, wherever I am in this shell, never felt compelled to find it.  The thought just exists up there, waiting.

It’s the soul, I suppose, that’s me.  When I lie awake at night and contemplate this interim body, which I inhabit the way a renter inhabits an apartment, I locate my self—that subjective knowing ego—whole and center, as though the brain, convenient as it is, has a mind of its own.  To be sure, I can borrow this organ when I study or otherwise require deep reflection; but when I tire of thinking, when I want a break, when I lean back from my desk, I’m very aware that I, my self, am moving from the head to just above the torso, where I belong.  And when I experience joy, compassion, anguish, despair—when, that is, I feel—it’s never with my head but with something deep within my bosom.  How does one explain this?  Perhaps we’re all antecedent to the body: little floating things confined to this definite, corporeal form we didn’t choose, waiting, like thoughts, to be accessed—or released.

Opossums, more commonly known, in the South, as “possums,” are, I’m told, a delicacy.  Nina’s got a cookbook that says so, though she claims she’s never cooked or eaten one.  I have my doubts, since my dad grew up eating squirrel, which, I think, is more revolting because squirrels are cute and handsome, whereas possums have that eerie look I associate with demons and devils—and masks.

At seven, I persuaded my brother to take a life.  A possum’s life.  It was a horrible affair, really.  One that, even today, is difficult to own up to.  Brett, being the gullible little brother he was—I convinced him once that the shadow-puppet giant who lived on the ceiling would kill him in his sleep—stomped on a squeaking pile of pine-straw while I looked on, presumably to punish him if he disobeyed.  Of course, the squeaking didn’t belong to the pine-straw, but to a tiny nest of baby possums underneath.

For some reason, I was initially proud of what I’d done, and, hours later, said as much to my mom.  Horrified, she made me show her the nest, since I’d “cried wolf” before.  Sure enough, there, in the pine-straw, lay a bloody baby possum, whimpering, dying.

My first defense was I hadn’t done anything.  Brett had.  I’d simply stood by and watched.  Mom was smarter than that.  I don’t remember what she said—only that, once she said it, I began to cry.  And couldn’t stop.

It was this event, this murder of an innocent, that brought about my general appreciation for original sin, or at least for the idea of innate human depravity.  Humans, you might say, are born rotten—so much so that most of us, in our youth, could stomp infant possums to death without understanding the wrongness of our action.  No doubt I regretted this behavior—this actus reus—but not because I felt guilty: it was, in effect, because I feared punishment—some combination of mom’s wrath and her spank-happiness.  A parent’s role is, among other things, to tame a child’s destructive impulses.  That’s what mom did—without succumbing to her own elemental aggressions.

She called the Chattahoochee Nature Center, a local environmental organization, and a worker there explained how to save the baby possum.  This, then, became my task, my agonizing punishment: to keep the possum alive.  Being intimate with death is one thing; being intimate with suffering quite another.  When I scooped the trembling creature up to my palm, it emitted a sad, pitiable squeak.  “Everything’s okay,” I whispered, “I’m not here to hurt you”—a funny assurance coming from the kid who’d just ordered its murder.

If truth be told, I wished I’d just destroyed the thing.  Better dead than in this wretched condition.  Still, the way it looked at me—its beady, searching eyes perusing my face—reminded me of how Ansley, my little sister, then only a year old, looked up at mom when she wanted to be fed.

I placed the creature in a shoe box, which I tucked beneath a shelf in my parents’ closet, the darkest place in the house.  More than anything, the possum needed darkness and silence.  I dug a hole in the backyard, tied two twigs together in the shape of a cross, and arranged a constellation of stones around what would’ve been a grave.  But the thing didn’t die.  It healed so well that, the next morning, it was squirming and scurrying and dad needed a net to contain it.  Even after the possum was free in the backyard, I left the grave untouched, a reminder that all things, even possums, eventually come to an end.

My Southern upbringing was all about learning how to die.  Like the Greek Stoics, Southerners believe in cultivating virtue, improving life, and, above all, accepting mortality.  Liberated from urban distractions, tied to land and home, they regard humans as custodians of the past; they keep gardens, preserve antiques, record lineage, mark battlefields, and salvage the efforts of planters, carpenters, raconteurs, and architects; they ensure, in short, the availability of history.  This can lead to nostalgia for times they never knew, bad times, ugly times, which is to say that this can cause Southerners to overlook—or, worse yet, revise—the inconveniences of history: slavery, for instance, or civil rights.  All the same, the Southern tradition, burdened as it is by various conflicts, retains virtues worth sustaining: community, family, religion, husbandry, stewardship.  These customs, however vulnerable, hardly need guardians.  They will, I suspect, persist, in some form or another, as long as humanity itself; for they are practical, permanent ideals—tested by generations—which people fall back on during disorienting times.  In a region haunted by racial brutality, these principles are, and have been, a unifying reference point, a contact zone where cultures—black, white, and Hispanic—share something spiritual despite their differences.

Living history, not just studying it, but consciously living it, is neither wicked nor wrong; the chronic, urgent awareness that everything you know and love will come undone, is not, I think, misguided, but utterly essential.  There’s something beautiful about facing the insurmountable.  When the world’s fleeting, death becomes a liberating, albeit terrifying, reality.  It throbs and pulsates and beats beneath the skin, inside of which we’re all raw skeleton.

For all this, however, I wasn’t ready.  Didn’t want to die.  Couldn’t even conceive of it.  The twenty-something years my family had been teaching me about death amounted to, not nothing, but not much, either.  Death, I suppose, is a hard thing to accept, and an even harder thing to fight, since fighting seems so pointless: deep down, you know you can’t win.  You might prevail once.  Maybe even twice.  But ultimately it’ll beat you.  It almost did me.

Friends ask how it feels to “beat” cancer.  I never can answer—not satisfactorily—for the experience is more like submission than competition: it’s a manifold process of coming to terms with the body, a thing doomed to decay.  When the doctor—Dr. Danaker—called to say the lymph nodes were benign, that the cancer hadn’t spread, I shocked him with a tired reply:  “Oh, good.”

“This is great news,” he assured me, as if I needed reminding, as if I hadn’t appreciated—indeed, hadn’t understood—how lucky I was.

“I know,” I said.

At this, the good doctor seemed annoyed.  “Ungrateful kid,” his tone implied.  But I wasn’t ungrateful.  Nor ecstatic.  I was, simply put, unbound—by life, by people, by things.  His take was that I had another chance, a fresh start, that I could put this nonsense behind me and move on.  My take was that, having embraced impermanence, I was done protecting myself from suffering, done seeking security through delusion, done dislocating from fate, destiny, providence, what have you.

Done: this, it is true, is weary resignation.  Yet it’s more than that: it’s a sweet but unhappy release, a deliverance, an unmasking.  Almost paradoxically, it’s freedom within—and despite—limitation.

What’s more exhilarating than that one should die?  What’s more mysterious, more horribly electrifying?  As one writer, Paul Theroux, has put it, “Death is an endless night so awful to contemplate that it can make us love life and value it with such passion that it may be the ultimate cause of all joy and all art.”  That is how you cope with this chilling, daunting, stupefying phenomenon: you do it every day until it’s serviceable and aesthetic, until at last you won’t know, can’t know, when it happens, until it’s pleasurable, a masterpiece, sublime in its regularity.  You keep it close, so close it becomes part of you, so close it’s at your disposal, so close that without it, you’re nothing, nothing if not boringly, thoughtlessly, mechanically alive, which is just another way of being dead.  You train and train and then it comes.

To Educate in the Permanent Things

In Arts & Letters, Books, Essays, Fiction, History, Humanities, Literary Theory & Criticism, Literature, Politics, Walt Whitman, Western Philosophy, Writing on March 20, 2013 at 8:18 am

Allen Mendenhall

This article originally appeared here in The American Spectator.

In his State of the Union address last month, President Obama proposed changes to preschool, high school, and college education. His proposals generated praise and condemnation from the predictable cheerleaders and naysayers. Some celebrated his efforts to expand early childhood education; others suggested that he should have focused more on the student loan crisis; still others, not to be outdone, pointed to school funding, teacher salaries, grading, standardized testing, technology, and foreign study as the pressing issues that he neglected to address in sufficient detail.

Everyone, it seems, has an opinion about how to improve American education from the top down. But positive change rarely happens through centralized design; it arises spontaneously through the interaction of human agents operating within and among social groups. The State cannot plan and then promulgate a proper education, and legislative enactments cannot reflect the mores and traditions of local groups with differing standards and expectations. The most prudent and humble proposals for improving education are not couched in statist, Platonic terms about civic education and human perfection; instead, they approach learning modestly, on the individual level. They entail the everyday interactions between teachers and students. They are not stamped with the approval of politicians, unions, think tanks, or interest groups.  They take place in the classroom, not the public square. A teacher anywhere, whatever his station, school, or background, can implement them in his course without disrupting the pace or provoking the ire of the educational establishment. The best of these, because it is so easily executed, is simply to teach what T.S. Eliot, and Russell Kirk after him, called “permanent things.”

The permanent things are the inherited principles, mores, customs, and traditions that sustain humane thinking and preserve civilized existence for future generations; their canonization in literary, philosophical, religious, and historical texts happened and is happening in slow degrees. We can trace the permanent things through curricula that emphasize the ultimate values of prosperous societies. An informed, laborious study of the perennial themes and archetypal patterns in what are variously denominated as the Great Works, the Western Canon, or the Classics can help us to organize and make sense of the permanent things. There are those who would object that this approach seems too hopeful and ideal. But no one has suggested it as a panacea, of which there are none, and anyway, is there a proposal that could be simpler, more straightforward, and more workable than assigning and discussing the Great Works?

As early as 1948, Eliot remarked that “there is no doubt that in our headlong rush to educate everybody, we are lowering our standards, and more and more abandoning the study of those subjects by which the essentials of our culture—of that part of it which is transmittable by education—are transmitted; destroying our ancient edifices to make ready the ground upon which the barbarian nomads of the future will encamp in their mechanized caravans.” It might be asked just who these barbarian nomads are and why we ought not to welcome their cultural practices and assumptions. The barbarian nomads could be, I think, any group lacking in historical perspective and mostly ignorant of the illuminating continuities that have guided our weightiest and most imaginative thinkers. The practices and assumptions of these nomads are not grounded in lived experience but aimed at utopian projects such as ensuring equality, creating fundamental rights, or eliminating poverty, and, to the extent that these practices and assumptions deviate from enduring norms, they cannot be said ever to have flourished.

To study the permanent things, on the other hand, is to consider the prevailing and profound ideas from certain times and schools in relation to other such ideas from various times and schools throughout successive eras. It is to map the course of perennial ideas and to examine how they apply to different settings and generations. It is both synchronic and diachronic in its approach. Its chief benefit is to put ideas into context, which is to say, to make us aware of the presuppositions and perspectives that necessarily arise from our social, cultural, and historical situation. Each thinker lives in his own specific era and place and cannot gain knowledge in a vacuum outside of time; our era and place shape the manner in which we think and restrict our ability to imagine conditions beyond our immediate and tangible experience.

This is not to submit that our ideas are determined for us, only that we enter into experience with certain perceptions over which we have no control. They are there because of the conditions present at the time and place in which we exist. A sustained study of the permanent things will show us that our perceptions are not totally alien to those of our predecessors, although they differ. It also teaches us to compensate for our prejudices and to avoid thinking that our necessarily limited perspectives are unconditionally true and universally acceptable, even if they have verifiable antecedents. It reveals, as well, that schools of thought cannot simply be deemed later versions of earlier schools just because the two are in agreement about certain points. Finally, although we cannot escape those presuppositions that are embedded in our thought and culture, being alert to their probable existence can counteract their possible effect.

A rigorous study of the permanent things provides a lodestar for evaluating particular ideas against that which has been tested and tried before. Ideas that seem new always have traceable antecedents, and individuals equipped with a fundamental knowledge of the permanent things are able to situate purportedly novel ideas alongside their forerunners. These individuals recognize that change is not always progress; sometimes it is decline, deterioration, or decay. Only a sense of the continuities of history and thought can demonstrate the difference. Our political pedants in general and President Obama in particular insist on recognizing and implementing new institutions as if a radical departure from historic standards and established customs were itself the mark of good and lasting policy. Yet the permanent things show that even the most exceptional thinkers, those who represent the spirit of their age, whatever that might have been or might be, are part of a greater tradition.

It may be true that to study a particular thinker’s cultural milieu and biography is requisite to placing his ideas into their proper context and to highlighting the unacceptable premises of his philosophy; nevertheless, cautious interpreters ought to consider whether his thoughts necessarily lead to certain consequences, or whether the events that seem related to his thoughts arose accidentally, apart from his philosophy. Put another way, the cautious interpreter must carefully consider causation: whether theories actually generate particular circumstances, or whether those circumstances would have come to pass regardless of what the thinker spoke or wrote. Mussolini, for instance, praised William James, but it does not follow that anything James said or wrote endorses or enables fascism. He who would suggest otherwise betrays an ignorance of James’s work. The permanent things can help us to distinguish the true forms and implications of an individual’s thought from their appropriations by hostile forces.

By studying the permanent things, moreover, we learn that we cannot achieve the proper education through mere funding; nor does the solution to schooling gridlock and setbacks come from student aid, dress codes, student evaluations, tuition, or the like. These issues begin to seem fleeting and trivial to one with an historical sense. They are at most temporary struggles, and although they are important, as all struggles are important, we are not to subordinate liberal learning to them. The best way to achieve the liberal learning necessary to make important and meaningful distinctions about our complex world is, as I have suggested and as bears repeating, through a holistic, painstaking exploration of the permanent things. This means not only reading the Great Works for their content, but analyzing them in light of their place in history.

The beauty of this approach is that anyone can carry it out; the wisdom of it lies in its civilizing effects. Whether one is a homeschooling parent, a public school teacher, the leader of a local book club, or simply a curious-minded autodidact, the permanent things are available to him in texts, waiting to be sifted through and analyzed. It is true that there is disagreement as to what constitutes a Great Work and by what criteria, but it does not take more than research and commonsense empiricism to discern which pre-twentieth-century texts have withstood the test of time. Teaching the permanent things does not require a large-scale, bureaucratic, administrative overhaul. It does not demand central planning or the implementation of mass curricular programs; it can be accomplished through decentralized networks of concerned individuals. If parents would teach their children, friends their friends, colleagues their colleagues, and so on, we would in the aggregate become a more literate, astute, and informed society. And as our politicians lecture us about our duties even as they demand our money, we can take comfort in the proverb that these things too shall pass.

The 1965 Eagles

In Arts & Letters, Creative Writing, Essays, Humanities on January 2, 2013 at 8:45 am

Mel Mendenhall was born and raised in Columbus, Georgia.  He lives in Atlanta and is the CEO of CLVL Solutions, LLC.

Mel Mendenhall

The following essay was composed in 2011, when Gary Levi announced that he had an inoperable brain tumor.  Gary Levi passed away in November 2012.

All of us have known someone who was a particularly good storyteller.  For me, that person was my dad.  I guess growing up, as my dad did, in the country during the 1920s and early 1930s lent itself to that sort of entertainment: there was no TV, or even radio, back then.  It seemed all of his siblings inherited the storytelling trait, and as children my brothers, sisters, and I enjoyed listening to Dad’s siblings’ stories about life on the farm, life in the military (eight brothers and one brother-in-law served in WWII), and life in general.

My dad coached my first baseball team, the Eagles.  The Eagles I can easily recall are Sim Thomas, Rob Varner, Mac Turner, Johnny Jackson, Gary Levi, Jimmy Monfort, and Johnny Cooper.  In the history of eight-year-old baseball teams, our team, I’m certain, was the cream of the crop: the eight-year-old equivalent of the 1927 Yankees.  What follows is a quick biography of the players at age eight:

Sim Thomas – The star of the team; he did it all.  A true five-tool player: runs, throws, fields, hits, and hits for power.  He was also our most dominating pitcher, though susceptible to getting frustrated when the umpire’s vision was impaired and strikes were called balls, or so it seemed.  Pitched and played 1st base.

Rob Varner – A solid all-around ballplayer who was reliable in all facets of eight-year-old baseball.  Rob and Sim were both upperclassmen, 3rd graders, whereas the rest of us were in the second grade.  Rob played third base and catcher.

Mac Turner – A solid second baseman and, like me, a coach’s son.  Mac was from a prominent family that, though wealthy, was very down-to-earth and inclusive.  Mac was always smiling and having a good time on the field and in the dugout.

Johnny Jackson – A really good athlete and a muscular fireplug.  He could do it all.  He started out the season as a catcher, but he moved to 3rd base after his mom felt (we all felt) that Johnny’s privates were getting a little too beat up over the course of the season (Casey Stengel didn’t have these parental issues at the MLB level).

Gary Levi – Played left field and was easily our most outwardly enthusiastic player.  Gary woke up fired up and stoked those fires all day long until game time.  He had a distinctive way of wearing his hat sideways on his head, with the bill facing left or right, but never straight.  He continuously pounded his glove with his fist while standing in his usual left field spot, giving himself a vehement pep talk or, depending on your perspective, a “talking-to.”

Jimmy Monfort – a very smooth shortstop for an eight-year-old.  He threw right-handed, but batted left-handed, a fact that I thought would look pretty cool on the back of a baseball card.  Jimmy was a sweet swinger who hadn’t yet mastered actual contact, but who looked very good swinging the bat.  There was no question but that he was a ballplayer in the making.

Johnny Cooper – “Cooper” is what we called him.  Did you ever know a kid who always smiled?  It didn’t matter what the circumstance: Cooper was smiling.  Unfortunately, Cooper’s five-year-old athleticism was trapped inside an eight-year-old body that quite frankly had not caught up with his fellow 1927-Yankee eight-year-old teammates.  He stood in right field (one couldn’t claim he actually played right field).

The season began, and from the start it was apparent that the Eagles were a team of destiny.  Reporters from all around Columbus, and eventually the entire New York media, or so it seemed to us, followed the team as it plowed through the league, beating the Foxes, the Bears, the Cubs, the Lions, and other collective critters.  Simply put, our pitching was dominant, and our hitting and fielding were equally good.  Some among us, me included, had to learn to deal with periodic failure in the form of the occasional strikeout, which was followed by immediate temper tantrums and tears.  One waiting to bat, or one sitting in the “open air” dugout, needed to stay alert, because all of us, without fail, were prone to hurling our bats backwards, towards the dugout, whenever the ump ended our “at bat” with a strike-three call.  All of us, of course, except “Cooper,” who always struck out with a smile on his face (and believe me, Cooper always struck out), were given to emotional instability when we ran out of strikes.

As the season went along, we continued to get better and better, and the kids playing on the other teams did as well.  Each team seemed to have a star player or two.  I recall being fascinated with each team’s colors: the Foxes wore red jerseys, the Bears green, the Cubs blue, the Lions a lighter shade of blue.  All the teams’ jerseys and caps matched, red for red, green for green, blue for blue – you get the picture.  The Eagles, on the other hand, wore the colors of a winner: navy blue with orange letters (like my beloved Auburn).  Would you believe me if I told you that I can still smell, in my mind’s nose, what those jerseys smelled like?  I can!  Just as I can still smell the freshly mowed grass or, in the outfield, the stubble of weeds.