
The News Makes You Dumb

In America, Arts & Letters, Books, Communication, Humanities, Literature, News and Current Events, Writing on August 19, 2020 at 6:45 am

This piece originally appeared in Public Discourse.

A pernicious notion seems to have settled into the minds of my generation (I’m 37) when we were little boys and girls. It’s now an unquestioned “fact” that “staying informed,” “staying engaged,” and “following the news” are the obligatory duties of sensible, responsible people.

They’re not.

Reading and watching the news isn’t just unhelpful or uninstructive; it inhibits real learning, true education, and the rigorous cultivation of serious intellectual curiosity.

Simply Gathering Information Is Not Educational

When I was a child, my parents, quite rightly, restricted my television viewing. I could not, for instance, watch television after 5:00 p.m. or for more than an hour on weekdays. (Saturday morning cartoons ran for a permissible two hours, before my parents arose from bed.)

The glaring exception to these rules was “the news.” Watching the evening news was for my family a ritual in information gathering, the necessary means of understanding “current events.” Whatever else people said of it, the news was, by all accounts, educational.

Was it, though? U.S. Supreme Court Justice Oliver Wendell Holmes, Jr. famously refused to read newspapers. In The Theory of Education in the United States, Albert Jay Nock bemoaned “the colossal, the unconscionable, volume of garbage annually shot upon the public from the presses of the country, largely in the form of newspapers and periodicals.” His point was that a societal emphasis on literacy was by and large ineffectual if the material that most people read was stupid and unserious. Does one actually learn by reading the cant and carping insolence of the noisy commentariat?

“Surely everything depends on what he reads,” Nock said of the average person, “and upon the purpose that guides him in reading it.” What matters is not that one reads but what and how one reads. “You can read merely to pass the time,” the great Harold Bloom remarked, “or you can read with an overt urgency, but eventually you will read against the clock.”

The heart beats only so many beats; in one life, a person can read only so much. Why squander precious minutes reading mediocre scribbling or watching rude, crude talking heads debate transitory political matters of ultimately insignificant import, when instead, in perfect solitude, you could expand your imagination, nurture your judgment and discernment, refine your logic and reasoning, and purge yourself of ignorance, by pursuing wisdom and objective knowledge, through the canon of great literature, with a magnanimous spirit of openness and humility?

Why let obsequious, unlettered journalists on CNN, Fox News, or MSNBC shape your conscience, determine your beliefs, or develop your dependency on allegedly expert opinion, as if you were a docile creature lacking the courage to formulate your own ideas, when you could, instead, empower yourself through laborious study, exert your own understanding, and free yourself from the cramped cage of contemporary culture by analyzing past cultures, foreign places, difficult texts, and profound ideas?

The Demise of Journalism

When I was in college, not so long ago, you could still find semicolons in The New York Times. I’m told they surface there every now and then, but journalistic writing, as a whole, across the industry, is not what it once was. I’m being hyperbolic, of course, and am not so pedantic as to link semicolon usage with across-the-board journalistic standards. Besides, the Kurt Vonneguts of the world would have been pleased to be rid of semicolons. All I’m saying is that popular media should be more challenging if it’s to have far-reaching, salubrious effects. Newspaper writing, print or online, seems to have dumbed down to the point of harming rather than helping society writ large, and the opinions aired on television and radio seem to have attached themselves to one political party or another rather than liberating themselves from groupthink and stodgy consensus.

Reading as an activity should lift us up, not drag us down. It should inspire and require us to improve our cognitive habits and performance. The same goes for listening: how we listen and what we listen to affects our basic competency and awareness.

Not only have the grammar, vocabulary, and syntax displayed in “the news” diminished in sophistication, both in print and on television and radio, but also more generally the principal subject matter has moved from the complex and the challenging to the easy and simplistic. Media coverage focuses predominantly on contemporary partisan politics that demand minimal cognitive energy.

There’s a reason why so many people pay attention to politics: it just isn’t that difficult to think about or discuss. It doesn’t demand rational labor or arduous engagement. It can be passively absorbed. Ratings of television news would not be so high if its content weren’t so simplistic and easy to process. People watch the news to take a break or relax, or to get a rise out of eye-catching scandals and circumstances. The distinction between journalism and tabloid journalism has blurred beyond recognition. In short, journalism is a dying art.

Dangers of a Digital Age

Smart phones and social media are part of the problem. Every age has anxieties about technology. We shouldn’t blame smart phones and social media for human sins. The discourse, not the medium through which it circulates, ultimately is the problem. Yet it’s a problem that smart phones and social media have enabled in a way that past technologies could not. To air an opinion, anyone anywhere can simply tweet or post on Facebook without channeling the message through editors or other mediators.

Digital and smart devices have accelerated editorial processes. The never-ending race to publish “breaking” news results in slipshod work. Online reporting is full of typos and errors. A few clever reporters employ terms like Orwellian, Kafkaesque, Machiavellian, or Dickensian to give the impression of literacy, but the truly literate aren’t fooled.

Have journalistic practices and standards declined as literacy rates have risen? Does an increase in readership necessitate a reduction in quality? Do editors and publishers compete for the lowest common denominator, forgoing excellence and difficulty in order to achieve broad appeal?

Demanding stories and accounts that enrich reading habits and exercise mental faculties aren’t salacious or sensationalized clickbait, so they’re difficult, these days, to find unless you already know where to look.

In the 1980s, E. D. Hirsch, Jr. could write with confidence that newspapers assumed a common reader, i.e., “a person who knows the things known by other literate persons in the culture.” Neither journalists nor their readers today, however, seem literate in the traditional sense of that term. The culture of literacy—true literacy, again in the traditional sense of that term—has come under attack by the very scholars and professors who should be its eager champions.

Our popular pundits, mostly hired guns, supply unqualified, cookie-cutter answers to often manufactured problems; their job is not to inform but to entertain a daft and credulous public. “The liberally educated person,” by contrast, is, according to Allan Bloom, “one who is able to resist the easy and preferred answers, not because he is obstinate but because he knows others worthy of consideration.”

Seek Wisdom and Discernment over Politics and Personal Preference

If we wish to consume the news, we should treat it as junk food. The human body cannot healthily sustain itself on candy bars alone. It requires a balanced diet, nutrition, and exercise. So it is with the mind. Fed only junk, it’s malnourished.

Every now and then we may indulge the vice of chocolate or soda without impairing our overall, long-term health. Likewise we may watch without permanent or severe detriment the screeching cacophonies of semiliterate blatherskites like Sean Hannity, Wolf Blitzer, Chris Wallace, Anderson Cooper, Tucker Carlson, Jake Tapper, or, heaven help us, the worst of the worst, Chris Cuomo.

Just know that during the hour spent watching these prattling performers present tendentious interpretations of fresh facts, militantly employing tedious details to service ideological narratives, you could have read an informative book that placed the applicable subject matter into illuminating historical and philosophical context. The facts may be simple and quick, but interpreting them requires knowledge of the past, including the complexities and contingencies of the relevant religious movements, geographies, anthropologies, governments, literatures, and cultures. Devouring ephemeral media segments and sound bites in rapid succession is not learning. It is gluttonous distraction.

Do not misunderstand me: I do not advocate a Luddite lifestyle or a withdrawal from society and the workaday world. I just mean that too many of us, too much of the time, are enthralled by fleeting media trifles and trivialities, and ensnared in the trap of mindless entertainment disguised as vigorous edification.

Let’s stop telling little children what my generation heard when we were kids. They should stay away from the news lest they fall prey to its mania, foolishness, and stupidity. They should read books—difficult books—and be challenged to improve themselves and refine their techniques. Rather than settling on easy, preferred answers, they should accept tensions and contingencies, suspending judgment until all angles have been pursued and all perspectives have been considered. Let’s teach them to become, not activists or engaged citizens necessarily, but intelligent human beings who love knowledge and learning, and who pursue wisdom and discernment before mundane politics.


Civility, Humility, and the Pursuit of Knowledge

In Arts & Letters, Humanities, Law, Libertarianism, Pedagogy, Philosophy on March 25, 2020 at 6:45 am

The following speech was given to the Furman University Conservative Student Society on February 24, 2020. The American Institute for Economic Research also published this speech.

Good evening.  I’ve come from Alabama, but without a banjo on my knee.

It’s always nice to be back at Furman University, my alma mater, where memories of my professors, late evenings in the library, campus strolls around the lake, football games, fraternity shenanigans, ex-girlfriends, meals in the dining hall, rounds of golf, great books and profound discoveries all come rushing back to me with haunting vividness and intensity.

The day I moved into my dorm room, just before orientation began, was sad and exciting and frightening and chaotic. I pulled out of my parents’ driveway in Atlanta that morning to the melodies of James Taylor singing that he was gone to Carolina in his mind. A couple of hours later I was gone to Carolina, too, but not just in my mind.

I parked my blue Ford pickup on the fields beside Blackwell where the SUVs and other pickups were parked or parking. My parents, who had followed me to Greenville in their car, parked in what’s now the Trone Student Center parking lot. Back then it was mostly dirt and gravel except for some paved spaces near the coffee shop, which became a Starbucks Coffee but is now, I’m told, part of the on-campus bookstore. My parents helped me to unload the stuff of my old life and to arrange my dorm room for my new life.

My roommate hadn’t arrived yet. I claimed one side of the room and began filling my dresser, desk, and closet with things. Since I appropriated one section of the room, I wanted my roommate, Bill, to choose the top or bottom bunk for himself. We’d spoken only once before, by phone, a pitiful attempt by two distant, disembodied voices to share in a matter of minutes deep convictions, career ambitions, and preferred hobbies. Bill informed me years later that our initial phone conversation had discouraged him. I was coming to college with my high school girlfriend, so he presumed I would be fully invested in passionate romance and uninterested in secondary friendships.

Were it not for my girlfriend, he would have been correct. She, a socialite and a cheerleader, was the type who always searched for bigger and better things, who elevated revelry to the supreme virtue. To keep up with her, I had to fritter away precious hours at parties and functions and bars. She grew bored of me eventually, and found herself in the arms of many other freshman boys that year. Or rather, they found themselves in hers; she was the aggressor.

I was talking about Bill’s arrival. He materialized in the dorm room out of nowhere and with an entourage of relatives: his mother and Irish Catholic stepfather (God rest his soul) and his aunts and uncles and cousins and who knows what else besides. They swept into the room, a noisy spectacle, and everyone was introducing themselves and moving furniture and clothes and electronics and sporting equipment that was never used and encyclopedias that were never opened.

What would’ve taken my parents and me several trips to unpack took Bill only one. That’s how many people attended him and serviced his every need. It was impressive, really, as though I were in the presence of royalty. He was rich, in fact, and made a point of displaying his wealth. Only our dorm room seemed bare, too plain and unadorned for this princely graduate of a distinguished private high school in Columbus, Ohio. So the next thing we knew we were at the finest of fine establishments, Walmart, buying decorations. I had the clever idea to acquire signs with which to adorn our door: a stop sign, a men’s and women’s restroom sign, and whatever other signs I cleared from the hardware section. Bill eyed these curious treasures skeptically but assented to their purchase. He’d known me only about an hour. Best not to upset the poor Southerner over these procurements, the magnanimous Yankee must’ve thought.

By mid-afternoon our room was fully furnished. Our new hall mates stopped by to introduce themselves, allured by the bewildering array of signage on our door, which, in the Tate, would have resembled a modernist masterpiece: a condemnatory symbol of the directionless chaos of the consumerist decade we were leaving behind. (It was, after all, 2001.) A crowd developed in our room. We were instantly popular. Bill seemed to appreciate, at length, my unique design tastes.

Bill and I decided to look around after everyone left. Where, we wondered, was the laundry room? We needed to find out, maybe even to experiment with the washer and dryer since we had never used either before. We found the laundry room musty and tucked away in the basement. At least the machines, despite their coin slots, no longer required quarters. I noticed a button on the wall beside a green light. “To test carbon monoxide levels,” read an adjacent sign, “press button when light is green.” I didn’t know much about carbon monoxide, but suddenly had the urge to test its levels.

I pressed the button. The fire alarm erupted; red lights flashed on and off. Bill shot me a glare that conveyed anger, panic, and amusement all at once. Which feeling prevailed, I couldn’t say.

We needed to flee. We knew it was illegal to stay in the building, but also that we weren’t in danger, that there wasn’t a fire, so we repaired to our room. The hallways were empty. No one saw us sneaking up the stairs. Once in our room, we determined to wait out the alarm. Eventually, we knew, everyone would come filing back when no fire was detected.

So we sat. And we sat. And we sat, completely silent. Then came a loud knocking at the door. Wham! Wham! Wham!

I stood, frightened. Bill stared at me, desperately shaking his head as if to say, “Do not open the door!”  I paused out of deference. The knock came again: Wham! Wham! Wham! “I’m sorry,” I said, “I have to open it.”  Bill buried his face in his palm.

I opened the door. There before me, six foot six, muscles bulging, stood a firefighter in full gear. From behind his goggles, which were affixed to his helmet, he looked me up and down, head to toe. This is it, I thought. I am going to be arrested on my first day on campus, and I’m taking my innocent roommate with me.

Speechless, I offered my wrists for the cuffing, obsequiously extending my arms. The firefighter lifted his goggles, revealing brown button eyes, and removed his helmet. He looked at me and then behind me, back at me and then behind me again. It struck me that he was examining the door. “I’m sorry,” he said. “I thought this was the bathroom.”

“The bathroom’s over there,” I said, pointing down the hall.

“Thank you,” he said, and walked away.

I closed the door. Bill sighed with relief and then he and I roared with laughter.

I remember my first day of class. It was early, Introduction to Philosophy with Dr. Sarah Worth. After class I walked back to the dorms. A guy named Jonathan Horn, who lived on what was then the Sigma Chi hall on the ground floor, intercepted me. He was animated and flustered. I had played little league baseball with him back in Marietta, Georgia, when I was seven or eight, but had not seen him again until orientation week. He was now a rising sophomore in college. I don’t recall how we established that we’d been teammates long ago, but we made the connection. He was the first student to show me around campus and to introduce me to the fraternity ecosystem. At this particular moment, he was frazzled and going on about how an airplane had crashed into the World Trade Center. I was confused, not really knowing what the World Trade Center was. “You know,” said Jonathan, “that tall building with offices and restaurants and stuff on top.”

I didn’t know, and had assumed that whatever struck the building had been small: a glider or an ultralight. I walked up the stairs to my room and turned on the television. Moments later a second plane—a large commercial airliner—crashed into the Twin Towers, and I saw, or at least seem to recall, people leaping from the monstrous building to their deaths. I was horrified and scared and confused, still so very confused, and tried calling my dad’s cell phone because I knew he was flying to New York that morning.

We had a land line in our dorm room: a phone that plugged into the wall. Only a few students carried cell phones back then. It was the first year I hadn’t worn a pager on my belt. My parents had given me a cell phone the week before, but I didn’t use it—and wouldn’t use it regularly until spring semester, when cell phones suddenly proliferated across campus. My dad didn’t answer his phone. I assumed the worst and tried calling mom. Eventually I got ahold of her. She had, she assured me, spoken to dad. He was okay. Now she was trying to locate her brother, my uncle, who’d also flown to New York that day, or maybe was in New York already for work. In either case, he was eventually accounted for.

The first day of college is disorienting and momentous, one of those rare occasions when you’re acutely aware of the gravity of the moment you’re experiencing. That day, though, was disorienting and momentous not just for my classmates and me, but for the entire country, perhaps the entire planet. It marked the end of an era. I was a grownup, and so, too, was the United States of America. The ideas and books my classmates and I discussed that semester, and for the next few years, took on a furious intensity. Everyone, it seemed, was debating weighty and difficult questions: What was America? What was terrorism? Who was responsible for this attack? What was just war? What were the differences between Islam, Christianity, and Judaism? What was totalitarianism? What were Western Civilization and Eastern Civilization? Weren’t there other civilizations? What the hell was civilization? What was the difference between a conservative and a liberal? How do you accommodate differences in beliefs, feelings, and opinions within a diverse populace? What were facts, and how could people arrange them differently to produce competing narratives?

My high school sweetheart broke up with me a few weeks into freshman year. I was devastated and buried myself in books. Bill, to his credit, grew concerned and suggested that I meet with his English professor, Judy Bainbridge, for advice and direction. He watched me reading and writing poetry in the evenings, slowly disengaging from the social scene, spending countless hours in the library with books that weren’t assigned in my classes. He thought I needed an intervention.

He was right. I met with Dr. Bainbridge and showed her some of my poetry, which did not impress her. I don’t remember much about our conversation, but I recall her recommendation that I take certain courses with certain professors, and also that I join both the college Republicans and the college Democrats so that I could be exposed to different viewpoints and learn to avoid ideological complacency. I followed her advice, joined both organizations, and throughout my time at Furman tried to keep an open mind about, well, everything.

I majored in English and quickly adopted convictions that I considered to be leftist—particularly in the field of economics, of which I was ignorant—because I wanted to do good, be nice, and help those who were less fortunate. I still pursue those goals, only now I have a more principled and mature approach that in our current intellectual climate would be considered conservative or libertarian. This approach is predicated, not on how much I know, but on how much I don’t know. I have F.A. Hayek to thank for my epistemological commitments.

The development of the legal system demonstrates the importance of maintaining conflict at the level of rhetoric and persuasion, the alternatives to coercion and force

I have spent over a decade studying former United States Supreme Court justice Oliver Wendell Holmes Jr., who, to my mind, is one of the most misunderstood figures in our country’s history—a punching bag for commentators of various political persuasions. His book The Common Law tells the story of the evolution of the common-law system from its rude and primitive origins, when violence and personal vendetta characterized the arbitrary rule of kin and clan, to a more mature and sophisticated system involving public fora, courts and tribunals, administrative procedures, impartial juries, and the emergence of general principles out of concrete cases regarding unforeseeable conflicts between antagonistic parties.

This tidy account details how vengeance and passion yielded to reason, rhetoric, and rationality as argumentation and persuasion took the place of blood feuds as the operative form of dispute resolution. I’m reminded of Aeschylus’s great trilogy, The Oresteia, which consists of tragedies that mythologize the founding of a rational Greek legal system that supplanted the carnage and recklessness of the grand age of Homeric gods and heroes who warred without end. You might find a distinctively American version of this myth in the television series Deadwood, which traces the development of government and law in a chaotic Western town.

I bring up Holmes and Aeschylus and Deadwood to suggest to you the immense importance of free and open dialogue, of rational argumentation and civil disagreement. Civilization itself—that is, a state of human society that is organized, peaceful, and prosperous, consisting of science, industry, arts, and literature—is potentially at stake when disagreement is no longer maintained at the level of rhetoric and resolved through persuasion and procedure. In the absence of ongoing conversation and debate, we risk falling into the chaos and violence and internecine strife that destabilize and destroy civil societies.

Before the Civil War, the idealistic young Holmes—then known as Wendell—flirted with transcendentalism. Having fought in the 20th Massachusetts during the Civil War and having experienced firsthand the carnage of battle, he spent his later career as a jurist seeking to accommodate disagreement, defuse conflict, and moderate uncompromising political forces that threatened to bring about widespread violence. He did not want to witness another Civil War.

When I worked at the Alabama Supreme Court, I handled hundreds if not thousands of cases. Appellate cases provide edifying examples of the centrality of patience, humility, tenacity, and open-mindedness to problem-solving and unfettered inquiry. I would read appellants’ briefs that convinced me of the rightness of their clients’ positions. Then I would turn to the appellees’ briefs that seemed equally persuasive. Had I been tasked with deciding between the appellant and the appellee using my isolated reason and judgment, I would have struggled and despaired and probably arrived at erroneous conclusions. Fortunately, though, I had not only my colleagues to assist me, but innumerable precedents in prior cases and hundreds of years of development in the law to guide me. The appellant and the appellee were just two parties to a larger conversation that had endured in varying forms for centuries. Resolving their particular dispute required an exploration of the reasoning and rationale of several judges faced with similar facts and issues.

We learn by similar processes. Stuck between competing arguments, torn between opposing positions, we suspend judgment, or should, until we have analyzed the relevant facts and issues and mined the past for like situations and instructive examples. We should question our presuppositions and examine complex conflicts from different angles. Aware that knowledge is limited, memory is selective, and perspective is partial, we must avoid the trap of ideology, which causes people to choose what they believe and then to find support for it, or to draw complicated ideas through simplistic formulae to generate favored outcomes.

College should be about discovery, learning, and the acquisition and transmission of knowledge. It should involve inquiry and curiosity, challenge and exploration, forcing us to shape and revise our beliefs, to pursue clarity through rigorous study. The Book of Proverbs submits that fools despise wisdom and instruction.[1] To avoid foolishness, we must be teachable. And we must learn our limitations.

Learning our limitations

Across the hall from me, on the top floor of Manly Hall, during my freshman year at Furman, lived my friend Andre, a kicker on the football team.  He was affable and happy, the kind of person you wanted around when you told jokes because of his contagious laughter. He was much bigger than I was, though not as large, say, as an offensive or defensive lineman, and one day we wrestled on the floor right there in the hallway of the dorm. It was all for fun, but a real contest of manly strength with actual pride and reputation at stake. Several of our hall-mates watched and cheered as Andre wrapped me up like a pretzel and pinned me to the ground in an impressive show of force. At first I tried to maneuver out of his iron grip but, realizing I lacked the strength, I simply submitted, defeated and docile, waiting for him to release me.

I had lost, and was genuinely surprised by the ease with which I had been conquered. I realized that, given my size, I possessed only so much physical power, and that someone of greater size and strength could, quite efficiently, subdue me. You would think that common sense, or a basic understanding of physical reality, would have led me to that conclusion already, but I was young and hubristic. At some point, a short man must acknowledge he’s short. A slow man must acknowledge he’s slow. A clumsy man must acknowledge his inelegance. We’re not all mathematicians, rocket scientists, or geniuses. But to realize our fullest potential, to maximize our ability to know things and accomplish our goals, we must discover our strengths and weaknesses. We can’t be who we’re not, but we can make the best of who we are.

Aesop, a slave in the ancient world whose fables have been told since at least the 6th century B.C., tells of the Proud Frog, the mother of several little froglets. One morning, while she was away, an ox, not seeing the froglets, stepped on one and squashed him to death. When the mother returned, the froglet brothers and sisters croaked and squeaked, warning their mother of the enormous beast that had killed their brother. “Was it this big?” the mother asked, swelling up her belly. “Bigger,” the children said. “This big?” she said, swelling her belly even more. “Much bigger,” the children said. “Was it this big?” she said, swelling her belly and puffing herself up with tremendous force. “No, mother, the beast was much bigger than you.” Offended, the mother strained and strained, swelling and puffing, swelling and puffing until—boom! She popped!

You see, we shouldn’t presume to be more than we are.

I learned years after graduation that, while he was in medical school, Andre entered the great, ever-growing family of the departed, having taken his own life for reasons I don’t know and probably couldn’t understand. Even today it’s hard for me to imagine what could have driven this fun-loving, kind, strong, and generous person to such unbearable, unspeakable despair.

Channeling human emotions through debate and rhetorical fora

Human beings are emotional and passionate. Our feelings, our tendencies towards anger and wrath, are not, however, necessarily bad. If someone were to enter this room and commit some violent atrocity, we would be horrified and enraged. When we hear grievous stories of innocents who have been slaughtered, deprived of their possessions, hurt, mistreated, or oppressed, we fume and demand responsive, retributive action. Anger towards some people suggests that we feel strongly towards other people, that we have the capacity, in other words, to love deeply, bond, and affectionately associate.

But our anger and wrath must be constructively channeled. The legal system provides a mechanism for managing the pain, outrage, hurt, and anger that threaten to disrupt social harmony. Consider The Eumenides, the last play in the trilogy, The Oresteia, which I mentioned earlier. Here is the backstory. Clytemnestra murdered her husband, Agamemnon, king of Mycenae, after he returned home to Argos from the Trojan War. She had taken a lover, Aegisthus, just as Agamemnon had taken a lover: the seer, Cassandra, whom Clytemnestra also murdered. At the behest of Apollo, Orestes, the son of Agamemnon and Clytemnestra, avenges his father’s death by killing both Aegisthus and Clytemnestra.

Now the Furies—three enraged goddesses in the form of beasts who are older than the Olympian gods and goddesses—relentlessly and recklessly pursue Orestes to avenge the murder of Clytemnestra. Apollo has given Orestes temporary refuge in the temple at Delphi, but Clytemnestra’s ghost rouses the bloodthirsty Furies into uncontrolled passion. They are shocked and angered by unpunished matricide. Athena intervenes to assemble a jury and hold a public trial in which the prosecuting Furies will argue their case and Apollo will serve, in effect, as Orestes’s defense attorney.

The jury splits, leaving Athena to cast the deciding vote. The Furies worry that if Athena opts to acquit Orestes, she’ll usher in an era of lawlessness. They believe that order and the integrity of the ancient law depend on killing Orestes. To them, Orestes’s crime is especially offensive because Clytemnestra is the mother, the fertile figure, the bearer of life from whose womb Orestes emerged into the cosmos. An attack on the mother is an attack on life itself, on the very continuity of human existence.

Athena is faced with a seemingly zero-sum situation: she must either spare Orestes’s life and enrage the Furies, who will unleash their lethal rage on society, or give the Furies what they wish, namely Orestes’s death, and thereby inflame Apollo and the other Olympian gods. Violent revenge appears inevitable; a self-perpetuating cycle of violence seems destined to follow.

The Furies are wild, destructive, and vindictive. Athena in her divine wisdom recognizes, however, that they are indispensable to the law precisely because of those qualities. If someone is murdered, the legal system must bring about justice and mete out coercive punishment. The emotions and passions that animate revenge must be mediated, however, through formal and public processes, procedures, and protocols to ensure that they do not spin out of control, infecting whole populations beyond the immediate parties to a case. The legal system, by bringing conflicts into the field of rhetoric, argumentation, and persuasion in open fora governed by procedural rules, mitigates the intensity of the parties’ passions and emotions, which must be channeled through formal institutions and subjected to public scrutiny.

So what does Athena do? She splits the baby, as it were, by voting to free Orestes and by promising the Furies a high seat on the throne of her city, where they will enjoy everlasting honor and reverence. Of course, she must persuade the Furies of the rightness of this resolution. She does so with such effectiveness that her persuasion is likened to a “spell”; the Furies call her rhetoric “magic.” “Your magic is working,” the leader of the Furies submits. “I can feel the hate, / the fury slip away.”

Like Holmes, Athena despised civil war. “Let our wars / rage on abroad, with all their force, to satisfy / our powerful lust for fame,” she says. “But as for the bird / that fights at home—my curse on civil war.” She has pacified the hateful Furies and established a system of conflict resolution, not just for this matter but for all future matters.

Dealing with the inevitability of conflict

Imagine, if you will, that you could press a reset button that erased all memory and knowledge of the past but that instilled in each of us one definite principle, namely that every person by virtue of being human deserves to live freely and peaceably until visited by a natural death. This button would provide humanity with a clean slate, as it were. A fresh beginning. But it wouldn’t be long before inevitable conflicts arose. Accidents would happen. People would get hurt. Emotions and passions would be inflamed as a result. We seem to be wired to favor family over strangers, and to desire healthy and prosperous lives for our children. We want to maximize our wellbeing, sometimes at the expense of others’ wellbeing. Given the option to help our children or the children of some faraway stranger, we choose our children, the beings we brought into the world, on whose behalf we labor, weep, and rejoice.

Even if we could start over, struggle, contest, fighting, and feuding would arise. In light of the inevitability of conflict, we must make every effort to channel it into persuasion and rhetoric. The university as an ideal represents a kind of intellectual forum where the sharpest minds come to debate, not the case of a client, but the merits of an idea. Courtrooms provide spaces for litigants to have it out, so to speak, whereas universities provide spaces for scholars to test and debate facts and theories.

Universities are like courtrooms where competing ideas are given a hearing; the principle of rule of law over arbitrary and tyrannical rule should govern inquiry on campuses

We could think of the university as a legal system in which intellectuals “litigate” differing viewpoints before juries of intellectual peers who are committed to the advancement of knowledge and the clarity of ideas. We evaluate legal systems based on their tendency toward tyranny on the one hand and rule of law on the other. A tyrannical legal system is characterized by arbitrary commands, private vendettas, rapidly changing rules and standards, retroactive application of new rules and standards, lack of procedure and due process, and ambiguity.

By contrast, rule of law consists of general, regular, stable, and public rules regarding fundamental fairness that play out in established processes, procedures, and protocols. The university and the legal system realize the benefits of receiving and transmitting knowledge through open dialogue and debate, of resolving complex disputes through argumentation rather than physical force and intimidation, of settling controlling precedents through the aggregated decisions of innumerable minds, of suspending judgment on controversial matters until discovery procedures and deliberative processes have been exhausted, and of appealing contested judgments to additional, impartial bodies that will analyze the facts, evidence, and operative rules from a more removed vantage point.

Violent protests, no-platforming and de-platforming, dis-invitations, the shouting down of controversial speakers, the blacklisting, harassing, threatening, or doxing of them—these push us in the direction of arbitrary and tyrannical rule rather than the rule of law. They foment anger and outrage and privilege immediate vengeance over rational, procedural argumentation. They inhibit learning and deprive others of the opportunity to understand people and issues with greater clarity. They rouse emotions and passions that are antithetical to civility and humility.

College students should, in my view, think of themselves as judges in training—not in the sense that they will preside in courtrooms or manage and decide cases, but in the sense that they will be constructive participants in their civic and intellectual communities, cultivating the standards, norms, and discernment necessary to improve the lives and institutions of their family, friends, neighbors, colleagues, cities, counties, states, and country. They may not render binding judgments, but they will exercise judgment.

You cannot refine your logic and reasoning, your critical thinking, your ability to formulate cogent arguments, without considering diverse ideas with which you disagree. And when you identify an idea with which you disagree, you should adopt a Socratic approach to it, asking question after question until you grasp at a deeper level why you disagree and how to articulate your disagreement in a manner that persuades others to your position.

Good judges are patient, diligent, competent, credible, independent, and impartial. They avoid not just impropriety, but appearances of impropriety. They eschew favoritism. Confidence in their office and judgment depends upon their integrity, high standards of conduct and method, and prioritizing of truth, evidence, and fact over private interests and biases. They are influenced not by familial, financial, or political factors but by a courteous commitment to fair processes, correct answers, sound research, substantiated arguments, and reasonableness. The best judges and professors I have met over my career are those whose personal political convictions, and whose attitudes toward partisan elections and newsworthy current events, were unknown to me.

The lesson of the Furies is that violence breeds violence, and that coercion breeds coercion. If you stifle speech, rough up speakers, intimidate them, prohibit them from airing their opinions, you generate backlash, maybe not right away, maybe not in a form that you’ll immediately recognize, but forces will work to meet your anger with anger. Intellectual inquiry has difficulty flourishing in a climate of radioactive anger and toxic outrage.

Unleashing fury upon those who express views with which you disagree will only jeopardize your credibility, and might just empower the ideas you’re seeking to discredit. Ideas that appear taboo or transgressive often spread when powerful forces seek to suppress them. The paradox of the martyr, of course, is that his or her power resides in defeat, in death. The voice of the martyr is loudest once he or she has been permanently silenced. There’s a reason why passive resistance and civil disobedience are so effective in the long run.

The Apostle Paul wrote that Jesus had told him—perhaps through a vision or a revelatory inner voice—“My power is made perfect in weakness.” Another paradox: strength resides in meekness and mildness. If you are utterly convinced of the rightness of certain views that you sincerely hold, then constructively to advance them, to see them succeed in the long run, you should air them from a position of meekness and mildness. Spreading them with coercion or force will probably fail. Even those who outwardly manifest the signs of a convert might inwardly reject the views they purport to have adopted. Beliefs that depend for their advancement on coercion and force are dubious. A resort to violence in the name of an idea suggests that arguments for that idea are unpersuasive. In the absence of articulated reasoning against certain views, those views gain credence and currency. Attempting to stamp them out through coercion or force is counterproductive.

Civility and humility are therefore indispensable to the pursuit and acquisition of knowledge.

I’ll end with the wisdom of Aesop’s fable “The Cat and the Fox.” The fox, you see, was braggadocious, boasting to the cat about all the things he could and would do if he were attacked by hunting hounds. The modest, sensible cat replied to the haughty fox that she, having only one simple trick to escape dogs, wasn’t so clever. “If my trick doesn’t work,” she sighed, “then I’m done for.”

The fox, laughing, mocked the cat for her lack of cunning. “Too bad you’re not as smart as I am,” he taunted. As soon as these words issued from his snout, a pack of hounds descended upon him. The cat resorted to her one trick and escaped. The fox, however, tried trick after trick, each craftier than the last, but none of them worked. The hounds snatched him up and tore him to shreds, filling their bellies with bloody fox meat.

Friends, my fellow Furman paladins, don’t be the fox. Please, don’t be like him. There are always dogs—and cats for that matter—who are better and smarter than you are. There are always powerful forces beyond your control. Be sensible lest they swallow you up. Be humble and teachable, know your strengths and weaknesses, and suspend judgment on important and controversial matters until you have considered them from different angles and, if possible, examined all relevant data. Unless and until you do these things, you won’t acquire and transmit knowledge to your fullest potential.


[1] Proverbs 1:7.

Oliver Wendell Holmes Jr., Abraham Lincoln, and the Civil War

In American History, History, Humanities, Oliver Wendell Holmes Jr. on March 11, 2020 at 6:45 am

St. George Tucker’s Jeffersonian Constitution

In American History, Arts & Letters, Books, Civics, History, Humanities, Jurisprudence, Law, Legal Education & Pedagogy, liberal arts, Nineteenth-Century America, Philosophy, Politics, Western Civilization, Western Philosophy on October 30, 2019 at 6:45 am

This piece originally appeared here in Law & Liberty. 

One could argue that there are two basic visions for America: the Hamiltonian and the Jeffersonian. The former is nationalist, calling for centralized power and an industrial, mercantilist society characterized by banking, commercialism, and a robust military. Its early leaders had monarchical tendencies. The latter vision involves a slower, more leisurely and agrarian society, political decentralization, popular sovereignty, and local republicanism. Think farmers over factories.

Both have claimed the mantle of liberty. Both have aristocratic elements, despite today’s celebration of America as democratic. On the Hamiltonian side we can include John Adams, John Marshall, Noah Webster, Henry Clay, Joseph Story, and Abraham Lincoln. In the Jeffersonian camp we can place George Mason and Patrick Henry (who, because they were born before Jefferson, could be considered his precursors), the mature (rather than the youthful) James Madison, John Taylor of Caroline, John C. Calhoun, Abel Upshur, and Robert Y. Hayne. The Jeffersonian Republicans won out in the early nineteenth century, but since the Civil War, the centralizing, bellicose paradigm has dominated American politics, foreign and monetary policy, and federal institutions.

St. George Tucker falls into the Jeffersonian category. View of the Constitution of the United States, published by Liberty Fund in 1999, features his disquisitions on various legal subjects, each thematically linked. Most come from essays appended to his edition of Sir William Blackstone’s Commentaries on the Laws of England.

Born in Bermuda, Tucker became a Virginian through and through, studying law at the College of William and Mary under George Wythe, whose post at the law school he would eventually hold. On Tucker’s résumé we might find his credentials as a poet, essayist, and judge. He was an influential expositor of the limited-government jurisprudence that located sovereignty in the people themselves, as opposed to the monarch or the legislature, which, he believed, was a surrogate for the general will in that it consisted of the people’s chosen representatives.

Tucker furnished Jeffersonians with the “compact theory” of the Constitution:

The constitution of the United States of America . . . is an original, written, federal, and social compact, freely, voluntarily, and solemnly entered into by the several states of North-America, and ratified by the people thereof, respectively; whereby the several states, and the people thereof, respectively, have bound themselves to each other, and to the federal government of the United States; and by which the federal government is bound to the several states, and to every citizen of the United States.

Under this model, each sovereign, independent state is contractually and consensually committed to confederacy, and the federal government possesses only limited and delegated powers—e.g., “to be the organ through which the united republics communicate with foreign nations.”

Employing the term “strict construction,” Tucker decried what today we’d call “activist” federal judges, insisting that “every attempt in any government to change the constitution (otherwise than in that mode which the constitution may prescribe) is in fact a subversion of the foundations of its own authority.” Strictly construing the language of the Constitution meant fidelity to the binding, basic framework of government, but it didn’t mean that the law was static. Among Tucker’s concerns, for instance, was how the states should incorporate, discard, or adapt the British common law that Blackstone had delineated.

Tucker understood the common law as embedded, situated, and contextual rather than as a fixed body of definite rules or as the magnificent perfection of right reason, a grandiose conception derived from the quixotic portrayals of Sir Edward Coke. “[I]n our inquiries how far the common law and statutes of England were adopted in the British colonies,” Tucker announced, “we must again abandon all hope of satisfaction from any general theory, and resort to their several charters, provincial establishments, legislative codes, and civil histories, for information.”

In other words, if you want to know what the common law is on this side of the pond, look to the operative language of governing texts before you invoke abstract theories. Doing so led Tucker to conclude that parts of English law were “either obsolete, or have been deemed inapplicable to our local circumstances and policy.” In this, he anticipated Justice Holmes’s claim that the law “is forever adopting new principles from life at one end” while retaining “old ones from history at the other, which have not yet been absorbed or sloughed off.”

What the several states borrowed from England was, for Tucker, a filtering mechanism that repurposed old rules for new contexts. Tucker used other verbs to describe how states, each in their own way, revised elements of the common law in their native jurisdictions: “modified,” “abridged,” “shaken off,” “rejected,” “repealed,” “expunged,” “altered,” “changed,” “suspended,” “omitted,” “stricken out,” “substituted,” “superseded,” “introduced.” The list could go on.

The English common law, accordingly, wasn’t an exemplification of natural law or abstract rationalism; it was rather the aggregation of workable solutions to actual problems presented in concrete cases involving real people. Sometimes, in its British iterations, it was oppressive, reinforcing the power of the king and his agents and functionaries. Thus it couldn’t fully obtain in the United States. “[E]very rule of the common law, and every statute of England,” Tucker wrote on this score, “founded on the nature of regal government, in derogation of the natural and unalienable rights of mankind, were absolutely abrogated, repealed, and annulled, by the establishment of such a form of government in the states.”

Having been clipped from its English roots, the common law in the United States had, in Tucker’s view, an organic opportunity to grow anew in the varying cultural environments of the sovereign states. In this respect, Tucker prefigured Justice Brandeis’s assertion in Erie Railroad Company v. Tompkins (1938) that “[t]here is no federal general common law.” Tucker would have agreed with Brandeis that, “[e]xcept in matters governed by the Federal Constitution or by acts of Congress, the law to be applied in any case is the law of the state.”

In fact, summarizing competing contentions about the Sedition Act, Tucker subtly supported the position that “the United States as a federal government have no common law” and that “the common law of one state . . . is not the common law of another.” The common law, in Tucker’s paradigm, is bottom-up and home-grown; it’s not a formula that can be lifted from one jurisdiction and placed down anywhere else with similar results and effects.

By far the most complex essay here is “On the State of Slavery in Virginia,” which advocated the gradual extirpation of slavery. With admirable clarity, Tucker zeroed in on the hypocrisy of his generation:

Whilst we were offering up vows at the shrine of Liberty, and sacrificing hecatombs upon her altars; whilst we swore irreconcilable hostility to her enemies, and hurled defiance in their faces; whilst we adjured the God of Hosts to witness our resolution to live free, or die, and imprecated curses on their heads who refused to unite us in establishing the empire of freedom; we were imposing upon our fellow men, who differ in complexion from us, a slavery, ten thousand times more cruel than the utmost extremity of those grievances and oppressions, of which we complained.

Despite his disdain for the institution of slavery, Tucker expressed ideas that are racist by any measurable standard today—for instance, his notion that slavery proliferated in the South because the climate there was “more congenial to the African constitution.”

On the level of pure writing quality and style, Tucker had a knack for aphorism. “[T]he ignorance of the people,” he said, “is the footstool of despotism.” More examples: “Ignorance is invariably the parent of error.” “A tyranny that governs by the sword, has few friends but men of the sword.”

Reading Tucker reminds us that for most of our country’s formative history the principal jurisprudential debates were not about natural law versus positivism, or originalism versus living constitutionalism, but about state versus federal authority, local versus national jurisdiction, the proper scale and scope of government, checks and balances, and so forth. To the extent these subjects have diminished in importance, Hamilton has prevailed over Jefferson. Reading Tucker today can help us see the costs of that victory.

Review of Stephen Budiansky’s “Oliver Wendell Holmes Jr.”

In Academia, America, American History, American Literature, Arts & Letters, Book Reviews, Books, Historicism, History, Humanities, Jurisprudence, Law, liberal arts, Oliver Wendell Holmes Jr., Philosophy, Pragmatism, Scholarship, Western Philosophy on September 25, 2019 at 6:45 am

This review originally appeared here in Los Angeles Review of Books.

Do we need another biography of Oliver Wendell Holmes Jr., who served nearly 30 years as an Associate Justice of the United States Supreme Court and nearly 20 years before that on the Massachusetts Supreme Judicial Court? He has been the subject of numerous biographies since his death in 1935. We have not discovered new details about him since Harvard made his papers available to researchers in 1985, so why has Stephen Budiansky chosen to tell his story?

The answer may have to do with something Holmes said in The Common Law, his only book: “If truth were not often suggested by error, if old implements could not be adjusted to new uses, human progress would be slow. But scrutiny and revision are justified.”

Indeed, they are — both in the law and in the transmission of history. Holmes has been so singularly misunderstood by jurists and scholars that his life and thought require scrutiny and revision. Because his story is bound up with judicial methods and tenets — his opinions still cited regularly, by no less than the US Supreme Court as recently as this past term — we need to get him right, or at least “righter,” lest we fall into error, sending the path of the law in the wrong direction.

A veritable cottage industry of anti-Holmes invective has arisen on both the left and the right of the political spectrum. No one, it seems, of any political persuasion, wants to adopt Holmes. He’s a giant of the law with no champions or defenders.

For some critics, Holmes is the paragon of states’ rights and judicial restraint who upheld local laws authorizing the disenfranchisement of blacks (Giles v. Harris, 1903) and the compulsory sterilization of individuals whom the state deemed unfit (Buck v. Bell, 1927). This latter decision he announced with horrifying enthusiasm: “Three generations of imbeciles are enough.” For other critics, he’s the prototypical progressive, decrying natural law, deferring to legislation that regulated economic activity, embracing an evolutionary view of law akin to living constitutionalism, and bequeathing most of his estate to the federal government.

The truth, as always, is more complicated than tendentious caricatures. Budiansky follows Frederic R. Kellogg — whose Oliver Wendell Holmes Jr. and Legal Logic appeared last year — in reconsidering this irreducible man who came to be known as the Yankee from Olympus.

Not since Mark DeWolfe Howe’s two-volume (but unfinished) biography, The Shaping Years and The Proving Years, has any author so ably rendered Holmes’s wartime service. Budiansky devotes considerable attention to this period perhaps because it fundamentally changed Holmes. Before the war, Holmes, an admirer of Ralph Waldo Emerson, gravitated toward abolitionism and volunteered to serve as a bodyguard for Wendell Phillips. He was appalled by a minstrel show he witnessed as a student. During the war, however, he “grew disdainful of the high-minded talk of people at home who did not grasp that any good the war might still accomplish was being threatened by the evil it had itself become.”

Holmes had “daddy issues” — who wouldn’t with a father like Oliver Wendell Holmes Sr., the diminutive, gregarious, vainglorious, and sometimes obnoxious celebrity, physician, and author of the popular “Breakfast Table” series in The Atlantic Monthly? — that were exacerbated by the elder Holmes’s sanctimonious grandstanding about his noble, valiant son. For the aloof father, the son’s military service was a status marker. For the son, war was gruesome, fearsome, and real. The son despised the father’s flighty ignorance of the on-the-ground realities of bloody conflict.

Holmes fought alongside Copperheads as well, a fact that might have contributed to his skepticism about the motives of the war and the patriotic fervor in Boston. His friend and courageous comrade Henry Abbott — no fan of Lincoln — died at the Battle of the Wilderness in a manner that Budiansky calls “suicidal” rather than bold. The war and its carnage raised Holmes’s doubts regarding “the morally superior certainty that often went hand in hand with belief: he grew to distrust, and to detest, zealotry and causes of all kinds.”

This distrust — this cynicism about the human ability to know anything with absolute certainty — led Holmes as a judge to favor decentralization. He did not presume to understand from afar which rules and practices optimally regulated distant communities. Whatever legislation they enacted was for him presumptively valid, and he would not impose his preferences on their government. His disdain for his father’s moralizing, moreover, may have contributed to his formulation of the “bad man” theory of the law. “If you want to know the law and nothing else,” he wrote, “you must look at it as a bad man, who cares only for the material consequences which such knowledge enables him to predict, not as a good one, who finds his reasons for conduct, whether inside the law or outside of it, in the vaguer sanctions of conscience.”

Budiansky’s treatment of Holmes’s experience as a trial judge — the Justices on the Massachusetts Supreme Judicial Court in those days presided over trials of first instance — is distinctive among the biographies. Budiansky avers,

[I]n his role as a trial justice, Holmes was on the sharp edge of the law, seeing and hearing firsthand all of the tangled dramas of the courtroom, sizing up the honesty of often conflicting witnesses, rendering decisions that had immediate and dramatic consequences — the breakup of families, financial ruin, even death — to the people standing right before him.

Holmes’s opinions as a US Supreme Court Justice have received much attention, but more interesting — perhaps because less known — are the salacious divorce cases and shocking murder trials he handled with acute sensitivity to evidence and testimony.

Budiansky skillfully summarizes Holmes’s almost 30-year tenure on the US Supreme Court, the era for which he is best known. He highlights Holmes’s dissenting opinions and his friendship with Justice Louis Brandeis, who was also willing to dissent from majority opinions — and with flair. For those looking for more detailed narratives about opinions Holmes authored as a Supreme Court Justice, other resources are available. Thomas Healy’s The Great Dissent, for example, dives more deeply into Holmes’s shifting positions on freedom of speech. Healy spends a whole book describing a jurisprudential development that Budiansky covers in a single chapter.

Contemptuous of academics, Budiansky irrelevantly claims that “humorless moralizing is the predominant mode of thought in much of academia today.” He adds, “A more enduring fact about academic life is that taking on the great is the most reliable way for those who will never attain greatness themselves to gain attention for themselves.” Harsh words! Budiansky accuses the French historian Jules Michelet of rambling “on for pages, as only a French intellectual can.” Is this playful wit or spiteful animus? Is it even necessary?

Budiansky might have avoided occasional lapses had he consulted the academics he seems to despise. For instance, he asserts that the “common law in America traces its origins to the Middle Ages in England […] following the Norman invasion in 1066,” and that the “Normans brought with them a body of customary law that, under Henry II, was extended across England by judges of the King’s Bench who traveled on circuit to hold court.” This isn’t so. Writing in The Genius of the Common Law, Sir Frederick Pollock — “an English jurist,” in Budiansky’s words, “whose friendship with Holmes spanned sixty years” — mapped the roots of the common law “as far back as the customs of the Germanic tribes who confronted the Roman legions when Britain was still a Roman province and Celtic.” In other words, Budiansky is approximately one thousand years off. Rather than supplanting British customs, the Normans instituted new practices that complemented, absorbed, and blended with British customs.

The fact that Budiansky never mentions some of the most interesting researchers working on Holmes — Susan Haack, Seth Vannatta, and Catharine Wells come to mind — suggests willful ignorance, the deliberate avoidance of the latest scholarship. But to what end? For what reason?

It takes years of study to truly understand Holmes. The epigraph to Vannatta’s new edition, The Pragmatism and Prejudice of Oliver Wendell Holmes Jr., aptly encapsulates the complexity of Holmes’s thought with lines from Whitman’s Song of Myself: “Do I contradict myself? / Very well then I contradict myself, / (I am large, I contain multitudes.)” Budiansky recognizes, as others haven’t, that Holmes was large and contained multitudes. Holmes’s contradictions, if they are contradictions, might be explained by the famous dictum of his childhood hero, Emerson: “A foolish consistency is the hobgoblin of little minds.”

Holmes was consistently inconsistent. His mind was expansive, his reading habits extraordinary. How to categorize such a wide-ranging man? What were the defining features of his belief? Or did he, as Louis Menand has alleged, “lose his belief in beliefs”? Budiansky condenses Holmes’s philosophy into this helpful principle: “[T]hat none of us has all the answers; that perfection will never be found in the law as it is not to be found in life; but that its pursuit is still worth the effort, if only for the sake of giving our lives meaning.”

Holmes was intellectually humble, warning us against the complacency that attends certainty. Driving his methods was the sober awareness that he, or anyone for that matter, might be incorrect about some deep-seated conviction. During this time of polarized politics, self-righteous indignation, widespread incivility, and rancorous public discourse, we could learn from Holmes. How civil and respectful we could be if we all recognized that our cherished ideas and working paradigms might, at some level, be erroneous, if we were constantly mindful of our inevitable limitations, if we were searchers and seekers who refuse to accept, with utter finality, that we’ve figured it all out.

Oliver Wendell Holmes Jr. and Abraham Lincoln

In Arts & Letters, Historicism, History, Humanities, Law, Nineteenth-Century America, Oliver Wendell Holmes Jr., Politics, Southern History, The South on July 10, 2019 at 6:45 am

Oliver Wendell Holmes Jr., Abraham Lincoln, and the Civil War

In America, American History, Historicism, History, Humanities, Nineteenth-Century America, Oliver Wendell Holmes Jr. on April 24, 2019 at 6:45 am

Seth Vannatta’s Justice Holmes

In American History, Arts & Letters, Books, Conservatism, History, Humanities, Jurisprudence, Law, Philosophy, Pragmatism, Scholarship, Western Philosophy on March 6, 2019 at 6:45 am

Seth Vannatta identifies the common law as a central feature of the jurisprudence of former United States Supreme Court justice Oliver Wendell Holmes, Jr. Holmes treated the common law as if it were an epistemology or a reliable mode for knowledge transmission over successive generations. Against the grand notion that the common law reflected a priori principles consistent with the natural law, Holmes detected that the common law was historical, aggregated, and evolutionary, the sum of the concrete facts and operative principles of innumerable cases with reasonable solutions to complex problems. This view of the common law is both conservative and pragmatic.

Vannatta’s analysis of Holmes opens new directions for the study of conservatism and pragmatism—and pragmatic conservatism—demonstrating that common-law processes and practices have much in common with the form of communal inquiry championed by C.S. Peirce. For more on this subject, download “Seth Vannatta’s Justice Holmes,” which appeared in the journal Contemporary Pragmatism in the fall of 2018.

Review of Richard Posner’s “The Federal Judiciary”

In Arts & Letters, Book Reviews, Books, Jurisprudence, Law, Writing on December 27, 2017 at 6:45 am

This review originally appeared here in the Los Angeles Review of Books.

“I’m not a typical federal judge,” Richard Posner says in his new book The Federal Judiciary, which seems designed to affirm that claim.

Released in August, this tome shouldn’t be confused with his self-published Reforming the Federal Judiciary, released in September. The latter has generated controversy because it includes documents internal to the Seventh Circuit Court of Appeals, including personal emails from Chief Judge Diane Wood and confidential bench memoranda. The former, the subject of this review, is no less blunt, though one suspects the editors at Harvard University Press ensured that it excluded improper content.

Publication of both books coincides with the sudden announcement of Posner’s retirement. This quirky and opinionated jurist is going out with a bang, not a whimper, after serving nearly 36 years on the bench. He could have taken senior status; instead he’s withdrawing completely, citing his court’s handling of pro se appellants as the prime reason.

The Federal Judiciary presents “an unvarnished inside look” at the federal court system, which, Posner insists, “is laboring under a number of handicaps,” “habituated to formality, resistant to change, backward-looking, even stodgy.”

Posner is a self-styled pragmatist who champions resolving cases practically and efficiently through common-sense empiricism without resorting to abstractions or canons of construction. He adores Justice Oliver Wendell Holmes Jr., whose jurisprudence resembled the pragmatism of C. S. Peirce, William James, and John Dewey. His methodology relies on analyzing the facts and legal issues in a case, and then predicting the reasonable outcome in light of experience and the probable consequences of his decision. Accordingly, he follows his instincts unless some statute or constitutional provision stands in the way. Most of the time, the operative rules remain malleable enough to bend toward his purposes.

This fluid approach to judging stands in contradistinction to that of Justice Antonin Scalia, for whom Posner has little affection. In fact, Posner establishes himself as Scalia’s opposite. Where Scalia was formalistic and traditional, Posner is flexible and innovative. Where Scalia was doctrinaire, Posner is pragmatic. Where Scalia was orthodox, Posner boasts, “I am willing to go […] deep into the realm of unorthodoxy.”

Posner’s criticisms of Scalia can seem irresponsibly personal, involving not only Scalia’s originalism and textualism (legitimate objects of concern) but also his religious views on Creationism (about which, Posner declares, Scalia was “wrong as usual”). He calls Scalia’s belief in the devil “[c]hildish nonsense” and denounces Scalia’s unhealthy lifestyle. In a low moment, he calls Scalia “careless” for dying next to a sleep apnea machine the ailing justice wasn’t using. This rebuke is irreverent, but is it constructive or extraneous? Does it advance Posner’s judicial methods while weakening the case for Scalia’s?

Aspiring to be “relentlessly critical and overflowing with suggestions for reform,” Posner attacks the “traditional legal culture” that, he says, “has to a significant degree outlived its usefulness.” Cataloging the targets of his iconoclastic ire would be exhausting. He jumps from subject to subject, castigating “judicial pretense” and treating with equal fervor such weighty topics as statutory interpretation and such trivial matters as the denotation of “chambers” versus “office.” He confers delightfully disrespectful labels (“slowpokes,” “curmudgeons”) on his colleagues but can also seem petty (complaints about food in the US Supreme Court cafeteria come to mind).

Most of his critiques have merit. His persistent assault on the sanctimony and pomposity of federal judicial culture is acutely entertaining, signaling to some of his more arrogant colleagues that they’re not as important or intelligent as they might think.

Posner likes to shock. What other judge would assert that the Constitution is “obsolete” or ask when we’ll “stop fussing over an eighteenth-century document” that institutes the basic framework of governance for the country? A bedrock principle underlying the separation-of-powers doctrine holds that the judicial branch interprets law while the legislative branch makes it. Posner, however, announces that federal judges legislate even though they’re unelected. Conservative commentators would offer this fact as condemnation, but Posner extols it as an indispensable prerogative.

Although he alleges that judges are political actors, he’s impatient with politicians. He ranks as the top weakness of the federal judiciary the fact that politicians nominate and confirm federal judges and justices. (The president nominates and the Senate confirms.) The basis of this objection is that politicians are mostly unqualified to evaluate legal résumés and experience.

A refrain Posner employs to advance his argument — “Moving on” — might serve as his motto for judges, who, in his mind, must break free from undue restraints of the past. “The eighteenth-century United States, the nineteenth-century United States, much of the twentieth-century United States,” he submits, “might as well be foreign countries so far as providing concrete guidance (as distinct from inspiration) to solving today’s legal problems is concerned.” This isn’t meant to be hyperbole.

His citations to Wikipedia and tweets — yes, tweets — enact the forward-looking attitude he celebrates: he’s not afraid of new media or of pushing boundaries. Consider the time he asked his law clerks to doff and don certain work clothing to test facts presented by litigants in a case before him.

His advice to colleagues on the bench: Let clerks refer to you by your first name; do away with bench memos and write your own opinions; stop breaking for three-month recesses; stagger hiring periods for law clerks; don’t employ career clerks; don’t procrastinate; don’t get bogged down in procedure at the expense of substance; be concise; read more imaginative literature; avoid Latinisms; abolish standards of review. If you’re an appellate judge, preside over district-court trials. And whatever you do, look to the foreseeable future, not backward, for direction.

Readers of his most recent book, Divergent Paths, will recognize in these admonitions Posner’s distinctive pet peeves. He believes that judges who don’t author their opinions are weak or unable to write well. If judges were required to write their opinions, he supposes, fewer unqualified lawyers would sit on the bench: inexpert writers, not wanting to expose their deficiencies, would not accept the nomination to be a federal judge.

Posner’s love of good writing is so pronounced that he praises Scalia, his chosen nemesis, for his “excellent writing style.” He sprinkles references to Dante, Tennyson, Keats, Fitzgerald, Nietzsche, T. S. Eliot, Orwell, and Edmund Wilson and supplies epigrams by Auden, Yeats, and Alexander Pope. Those who didn’t know it wouldn’t be surprised to learn that Posner majored in English at Yale.

Still, one comes away with the impression that he has sacrificed precision for speed. He appears to have cobbled together several blog posts and other articles of only ephemeral significance to pad his polemic. He discusses judges’ “priors” on page 116 but doesn’t define that term (“a mixture of temperament, ideology, ambition, and experience”) until page 148. Liberal with block quotes, scattered in focus, he recycles by-now familiar arguments against the Bluebook, legal jargon, and other staples of the legal academy. Even those who agree with him on these points will balk at the redundancy.

The repetition isn’t only at the thematic level: it involves diction and syntax. He tells us on page 408, “Pope Pius XII made peace with evolution in 1950.” Then a page later, he states, “The Church had had a ‘problem’ with evolution until Pius XII had made his peace with it in 1950.” On page five, he writes, “almost all federal judicial opinions are drafted by law clerks […] in the first instance, and edited more or less heavily by the judge.” He then echoes himself on page 22: “[M]ost judges (and Justices) require their law clerks to write the initial draft opinion, which the judge then edits.” He describes this same process again on page 276. “I write my own opinions,” he declares only to repeat himself later: “I write and edit my own opinions.” These are mere samples of a striking trend in Posner’s book.

A former law professor, Posner concludes by assigning grades to the federal judiciary in eight categories: selection of judges (B), judicial independence (A-), rule of law (A), finality of judgments (B), court structure (B), management (C), understanding and training (C), and compensation (B+). Total? Around a B average. For all the fuss, that’s a decent score.

Posner’s characteristic arrogance is grandly exhibited. “I’m a pretty well-known judge,” he assures us. His preface includes a short bibliography for “readers interested in learning more about me.” He names “yours truly” (i.e., himself) in his list of notables in the field of law-and-economics, an indisputable detail that a more humble person would have omitted. Posner’s self-importance can be charming or off-putting, depending on your feelings toward him.

Yet he’s honest. And forthright. Not just the federal judiciary but the entire legal profession thrives off mendacity, which is not the same as a lie or embellishment. It’s a more extravagant, systemic mode of false narrative that lawyers and judges tell themselves about themselves to rationalize and enjoy what they do. Posner sees through this mendacity and derides it for what it is. His frank irritability is strangely charming, and charmingly strange. The federal judiciary has lost a maverick but gained a needed detractor.

Allen Mendenhall Interviews Anton Piatigorsky, Author of “Al-Tounsi”

In Arts & Letters, Books, Creative Writing, Criminal Law, Fiction, Humanities, Justice, Law, liberal arts, Literature, Novels, Oliver Wendell Holmes Jr., Philosophy, Writing on October 4, 2017 at 6:45 am

AM: Thanks for discussing your debut novel with me, Anton.  It’s titled Al-Tounsi and involves U.S. Supreme Court justices who are laboring over a case about an Egyptian detainee held on a military base in the Philippines. How did you come up with this premise for a novel? 

AP:  I was interested in the intersection between contemporary legal and political issues and the personal lives of the justices. I was particularly impressed by the ways in which the writ of habeas corpus has been used (and suspended) throughout U.S. history.

The Great Writ is a heroic call to responsibility—a demand made by the judiciary for the executive to live up to its obligations to imprisoned individuals. While it has obvious political and social ramifications, it also has philosophical ones. It encourages moral and psychological reckoning: what are our responsibilities to others?

I was excited about writing a novel where two strains—the political and the personal—overlap and blend. I realized that if I fictionalized the important 2008 Guantanamo Bay case Boumediene v. Bush—by changing key events, decisions, and characters—I could use it as the basis for a novel about the Court that explores all my interests.

Anton Piatigorsky

AM: How did you decide to change directions and write about the law?  Did this case just jump out at you?  Your previous writings address a wide variety of subjects but not, that I can tell, law. 

AP:  I came to the law, strangely enough, through religion. I’ve long been interested in how religion functions, and especially in the ways that secular systems mimic religious ones. When I started reading about American law and the U.S. Supreme Court, I saw those institutions as a part of an Enlightenment era secular religion. From this perspective, law is a system of rituals, codes and writings that helps establish an identity for a community, a set of shared values and beliefs, and a way for people to function within the world. I found that fascinating. It inspired all sorts of questions.

What are the general beliefs about people and the world that lie beneath the American legal system? How are those beliefs enacted in cases, courts, and legal writings? How do they play out in the rituals of the Court?  How do the justices of the Supreme Court —who are, in some ways, high priests of the legal world — reconcile conflicts between their personal beliefs and the foundational beliefs of the legal system they guide?

The fictional stories I wanted to tell about justices’ lives grew out of these general questions. Those questions also led me into an investigation of the main case before them.

AM: One of the most fascinating parts of the book, to me, is the Afterword, which consists of the concurring opinion of the fictional Justice Rodney Sykes.

AP: I have always loved novels of ideas, in which a character’s emotional journey overlaps with their complex thoughts and beliefs. Whenever that type of fiction really works — as it does with Dostoyevsky’s The Brothers Karamazov — the character’s philosophy or worldview stands alone as a work of non-fiction. And so the novel becomes part fiction, part critical thought. It functions as a critique of ideas that circulate in the real world.

That’s what I hoped to achieve for Justice Rodney Sykes’s formal opinion in the novel’s Afterword. I wanted Rodney to reach a powerful critique of basic tenets of the American legal system. I wanted him to address what our responsibilities are (or aren’t) towards others in the legal system, and the problems with that system’s fundamental faith in individual actors. In his concurrence, Rodney takes an unorthodox and unlikely stance for a Supreme Court Justice, but that’s what makes it a work of fiction. A novel can be the perfect forum to discuss how a real person might come to a radical decision, and how that decision might revolutionize their thoughts and actions.

AM: Who are your favorite living writers?

AP:  I particularly admire J.M. Coetzee and Alice Munro. I think about their works often while I’m writing and editing my own.

Coetzee has written several fantastic “novels of ideas.” Both Diary of a Bad Year and Elizabeth Costello manage to incorporate far-reaching critiques into their larger stories about characters, and they do so while using imaginative formal techniques. I also love Coetzee’s cold and austere style in his less overtly intellectual books. They’re cleanly written, shockingly honest, and endlessly compelling.

Alice Munro—although it’s almost a cliché to praise her at this point—shows remarkable insight into her characters, gradually revealing their motivations, resentments and surprising decisions without ever erasing their fundamental mysteries as people. Her stories are complex formally, but in such a quiet way that I often don’t notice their structures until I’ve read them a few times. Her writing is a great model for how to show characters’ lives and decisions with efficiency and imagination while maintaining mystery.

AM: Do you intend to continue in the novel form in your own writing?

AP:  Absolutely. I would love to write more legal fiction, as well. I’ve spent years learning about the law, but know that I’ve barely scratched the surface. There are so many potentially interesting legal stories. I’m also at the early stages of a new novel, which is not explicitly about law, but does feel like an outgrowth of Al-Tounsi in certain ways.

AM: I worked for a state Supreme Court justice for over three years, and I agree: there are many interesting legal stories out there, and I’ve found that facts are often stranger than fiction.

AP:  It must be fascinating to work on the diverse cases that roll through a court. I can only imagine how many potential stories you and other lawyers, judges and court workers can recall—ideas for a million novels and movies and plays.

I think legal stories are particularly exciting for fiction because they distill big questions into concrete human situations and personalities. The giant subjects of guilt and innocence, love and betrayal, responsibilities towards others as opposed to ourselves, community or self-reliance, greed, jealousy and ambition all play out in specific facts and events, in the concrete details of a case. It’s just like in a novel. And since the American legal system is, in my mind, an application of an entire Enlightenment, philosophic worldview, these test cases and stories also pose huge philosophical, ethical and moral questions. It’s no coincidence some of the best novels ever written involve detailed legal plots.

AM:  That reminds me of something Justice Holmes once said: “Law opens a way to philosophy as well as anything else.”  But it sounds as if you and I would go further and say it might open a way better than many other things do.

AP:  Law is like applied philosophy; it puts general ideas to the test in the real world. If a philosophy remains theoretical, it never really touches what it means to live it, inside it. The answers theoretical philosophy provides are always tentative.

A huge inspiration for my novel was the work of the late French philosopher Emmanuel Levinas. While Levinas’s writing is often arcane and difficult to get through, I find his thinking to be a powerful and searing indictment of basic Enlightenment principles. While I was writing Al-Tounsi, I used Levinas’s insights—directly—to help me construct Justice Sykes’s final concurrence. It was hugely inspiring to find a concrete way to use this philosophy I have long loved. All the questions and problems I was interested in exploring were present in this genuine legal situation, in the constitutional habeas corpus case, Boumediene v. Bush, on which I based my fictional case of Al-Tounsi v. Shaw.

So, yes, I completely agree with you and Justice Holmes!

AM:  So glad we had this opportunity to talk.  Let’s do it again.  
