Allen Porter Mendenhall

Archive for the ‘Arts & Letters’ Category

Is Hacking the Future of Scholarship?

In Arts & Letters, Books, Communication, Ethics, Historicism, History, Humanities, Information Design, Property, Scholarship on November 19, 2014 at 8:45 am


This piece originally appeared here in Pacific Standard in 2013.

Most attorneys are familiar with e-discovery, a method for obtaining computer and electronic information during litigation. E-discovery has been around a long time. It has grown more complex and controversial, however, with the rise of new technologies and the growing awareness that just about anything you do online or with your devices can be made available to the public. Emails, search histories, voicemails, instant messages, text messages, call history, music playlists, private Facebook conversations (not just wall posts)—if relevant to a lawsuit, these and other forms of latent evidence, for better or worse, can be exposed, even if you think they’ve been hidden or discarded.

Anyone who has conducted or been involved with e-discovery realizes how much personal, privileged, and confidential information is stored on our devices. When you “delete” files and documents from your computer, they do not go away. They remain embedded in the hard drive; they may become difficult to find, but they’re there. Odds are, someone can access them. Even encrypted files can be traced back to the very encryption keys that created them.

E-discovery has been used to uncover registries and cache data showing that murderers had been planning their crimes, spouses had been cheating, perverts had been downloading illegal images, and employees had been stealing or compromising sensitive company data or destroying intellectual property. Computer forensics were even used to reveal medical documents from Dr. Conrad Murray’s computer during the so-called “Michael Jackson death trial.”


Computer forensics can teach you a lot about a person: the websites he visits, the people he chats with, the rough drafts he abandons, the videos he watches, the advertisements he clicks, the magazines he reads, the news networks he prefers, the places he shops, the profiles he views, the songs he listens to, and so on. It is fair to say that given a laptop hard drive, a forensic expert could nearly piece together an individual’s personality and perhaps come to know more about that person—secret fetishes, guilty pleasures, and criminal activities—than his friends and family do.

In light of this potential access to people’s most private activities, one wonders how long it will be until academics turn to computer forensics for research purposes. This is already being done in scientific and technology fields, which is not surprising because the subject matter is the machine and not the human, but imagine what it would mean for the humanities. If Jefferson had used a computer, perhaps we would know the details of his relationship with Sally Hemings. If we could get ahold of Shakespeare’s iPad, we could learn whether he wrote all those plays by himself. By analyzing da Vinci’s browsing history, we might know which images he studied and which people he interacted with before and during his work on the Mona Lisa—and thus might discover her identity.

There are, of course, government safeguards in place to prevent the abuse of, and unauthorized access to, computer and electronic data: the Wiretap Act, the Pen Register and Trap and Trace Devices statute, and the Stored Wire and Electronic Communications Act come to mind. Not just anyone can access everything on another person’s computer, at least not without some form of authorization. But what if researchers could obtain authorization to mine computer and electronic data for the personal and sensitive information of historical figures? What if computer forensics could be used in controlled settings and with the consent of the individual whose electronic data are being analyzed?

Consent, to me, is crucial: It is not controversial to turn up information on a person if he voluntarily authorized you to go snooping, never mind that you might turn up something he did not expect you to find. But under what circumstances could computer forensics be employed on a non-consensual basis? And what sort of integrity does computer or electronic information require and deserve? Is extracting data from a person’s laptop akin to drilling through a precious fresco to search for lost paintings, to excavating tombs for evidence that might challenge the foundations of organized religion and modern civilization, or to exhuming the bodies of dead presidents? Surely not. But why not?

We have been combing through letters by our dead predecessors for some time. Even these, however, were meant for transmission and had, to that end, intended audiences. E-discovery, by contrast, provides access to things never meant to be received, let alone preserved or recorded. It is the tool that comes closest to revealing what an individual actually thinks, not just what he says he thinks, or for that matter, how and why he says he thinks it. Imagine retracing the Internet browsing history of President Obama, Billy Graham, Kenneth Branagh, Martha Nussbaum, Salman Rushdie, Nancy Pelosi, Richard Dawkins, Toni Morrison, Ai Weiwei, or Harold Bloom. Imagine reading the private emails of Bruno Latour, Ron Paul, Pope Francis, Noam Chomsky, Lady Gaga, Roger Scruton, Paul Krugman, Justice Scalia, or Queen Elizabeth II. What would you find out about your favorite novelists, poets, musicians, politicians, theologians, academics, actors, pastors, judges, and playwrights if you could expose what they did when no one else was around, when no audience was anticipated, or when they believed that the details of their activity were limited to their person?

This is another reason why computer and electronic data mining is not like sifting through the notes and letters of a deceased person: having written the notes and letters, a person is aware of their content and can, before death, destroy or revise what might appear unseemly or counter to the legacy he wants to promote. Computer and electronic data, however, contain information that the person probably doesn’t know exists.

More information is good; it helps us to understand our universe and the people in it. The tracking and amassing of computer and electronic data are inevitable; the extent and details of their operation, however, cannot yet be known. We should embrace—although we don’t have to celebrate—the technologies that enable us to produce this wealth of knowledge previously unattainable to scholars, even if they mean, in the end, that our heroes, idols, and mentors are demystified, their flaws and prejudices and conceits brought to light.

The question is, when will we have crossed the line? How much snooping goes too far and breaches standards of decency and respect? It is one thing for a person to leave behind a will that says, in essence, “Here’s my computer. Do what you want with it. Find anything you can and tell your narrative however you wish.” It is quite another thing for a person never to consent to such a search and then to pass away and have his computer scanned for revealing or incriminating data.

It’s hard to say what crosses the line because it’s hard to know where the line should be drawn. As Justice Potter Stewart said of hard-core pornography, “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.” Once scholars begin—and the day is coming—hacking devices to find out more about influential people, the courts and the academic community will be faced with privacy decisions to make. We will have to ask if computer and electronic data are substantially similar to private correspondence such as letters, to balance the need for information with the desire for privacy, to define what information is “private” or “public,” and to honor the requests of those savvy enough to anticipate the consequences of this coming age of research.

Amid this ambiguity, one thing will be certain: Soon we can all join with Princess Margaret in proclaiming, “I have as much privacy as a goldfish in a bowl.” That is good and bad news.

Paul H. Fry on Deconstruction, Part II

In American Literature, Arts & Letters, Books, Epistemology, Fiction, History, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Politics, Postmodernism, Rhetoric, Scholarship, Semiotics, Teaching, The Academy, Western Civilization, Western Philosophy, Writing on November 12, 2014 at 8:45 am

Below is the ninth installment in the lecture series on literary theory and criticism by Paul H. Fry. The previous installments are available here, here, here, here, here, here, here, here, here, and here.

Free Not to Vote

In America, Arts & Letters, Austrian Economics, Libertarianism, News and Current Events, Politics on October 22, 2014 at 8:45 am


This piece first appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

The 2014 U.S. midterm elections are coming up, and I don’t intend to vote. A vote is like virginity: you don’t give it away to the first flower-bearing suitor. I haven’t been given a good reason, let alone flowers, to vote for any candidate, so I will stay home, as well I should.

This month, my wife, a Brazilian citizen, drove from Auburn, Alabama, to Atlanta, Georgia, on a Sunday morning to cast her vote in Brazil’s presidential election. She arrived at the Brazilian consulate and waited in a long line of expatriates only to be faced with a cruel choice: vote for the incumbent socialist Dilma Rousseff of the Workers’ Party; for the socialist Aécio Neves of the Brazilian Social Democracy Party, who is billed as a center-right politician; for the environmentalist socialist Marina Silva of the Socialist Party; or for any of the other socialist candidates who were polling so low that they had no chance of victory. Brazil maintains a system of compulsory voting in addition to other compulsory schemes such as conscription for all males aged 18.

Logan Albright recently wrote about the folly of compulsory voting, support for which is apparently growing in Canada. He criticized the hypocrisy of an allegedly democratic society mandating a vote and then fining or jailing those who do not follow the mandate. He also pointed out the dangers of forcing uneducated and uninformed citizens to vote against their will. This problem is particularly evident in Brazil, where illiterate candidates have exploited election laws to run absurd commercials and to assume the personae of silly characters such as a clown, Wonder Woman, Rambo, Crazy Dick, and Hamburger Face, each of which is worth googling for a chuckle. The incumbent clown, by the way, was just reelected on the campaign slogan “it can’t get any worse.” Multiple Barack Obamas and Osama bin Ladens were also running for office, as was, apparently, Jesus. The ballot in Brazil has become goofier than a middle-school election for class president.

Even in the United States, as the election of Barack Obama demonstrates, voting has become more about identity politics, fads, and personalities than about principle or platform. Just over a decade ago, Arnold Schwarzenegger became the Governor of California amid a field of second-rate celebrities while a former professional wrestler (the fake, not the Olympic, kind of wrestling), Jesse “the Body” Ventura, was winding up his term as the Governor of Minnesota. Today comedian Al Franken holds a seat in the United States Senate. It turns out that Brazil isn’t the only country that can boast having a clown in office.

No serious thinker believes that a Republican or Democratic politician has what it takes to boost the economy, facilitate peace, or generate liberty. The very function of a career politician is antithetical to market freedom; no foolish professional vote-getter ought to have the power he or she enjoys under the current managerial state system, but voting legitimates that power.

It is often said, “If you don’t vote, you can’t complain.” The counterpoint is that voting ensures your complicity with the policies that elected politicians will enact. If you don’t vote, you lack complicity. You are not morally blameworthy for resisting the system that infringes basic rights or that offends your sense of justice and reason. You have not bestowed credibility on the government with your formal participation in its most sacred ritual. The higher the number of voters who participate in an election, the more legitimacy there is for the favored projects of the elected politicians, and the more likely those politicians are to impose their will on the populace by way of legislation or other legal means.

Refusing to vote can send a message: get your act together or we won’t turn out at the polling stations. Low voter turnout undermines the validity of the entire political system. Abstention also demonstrates your power: just watch how the politicians grovel and scramble for your vote, promise you more than they can deliver, beg for your support. This is how it ought to be: Politicians need to work for your vote and to earn it. They need to prove that they are who they purport to be and that they stand for what they purport to stand for. If they can’t do this, they don’t deserve your vote.

Abstention is not apathy; it is the exercise of free expression, a voluntary act of legitimate and peaceful defiance, the realization of a right.

There are reasonable alternatives to absolute abstention: one is to vote for the rare candidate who does, in fact, seek out liberty, true liberty; another is to cast a protest vote for a candidate outside the mainstream. Regardless, your vote is a representation of your person, an index of your moral and ethical beliefs. It should not be dispensed with lightly.

If you have the freedom not to vote, congratulations: you still live in a society with a modicum of liberty. Your decision to exercise your liberty is yours alone. Choose wisely.

Red Birds at Law Building, A Poem by Jason Morgan

In Arts & Letters, Creative Writing, Humanities, Poetry, Writing on October 15, 2014 at 8:45 am

Jason Morgan is a New Orleans native and grew up mostly in Louisiana and Tennessee. He attended the University of Tennessee-Chattanooga (BA, History and International Studies) and the University of Hawai’i-Manoa (MA, Asian Studies: China focus), and is now ABD at the University of Wisconsin-Madison (Japanese history). He has attended or conducted research at Nagoya University of Foreign Studies, Nagoya University, Yunnan University in Kunming, PRC, and the University of Texas-San Antonio. He’s currently on a Fulbright grant researching Japanese legal history at Waseda University in Tokyo. His topics include case law during the Taishou Period and the broad contextualization of the Tokyo War Crimes Trial. His scholarly work has appeared, or is scheduled to appear, in Modern Age (on American labor history), Japan Review (two reviews of Japanese history monographs), Education About Asia (two reviews of Japanese history textbooks), Human Life Review (on Griswold v. Connecticut; review of a book on Catholics and abortion), Metamorphoses (translation of Tanizaki Jun’ichirou’s Randa no Setsu), Southeast Review of Asian Studies (on Japanese translation work), and in book form (two translations of Mizoguchi Yuuzou on Chinese intellectual history; translation of Ono Keishi on Japanese military financing in WWI and during the Siberian Intervention). He has also written for the College Fix and College Insurrection.

Red Birds at Law Building

It is astonishing that we
live in the same world, yet in two
I see the same things that they see,
do (almost) everything they do

but they sit on a sill and sing
outside today’s exam in law:
these are two very different things,
two very different kinds of awe

Paul H. Fry on Deconstruction, Part I

In American Literature, Arts & Letters, Books, History, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Postmodernism, Rhetoric, Rhetoric & Communication, Scholarship, Teaching, Western Philosophy, Writing on October 1, 2014 at 8:45 am

Below is the eighth installment in the lecture series on literary theory and criticism by Paul H. Fry. The previous installments are available here, here, here, here, here, here, here, here, and here.

Review of “Cheating Lessons,” by James M. Lang

In Academia, America, Arts & Letters, Book Reviews, Books, Humanities, Pedagogy, Teaching on September 24, 2014 at 8:45 am


This review originally appeared in Academic Questions (2014).

A few years ago, when I was teaching composition courses at Auburn University, I had a freshman from Harlem in my class. He had traveled from New York to Alabama to accept a scholarship and become the first person in his family to attend college. He was kind and thoughtful, and I liked him very much, but he was woefully unprepared for higher education; he had trouble comprehending more than a few paragraphs and could not write basic sentences. The university, however, was proud of this recruit, who contributed both geographic and racial diversity to the otherwise (relatively) non-diverse student body.

Encouraged by his tenacity, I met with this student regularly to teach him sentence structure and to help him turn his spoken words into written sentences. Although he improved by degrees over the course of the semester, he was never able to write a complete coherent paragraph.

During the last weeks of class, I informed him that he needed to earn at least a C+ on his final paper to avoid repeating the course. He was conspicuously absent from class whenever preliminary drafts were due, and he never responded to my prodding emails. Shortly before the due date, he materialized in my office and presented a piece of paper that contained several sentences. He asked me questions and attempted to record my responses on his paper. I reminded him that although I was happy to offer guidance, he needed to submit original work. He nodded and left my office. When, at last, he submitted his final paper, it consisted of roughly four intelligible paragraphs that regrettably had nothing to do with the assignment. I inserted these paragraphs into a Google search and discovered that they were lifted, verbatim, from a Wikipedia article unrelated to the assignment. I failed the student but showed him mercy—and spared the university embarrassment—by not reporting him to the administration for disciplinary action.

To this day I wonder if there was something I could have done differently to prevent this student from plagiarizing, or whether his cheating was the inevitable consequence of being unprepared for university study. Many teachers have similar stories.

Academic dishonesty, a topic now admirably undertaken by James M. Lang, has received more scholarly treatment than I was aware of before reading Cheating Lessons: Learning from Academic Dishonesty. Like many of us, Lang grew interested in the subject because of his experiences with students who cheated in his classes. The more research he did on academic dishonesty, the more frustrated he became with “the same basic prescriptions” that were either quixotic or impracticable for one faculty member to undertake alone. One day, Lang realized that if he “looked through the lens of cognitive theory and tried to understand cheating as an inappropriate response to a learning environment that wasn’t working for the student,” he could “empower individual faculty members to respond more effectively to academic dishonesty by modifying the learning environments they constructed.”

Lang’s goal is not to score points or court confrontation, but simply to help teachers and administrators to reduce cheating by restructuring the content and configuration of their courses and classrooms.

Lang divides Cheating Lessons into three parts. The first is a synthesis of the existing scholarly literature on academic dishonesty that concludes with four case studies, about which little needs to be said here. The second part consists of practical guidance for teachers who wish to structure their classrooms to minimize cheating and to cultivate the exchange of ideas. And the third, which is an extension of the second, speculates about potential changes to curricula and pedagogy that would promote academic integrity not just in the classroom, but across campus.

Most original are parts two and three, which are premised on the structuralist assumption that systems shape and inform the production of knowledge. The treatment of academic dishonesty as a symptom of deterministic models and paradigms makes this book unique. If the models and paradigms can be changed, Lang’s argument runs, then academic dishonesty might decline: the shift needs to be away from the “dispositional factors that influence cheating—such as the student’s gender, or membership in a fraternity or sorority, and so on”—toward “contextual factors,” the most significant of which is “the classroom environment in which students engage in a cheating behavior” (emphases in original). What’s exciting about the structuralist paradigm—if it’s accurate—is that teachers and administrators have the power and agency to facilitate constructive change.

But what if the structuralist paradigm isn’t correct? What if dispositional factors are more determinative than contextual factors in generating academic dishonesty? Lang’s argument depends upon a profound assumption that he expects his readers to share. It’s most likely that dispositional and contextual factors are interactive, not mutually exclusive: consider the student who is not as intelligent as his peers and who resorts to cheating because of his insecurity and the pressure on him to succeed. Lang is onto something, though: students are less likely to learn in an environment that compels them “to complete a difficult task with the promise of an extrinsic reward or the threat of punishment” than they are in an environment that inspires them “with appeals to the intrinsic joy or beauty or utility of the task itself” (emphasis in original). In other words, “in an environment characterized by extrinsic motivation, the learners or competitors care about what happens after the performance rather than relishing or enjoying the performance itself” (emphasis in original).

How does Lang propose that teachers and administrators structure their courses and curricula to foster what he calls “intrinsic motivation” (as against “extrinsic rewards”) among students? For starters, he urges professors to help students learn for mastery and not for grades, to lower the stakes per assignment by multiplying the options for students to earn points or credit, and to instill self-efficacy by challenging students and by affording them increased opportunities to demonstrate their knowledge. In the abstract, these suggestions seem obvious and unhelpful, so Lang backs them up with interviews with accomplished teachers as well as anecdotes about successful classroom experiments: the improvising by Andy Kaufman as he taught Russian literature to prison inmates, for instance, or the unique grading system implemented by John Boyer at Virginia Tech. All the tactics and approaches discussed and promoted by Lang can be traced back to the premise that “the best means we have to reduce cheating is to increase motivation and learning.”

Teachers and administrators are forever trying to motivate their students to learn. It’s easier to conceive of this goal, however, than to achieve it. Teachers everywhere seek to inspire their students to love and pursue knowledge, and despite a plethora of opinions about how best to do so, no general consensus has arisen to establish a definitive course of action for all students and disciplines. Many teachers chose their profession and discipline because they relished their own education and wanted to pass on their knowledge and love of learning to others. Lang’s insistence that teachers inspire a passion for learning is hardly novel; rather, it is the touchstone and stands in contradistinction to the utilitarian, standardized, test-centered, and results-oriented educational strategies that politicians, bureaucrats, and policy wonks now sponsor and defend. In this respect, Cheating Lessons is a refreshing alternative; it’s written by an educator for educators and not, thank goodness, for semiliterate politicians and their sycophantic advisers.

One thing this book is not: a template or checklist that you can follow to construct your own productive learning environment for students. Each learning environment is contextual; one model will not suit every setting and purpose. Because Lang cannot and does not provide step-by-step how-to instructions, Cheating Lessons borders on the self-help genre and is more inspirational and aspirational than it is informational. And Lang’s meandering style—for example, his digressions about Robert Burns and coaching youth sports teams—is disarming enough not only to charm but also to contribute to the impression that Cheating Lessons is “light” reading.

Lang can overdo the playfulness and make exaggerated claims. Early on he quotes a Harvard administrator complaining in 1928 about the problem of cheating among students, an example that’s meant to refute the assumption that “we are in the midst of a cheating epidemic, and that the problem is much worse now than it was in the idyllic past.” Lang adds that he hopes to convince us that “cheating and higher education in America have enjoyed a long and robust history together.” But it’s not as if 1928 is ancient history. Data about academic dishonesty since that time will not convince most readers that there were as many cheating students in the one-room schoolhouses of the nineteenth century, when fewer people had access to formal education, as there are today. Perhaps anticipating such criticism, Lang invites us to “hop in our time machine and leap across centuries” to consider the cheating cultures of the ancient Greeks and of Imperial China “over the course of [a] fourteen-hundred-year history.” But surely the substantial data we have gathered on the twentieth- and twenty-first-century academy cannot be compared to the limited and circumstantial data garnered about these early cultures; surely “illicit communication” by “cell phones” is not comparable to the use of cheat sheets in nineteenth-century China. It seems preposterous to suggest that academic dishonesty in contemporary America exists to the same extent it did centuries ago on different continents and among different peoples with different principles and priorities.

Nevertheless, even readers skeptical of Lang’s structuralist premise and apparent optimism will find much in Cheating Lessons to contemplate and to amuse. Unfortunately, however, even after having read the book I’m still not sure what I could have done differently to prevent my student from cheating.

The Immunity Community

In America, American History, Arts & Letters, Britain, History, Humanities, Jurisprudence, Justice, Law, Libertarianism, Philosophy on September 10, 2014 at 8:45 am


This piece first appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

The doctrine of sovereign immunity derives from the English notion that “the king can do no wrong” and hence cannot be sued without his consent. The purpose of this doctrine was, in England, from at least the Middle Ages until the eighteenth century, to bar certain lawsuits against the monarch and his or her ministers and servants. With the rise of the English Parliament after the death of Elizabeth I, government officers and politicians sought to gain the power of immunity that the monarch and his or her agents had enjoyed.

In practice, however, English subjects were not totally deprived of remedies against the monarch or the government. The doctrine of sovereign immunity was not an absolute prohibition on actions against the crown or against other branches of government;[1] subjects could avail themselves of petitions of right or writs of mandamus, for instance, and monarchs fearful of losing the support of the people would often consent to be sued.

It was not until the monarchy had been demonstrably weakened that the doctrine of sovereign immunity began to be espoused with added urgency and enforced with added zeal. In the late eighteenth century, Sir William Blackstone intoned in his Commentaries on the Laws of England that the king “is not only incapable of doing wrong, but even of thinking wrong: he can never mean to do an improper thing: in him is no folly or weakness.” These lines convert sovereign immunity into sovereign infallibility, a more ominous yet more dubious pretension.

Once the monarchy had been abolished altogether, the idea that the sovereign had to consent to be sued no longer held credence. As Louis L. Jaffe explains, “Because the King had been abolished, the courts concluded that where in the past the procedure had been by petition of right there was now no one authorized to consent to suit! If there was any successor to the King qua sovereign it was the legislature,” which, having many members subject to differing constituencies, was not as accountable as the monarch had been to the parties seeking to sue.[2]

The principle of sovereign immunity carried over from England to the United States, where most states have enshrined in their constitutions an absolute bar against suing the State or its agencies and officers whose actions fall within the scope of official duties. The Eleventh Amendment to the U.S. Constitution likewise states that “the Judicial power of the United States shall not be construed to extend to any suit in law or equity, commenced or prosecuted against one of the United States by Citizens of another State, or by Citizens or Subjects of any Foreign State.” This provision, which applies only in federal courts and which does not on its face prohibit a lawsuit against a state by a citizen of that same state, was adopted in response to the ruling in Chisholm v. Georgia (1793), a case that held sovereign immunity to have been abrogated and that vested in federal courts the authority to preside over disputes between private citizens and state governments.

Notwithstanding the complex issues of federalism at play in the Chisholm decision and in the Eleventh Amendment, the fact remains that the doctrine of sovereign immunity has been applied with widening scope and frequency since the states ratified the Eleventh Amendment in 1795. The U.S. Supreme Court has contributed to the doctrine’s flourishing. “The Supreme Court’s acceptance of sovereign immunity as constitutional principle,” explains one commentator, “depends on its determination of the intent of the Framers, which ignores a great deal of historical evidence from the time of the founding and relies primarily on a discredited account of the Eleventh Amendment first articulated in the 1890 case of Hans v. Louisiana.”[3]

State and federal courts have now built an impregnable wall of immunity around certain state and federal officers. The sovereign immunity that is enshrined in state constitutions is, in theory, not absolute because it is conferred only on certain agents and officers and does not prohibit lawsuits to enjoin such agents and officers from performing unconstitutional or other bad acts. In practice, however, the growth of qualified immunities, which is in keeping with the growth of government itself, has caused more and more agents of the State to cloak themselves in immunity.

Bus drivers, teachers, coroners, constables, high school coaches, doctors and nurses at university hospitals, security guards, justices of the peace, government attorneys, legislators, mayors, boards of education and health, university administrators, Indian reservations, prison guards and wardens, police officers and detectives, janitors in government facilities, licensing boards, tax assessors, librarians, railroad workers, government engineers, judges and justices, school superintendents and principals, towing companies, health inspectors, probation officers, game wardens, museum docents and curators, social workers, court clerks, dog catchers, contractors for public utilities, public notaries, tollbooth attendants, airport traffic controllers, park rangers, ambulance drivers, firefighters, telephone operators, subway workers, city council members, state auditors, agricultural commissioners—all have sought to establish for themselves, with mixed degrees of success, the legal invincibility that comes with being an arm of the state.

Yet the idea that “the king can do no wrong” makes no sense in a governmental system that has lacked a king from its inception. Its application as law has left ordinary citizens with limited recourse against governments (or against people claiming governmental status for the purpose of immunity) that have committed actual wrongs. When the government, even at the state level, consists of vast bureaucracies of the kind that exist today, the doctrine of sovereign immunity becomes absurd. If it is true that in nine states and in the District of Columbia the government employs more than 20% of all workers, imagine how many people are eligible to claim immunity from liability for their tortious conduct and bad acts committed on the job.

Local news reports are full of stories about government employees invoking the doctrine of sovereign immunity; few such stories find their way into the national media. Judge Wade McCree of Michigan, for instance, recently carried out an affair with a woman who was a party in a child-support case on his docket, having sexual intercourse with her in his chambers and “sexting” her even on the day she appeared as a witness in his courtroom. Although McCree was removed from office, he was immune from civil liability. An airport in Charleston, West Virginia, is invoking the doctrine of immunity to shield itself from claims that it contributed to a chemical spill that contaminated the water supply. Officer Darren Wilson may be entitled to immunity for the shooting of Michael Brown, depending on how the facts unfold in that investigation.

The U.S. Supreme Court once famously declared that the doctrine of sovereign immunity “has never been discussed or the reasons for it given, but it has always been treated as an established doctrine.”[4] A disestablishment is now in order. The size and scope of government is simply too massive on the state and national level to sustain this doctrine that undermines the widely held belief of the American Founders that State power must be limited and that the State itself must be held accountable for its wrongs. Friedrich Hayek pointed out that the ideal of the rule of law requires the government to “act under the same law” and to “be limited in the same manner as any private person.”[5] The doctrine of sovereign immunity stands in contradistinction to this ideal: it places an increasing number of individuals above the law.

If the law is to be meaningful and just, it must apply equally to all persons and must bind those who enforce it. It must not recognize and condone privileges bestowed upon those with government connections or incentivize bad behavior within government ranks. Sovereign immunity is a problem that will only worsen if it is not addressed soon. The king can do wrong, and so can modern governments. It’s time for these governments to be held accountable for the harms they produce and to stop hiding behind a fiction that was long ago discredited.

________

[1]See generally, Louis L. Jaffe, “Suits Against Governments and Officers: Sovereign Immunity,” 77 Harvard Law Review 1 (1963).

[2]Jaffe at 2.

[3]Susan Randall, “Sovereign Immunity and the Uses of History,” 81 Nebraska L. Rev. 1, 4 (2002-03).

[4]U.S. v. Lee, 106 U.S. 196, 207 (1882).

[5]F. A. Hayek, The Constitution of Liberty, Vol. 17 of The Collected Works of F.A. Hayek, ed. Ronald Hamowy (Routledge, 2011), p. 318.

Are Lawyers Illiterate?

In Arts & Letters, Books, Essays, History, Humanities, Imagination, Law, Literature, Philosophy, Western Civilization, Western Philosophy on September 3, 2014 at 8:45 am


This piece originally appeared here in The Imaginative Conservative.

Webster’s defines “intelligent” as “endowed with intelligence or intellect; possessed of, or exhibiting, a high or fitting degree of intelligence or understanding.” This modern understanding of “intelligence” as an innate disposition or propensity differs from earlier understandings of the word as meaning “versed” or “skilled.” Milton, for instance, in Paradise Lost, calls the eagle and the stork “intelligent of seasons,” by which he meant that these birds, because of their experience, were cognizant of the seasons.

The older meaning of “intelligent” has less to do with native endowment than it does with gradual understanding. The older meaning, in other words, is that intelligence is acquired by effort and exposure rather than fixed by biological inheritance or natural capacity: one may become intelligent and is not just born that way; intelligence is a cultivated faculty, not an intrinsic feature.

Because of the altered signification of “intelligent,” we use today different words to describe the older meaning: erudite, knowledgeable, informed, traveled, educated. These words seem to us more palatable than their once-favored predecessors: civilized, polished, cultured, genteel, refined. I myself prefer words like “lettered” or “versed” that imply a knowledge of important books and the humanities generally.

The most apt term in this regard is also the most butchered in the current lexicon: “literate.” Contrary to what appears to be the prevailing assumption, “literate” does not simply refer to an ability to read. According to Webster’s, “literate” means “instructed in letters, educated; pertaining to, or learned in, literature.”

Not just to read, but to read well and widely—that is how you become “literate.” Accepting this traditional meaning, I question how many lawyers are or can become literate.

In the 1980s, Ithiel de Sola Pool, a professor of communications and media, determined that the average American adult reads approximately 240 words per minute. At that rate, it would take a person around 2,268.36 minutes (or 37 hours, 48 minutes, and 21.6 seconds) to read War and Peace, which comes in at 544,406 words. If that sounds encouraging—ever wanted to read War and Peace in a day-and-a-half?—consider these offsetting variables: reading at one sitting slows over time; attention span and memory recall are limited; the mind can be exercised only so much before it requires rest; people cannot constantly read for 2,268.36 minutes without going to the restroom or eating or daydreaming, among other things; a healthy lifestyle entails seven to nine hours of sleep per day; large portions of the day are spent carrying out quotidian operations, including showering, cooking, brushing teeth, commuting to and from work, getting dressed and undressed, answering phone calls, reading emails, cleaning, filling out paperwork, paying bills, and so on. Pool, moreover, was not using a text like War and Peace to gather his data, and his subjects were not writing in the margins of their books, taking notes on their laptops, or pausing to engage others in critical conversations about some narrative.

The National Association for Legal Career Professionals has estimated that lawyers at large firms bill on average 1,859 hours per year and work 2,208 hours per year. These numbers are more troubling in view of the fact that large law firms require their attorneys to attend functions with clients and potential clients, time that is neither billable nor considered “working hours.”

If there are around 8,760 hours in a year, and if a healthy person spends about 2,920 of those sleeping, there remain only around 5,840 hours per year for everything else. If “everything else” consisted of nothing—nothing at all—except reading War and Peace, then a lawyer at a large law firm could read that book about 154 times a year. But of course this is not possible, because no person can function as a machine functions. Once the offsetting variables are accounted for—and I have listed only a few that immediately spring to mind, and these for people with no families—it becomes apparent that it is nearly impossible for a lawyer to read more than about four lengthy or difficult books each month, and only the most diligent and disciplined can accomplish that.
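For readers who want to check the arithmetic, here is a minimal sketch in Python that reproduces the back-of-the-envelope figures above. The reading rate, word count, and sleep allowance are the essay’s own assumptions, and the calculation deliberately ignores every other claim on a person’s time.

```python
# Reproduce the essay's back-of-the-envelope reading arithmetic.
# Assumptions taken from the text: 240 words per minute (Pool's estimate),
# War and Peace at 544,406 words, and 8 hours of sleep per night.

WORDS_PER_MINUTE = 240
WAR_AND_PEACE_WORDS = 544_406

minutes_to_read = WAR_AND_PEACE_WORDS / WORDS_PER_MINUTE  # ~2,268.36 minutes
hours_to_read = minutes_to_read / 60                      # ~37.8 hours

hours_per_year = 365 * 24                                 # 8,760 hours
sleeping_hours = 365 * 8                                  # 2,920 hours
waking_hours = hours_per_year - sleeping_hours            # 5,840 hours

# How many times an uninterrupted reader could, in theory, finish the novel.
readings_per_year = waking_hours / hours_to_read          # ~154

print(f"One reading: about {hours_to_read:.1f} hours")
print(f"Theoretical readings per year: about {readings_per_year:.0f}")
```

Even this generous estimate assumes no work, no meals, and no family, which is the point: the real margin left for serious reading is far smaller.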

Numbers can lead us astray, so let us consider some anecdotal evidence—my own testimony—which suggests that most lawyers are illiterate, or perhaps that lawyers have to try really hard to become literate or to avoid losing their literacy.

I am a lawyer, one who considers himself literate but increasingly in danger of becoming illiterate the longer I remain in my chosen profession. My hope is that literacy stays with you, that if you “frontload,” as it were, you can build a wide enough base to allow for slack in later years.

In 2013, I made an effort to overcome the time restrictions of my job to read through several canonical texts of Western Civilization. For the most part I undertook a book a week, although, because of scheduling constraints, I read what I took to be the most important or most famous sections of the lengthier books and volumes such as Aquinas’s Summa Theologica, a work that would require years of study to fully appreciate. I found myself, on many Thursday evenings, reading so rapidly to finish the text at hand that I could not enjoy myself or absorb the nuances and complexities established by the author.

Reading only one book a week when you are intelligent enough to read more is shameful and disgraceful, the sacrifice of a gift. During graduate school, I could read five or six books a week and can recall more than one week when I read a book a day. But each day I spend working as a lawyer, I am less able to digest the books I consume and to consume the books necessary for intellectual nourishment.

Economists use the term “opportunity cost” to refer to the value of what you forgo when you pursue one course of action rather than another. The cost of becoming a lawyer is giving up literacy or making its attainment more difficult; the gain, in theory, is a higher salary and financial stability. Whether the gain neutralizes the loss depends on one’s preferences. I myself would not trade for a million dollars the opportunity to read Tolstoy or Shakespeare or Aristotle or Santayana.

To achieve the admiration enjoyed by lawyers, other professionals must do their jobs several times better. Happily, this is not a high bar. That is why people prefer the company of doctors. It is not that lawyers are incompetent or unskilled; it is that they do not put their faculties to good use. All people think, but it is only by degree and by the object of their thought that the literate are distinguished from the illiterate. To put their minds to humane use would improve lawyers’ reputations considerably and call into question that axiom popularized by one of Dickens’s characters: “If there were no bad people, there would be no good lawyers.”

The way I see it, you can spend all your life billing clients and pushing paper under great stress, investing your talents and resources in prospects that yield no intellectual returns, or you can spend your life establishing high standards of reason, understanding, and creativity by studying the most important and influential works that humans have produced through the ages. You can spend all your time transacting business, prosecuting and defending lawsuits, and preparing briefs and memoranda, or you can cultivate discernment and understanding. The options are not mutually exclusive: I have overstated to draw a sharp contrast, but the point remains.

Do not misunderstand me: working hard and earning profits are not only good and healthy activities but personally fulfilling. Yet they must be supplemented with humane contemplation and the private study of important ideas. Industry and innovation are requisite to a high quality of life, a robust economy, and human flourishing—and they make possible the time and leisure that enable some people to create great art and literature. Not everyone can be literate, and that is a good thing.

It is just that many lawyers never learn to live well and wisely, to place their seemingly urgent matters into perspective, or to appreciate, as Aristotle did, the virtues of moderation. This failure is directly related to lawyers’ neglect of history and philosophy and to their suppression of the moral imagination that works of good literature can awaken. This failure, as well, puts lawyers at a distinct disadvantage when it comes to spiritual, moral, and intellectual pursuits. As Mark Twain quipped, “The man who does not read good books has no advantage over the man who cannot read them.”

Lawyers are illiterate, most of them anyway. Trust them to handle your real estate closings or to manage your negligence claims, to finalize your divorce or to dash off angry letters to your competitors, but do not trust them to instruct you on plain living and high thinking. There are exceptions—Gerald Russello and Daniel Kornstein are two—but generally lawyers are not to be consulted on matters of importance to the soul. For those, we have good books, and with luck, the people who write and read them.

Troy Camplin Reviews “Napoleon in America,” a Novel by Shannon Selin

In America, American History, Arts & Letters, Book Reviews, Books, Creative Writing, History, Humanities, Novels, The Novel, Writing on August 20, 2014 at 8:45 am
Shannon Selin

Napoleon in America is a “what-if” historical novel that combines a variety of styles – epistolary, newspaper article, and regular novelistic narrative – to create a work that reads like a very well-written narrative of history. Given that the author is necessarily working with an entirely fictional world – one in which Napoleon escapes from St. Helena to the United States – the fact that she can create such an effect is quite remarkable. The reader is made to feel as if he or she is reading about actual historical events. Of particular note is the fact that Selin creates the impression that we are reading a Great Men History book, which makes it rather distinctive. As such, it is going against the direction in which historical studies have, themselves, gone.

Much contemporary history deals with everyday life, local histories, etc. But given that the protagonist of this novel, Napoleon, is the kind of person who is distinctly bored with everyday life – is too big for everyday life – we should not be surprised to find a story dominated by the overwhelming presence of the personality of Napoleon. It is perhaps for this very reason that the novel becomes involved in the great movements of Napoleon rather than the intimate details of his life. These aspects are touched on here and there, of course, but in the end, we remember Napoleon the Conqueror, not Napoleon the almost-died-when-he-got-to-America. Napoleon quickly recovers to dominate the novel with his personality. But this personality is not one changed by circumstances. He is the Napoleon we all love and loathe. He cannot settle down. He has to conquer.

Thus, with Selin’s novel, we have a complete inversion. The novel has, historically, dealt with everyday people in their everyday lives. The actions of most novelistic characters do not have a major impact on historical events. If we look at the way histories were written over the same period as the rise of the European novel (which includes American and Canadian literature and, stylistically, much literature written in the rest of the world during the 20th century), we primarily see the complete opposite: an interest in major figures and their major effects on history dominates most historical narratives over this same time period. However, we see a shift within history toward the same kinds of concerns we see in novels: everyday people, the histories of institutions, local histories, etc. Thus, we should not be surprised to find novels picking up the kinds of narratives we once found in histories.

Along with the Big Men of the time, Selin deals with the Big Ideas of the time; of course, the Big Men are often the Big Men precisely because they discuss and try to enact the Big Ideas of their time. Liberalism and dictatorship and whether Napoleon is really a liberal or little better than the kings he likes to depose are discussed – as no doubt they were, in fact, discussed historically. We see some of the conflicts within French Liberalism – and some of the contradictions. Was it a mere coincidence that French Liberalism led to the Terror and to the Empire under Napoleon? Or was it simply bad luck? Pro- and anti-Napoleon liberals are unified in their opposition to the Bourbons, but the question is raised as to whether replacing one monarch with another is really an improvement. Yet, there seems a willingness, even among those who oppose Napoleon, to support revolution against the Bourbons, even if it results in another Napoleon (literally or figuratively). Along these lines, Selin does a magnificent job of showing how blinding the opposition to the Bourbons is in the decision by the French government to invade Spain. The King in fact opposes the invasion, but ends up being talked into it; the liberals believe the invasion is a Bourbon plot and evidence of his being a cruel dictator. The reality is more humdrum than the conspiracy theory the liberals are desperate to believe.

Overall, Selin’s book goes beyond what we would expect to find in a historical novel whose main character is a major historical figure. A traditional historical novel would have the characters doing all the major, public actions the history books tell us happened. Selin has to do something quite different. She has to first know what did in fact happen during the historical period in question; she then has to understand Napoleon well enough to understand what he might do in circumstances other than those in which he did, in fact, find himself; and then she has to create a realistic alternative to what did in fact happen, understanding the butterfly effects of a Napoleon in America. It is a garden of forking paths, and one can go in any number of directions. To this end, Selin is certainly effective in her choice of direction. The great uncertainty created by Napoleon’s presence in America is well demonstrated. The U.S. government does not seem to know what to do with him. We are, after all, talking about a young country still learning where it fits in the world. It has the benefit of being separated from Europe – where all the action lies – by a large ocean. But the action has come to America’s shores when Napoleon escapes St. Helena. The uncertainty that leaves Napoleon free to raise an army and wander into Texas is well within the realm of possibilities. As is the naïve belief by some – such as James Bowie – that Napoleon can be “handled.”

The majority of the novel is dominated by the spirit of uncertainty and worry. All the action comes at the end of the novel, when Napoleon finally does invade Texas. And even then, we are left with a great deal of uncertainty. Napoleon has won a battle and established himself in San Antonio; however, we are left with the question of what will happen next. Napoleon in America has the feeling of the first novel in a series. It would not surprise me if Napoleon in Texas were to follow. There is a great deal more to this story that could be explored. Will Napoleon be able to create a long-term presence in Texas? What will be the response of Mexico? What will be the response of the American government? What will be the response of the American settlers? Will the people of Kentucky and Tennessee volunteer to fight for Texas independence under Napoleon as they did for its independence under Austin? Is Napoleon just preparing the way for the Americans to take over, making it a bit easier than it was historically? Or is he perhaps making it a bit harder, since a Mexican government may take Napoleon as a much more serious threat to the government of Mexico than those who only wanted an independent Texas?

For those who enjoy the What-If History genre, these are fun questions to consider. I find it hard to imagine that anyone who reads Napoleon in America – which should include most of those who enjoy historical fiction – would fail to want these questions answered in a sequel.

Troy Camplin holds a Ph.D. in humanities from the University of Texas at Dallas. He has taught English in middle school, high school, and college, and is currently taking care of his children at home. He is the author of Diaphysics, an interdisciplinary work on systems philosophy; other projects include the application of F.A. Hayek’s spontaneous order theory to ethics, the arts, and literature. His play “Almost Ithacad” won the PIA Award from the Cyberfest at Dallas Hub Theater.

The Life of Julius Porter Farish

In American History, American Literature, Arts & Letters, History, Southern History, The South on August 13, 2014 at 8:45 am
Sarah Elizabeth Farish

Sarah Elizabeth Farish is a graduate of the University of Illinois, where she majored in English and Secondary Education. She is starting her first year teaching at Wheaton Academy in Wheaton, Illinois, this fall. She also coaches cross-country. While a northerner by residence, she considers herself a southerner at heart and loves Southern culture and literature very much.

The words “Deep South” stir a passion in our souls that they might not stir up in other folks. Hearing those syllables – pronounced more like “Deeep Sow-uth” in our family – causes several images to scroll through our minds: images of cotton plantations, Spanish moss, white-columned houses, small towns, Coca-Cola plants, Auburn University, and more.

For some reason hearing those words and seeing those images makes me think in black and white, as if the Deep South was a place frozen in time where things haven’t changed and Scout Finch is still strolling around the neighborhood looking for Jem and Dill.

And for many of us, it is that place.

It’s hard to say when and where my family’s story begins, but this is going to be the story of my grandfather, Julius “Jay” Porter Farish III.

On November 15, 1929, the small town of Atmore, Alabama, needed something to hope in. The Great Depression had just started sinking its deep claws into America’s economy and morale.

Alabama has long been heralded as a state with many troubles, and this is true, but it was especially true during the Depression. Racism was rampant, pockets were empty, and folks were set in their ways, sometimes to a fault. Southerners were in church on Sundays, praying for an end to the Depression, and then working hard all week to bring money home to their families.

The mothers were teachers or stayed and worked at home. Black maids helped the white mothers and cooked and cleaned and then returned to the black neighborhoods to do the same for their own families.

My family, the Farishes, moved to Monroeville, Alabama, during the Depression and brought with them a sensible and strong work ethic. They immediately became involved in the town. This small, unsuspecting town would produce a few famous Americans who would alter American history. I’ll talk about them later.

As soon as the Depression ended, the South, like the rest of the nation, was hit with another blow: World War II. Southerners crowded around their radios, holding handkerchiefs to their faces as tears rolled down their cheeks, and listened to the horrifying news of Pearl Harbor. Many young men suddenly disappeared from town, and folks prayed that the names of these men would not appear in the newspaper on the lists of the injured, the missing, or, worst of all, the dead.

In the nearby town of Opelika, Alabama, Jay’s future wife, Barbara Glenn, was living alongside German prisoners of war. While she and her friends played kick-the-can in the streets, POWs suffered through the Alabama heat but still experienced the Southern hospitality that was characteristic of our family. Her brother John, my great-uncle, was in the Pacific serving his country as a Navy Seabee.

World War II ended and John came home. Despite the fact that he was in his twenties, his hair was white, and it would stay that way until he died. The stress had taken the color from his hair and the joy from his eyes, and he returned a different man.

And then, after what felt like a million years but almost as quickly as it had started, the war ended. The streets were flooded with people rejoicing and kissing and laughing. The liquor flowed and hearts were full. Life seemed as if it were turning back around.

After the war America seemed like a joyful place again. Folks had survived the Great Depression and then a war that had shaken them to their core. Men were returning home, going to college, marrying their sweethearts, and quickly starting families.

Our family moved again, this time to Opelika, Alabama, a town right next door to what we hail as the greatest institution in the United States of America: Auburn University, home of the Tigers, although at the time it was the Alabama Polytechnic Institute. Our passion for Auburn ran deeper than the red clay beneath our feet. To this day we Farishes would give our hearts and souls to see Auburn football win, and even more than that we’d give an arm or a leg (or both) to see them destroy the University of Alabama.

Jay played basketball at Auburn and was all-Southeastern Conference. He was drafted by the Lakers but chose to serve his country in Korea and was there for several years.

In Opelika in the sixties the issue of segregation was unavoidable. Rosa Parks was making news, and our family prayed for her and supported her. Their deeply held Christian beliefs gave them wisdom to see that racism was hurting our society and not helping it.

Our family prayed for Dr. Martin Luther King Jr. and wept when he was assassinated. They were progressive (for their day) in that they put their children in public school while other white parents shuttled their children to the local private schools to keep them away from the black children.

Our family fought the race barrier after they moved to Atlanta, when segregation was illegal but still practiced, and they stood up against racism in the places they lived, ate, and shopped.

In the 1960s the segregation war was in full force. White families were pulling their children out of public schools and placing them in private ones. Protestants were even sending their children to Catholic schools to avoid black schools.

As I said, my grandfather grew up in Monroeville.  He was seen as the town’s athlete from a young age. Nicknamed “Bubber” (pronounced Bubba) in his childhood, he excelled in every sport he played, but mostly basketball and football.

A famous young woman was growing up a few houses down from Bubba, and right across the street from his grandmother’s house. That young woman was Harper Lee, who would write the novel that changed America, To Kill a Mockingbird. Harper, who went by the nickname “Nelle,” was a tomboy, and would often find herself knocking at Bubba’s door and inviting herself into a pickup game of roundball or football.

Nelle’s best friend, Truman Capote, was also in Monroeville during the summer and was known as a bit of a wimp to Bubba and his friends. Whenever they played football, Truman was always the center; however, Bubba and his friends would later joke that Truman accomplished more than they ever would. They mocked him for sitting at the general store and scribbling in his notebooks, but in the end Truman ended up doing just fine.

When Gregory Peck came to Monroeville for the filming of To Kill a Mockingbird, Bubba took him to breakfast and, until he died, loved to tell the story of what a kind man Gregory Peck was.

Bubba’s athletic talent made him the star of his town. He got a scholarship to a small college in south Alabama for a year, and then transferred to Auburn. Monroeville had someone to hope in. Every time Bubba played well (which was often) Monroeville stood behind its man.

He then met Barbara Glenn and they married after a long courtship. Bubba turned down an offer to play for the then-Minneapolis Lakers and instead chose to serve his country in the Korean Conflict. He joined the Air Force and spent several years overseas.

When Jay came home, his family moved to Opelika, Alabama. His three children, Julie, and identical twins Steve and John, were in elementary school. The segregation battle was present even in sleepy Opelika.

Jay and his other family members living in Opelika who had young children were all active in the segregation debate. Nina’s cousin, Winston Smith T, was adamant that they keep their children in public schools.

When all the other parents were taking their white children out of the schools and putting them in private schools, the Farishes stayed in public school. And when the schools hired a black teacher, the Farishes stayed.

Then they moved to Atlanta. Jay joined the Atlanta Country Club to play golf with his work friends. The caddies there were all black men who weren’t allowed to fish on the grounds or play the course unless accompanied by a member, and so Jay made friends with them. He went fishing with them and played with them. He would take his children and grandchildren to fish with the caddies when few other club members would.

Among other things, Jay stood for his faith. His faith in Jesus as the Son of God was the reason that he did all that he did and the reason he broke the barriers he broke.

Because of Jay, I stand up against judgment and hatred based on race, and I refuse to discriminate. My family and I love others with our whole hearts.

And now Jay is gone. However, the legacy he’s left behind for his children and their children will continue to help them stand up for victims of injustice. We are proud of his service to his family, the Auburn family, and his country. But more than that we love him for his love for God.
