Allen Porter Mendenhall

Archive for the ‘America’ Category

Is the Fourteenth Amendment Good?

In America, American History, Arts & Letters, Austrian Economics, Historicism, History, Humanities, Jurisprudence, Law, Liberalism, Libertarianism, Nineteenth-Century America, Philosophy, The Supreme Court on February 18, 2015 at 8:45 am


The original article can be found here. Translated from the English by Mariano Bas Uribe.

Few things divide libertarians like the Fourteenth Amendment to the United States Constitution. Gene Healy has observed that “classical liberals of good faith have found themselves on both sides of the debate.”

On one side are those who praise the amendment for checking the power of the states to prejudge, target, regulate, or use force of any kind to impose discriminatory laws on their citizens. On the other are those who, while acknowledging the problematic nature of state misconduct and immoral state action, are unwilling to consent to the transfer of power from the states to the federal government, and in particular to the federal judiciary.

The division comes down to views of federalism, that is, of the balance or separation between the state and national governments.

Sections One and Five of the Fourteenth Amendment are the most controversial. Section One includes the Citizenship Clause, the Privileges or Immunities Clause, the Due Process Clause, and the Equal Protection Clause, and Section Five grants Congress the authority to enforce the amendment by legislation. These provisions have given greater powers to the national government, permitting federal courts to make the states comply with federal law regarding certain individual rights (or alleged rights).

The United States Supreme Court, in Barron v. Baltimore (1833), held that the Bill of Rights (the first ten amendments to the U.S. Constitution) bound only the federal government and not the state governments. Through the Fourteenth Amendment, which was officially ratified in 1868, the U.S. Supreme Court and the lower federal courts have gradually “incorporated” most of the provisions of the Bill of Rights so that they apply against the states. The federal government has thus been empowered to make state governments comply with provisions that were originally intended only to restrain federal abuses.

If the federal government were the only or the best mechanism for reducing the kind of discrimination and rights violations prohibited by the Fourteenth Amendment, the amendment would be welcome and accepted. But it is not the only conceivable corrective, and besides, is it not counterintuitive for libertarians to applaud and defend an increase in both the scope and the degree of federal power, even if that power has, on occasion, produced admirable results?

In contexts unrelated to the Fourteenth Amendment, it is rarely controversial for libertarians to promote nongovernmental, local, or decentralized remedies for unjust and discriminatory laws and practices. It is often argued that industry, commerce, and plain economics are better mechanisms than government force for reducing discriminatory behavior, whether based on race, class, sex, gender, or anything else. Yet the libertarians who sound the alarm about the Fourteenth Amendment’s governmental, federal, and centralized approach to discriminatory laws and practices are frequently treated, disingenuously and without argument, as defenders of those laws and practices rather than as principled opponents of centralized federal remedies for social harms.

Any debate about the Fourteenth Amendment must address the validity of its adoption. During Reconstruction, ratification of the Fourteenth Amendment became a precondition for the readmission of the former Confederate states into the Union. Healy has called this “ratification at the point of the bayonet” because, he says, “to end military rule, the Southern states were forced to ratify the Fourteenth Amendment.” The conditional nature of this reunification contradicts the claim that the Fourteenth Amendment was ratified by mutual compact among the states.

Federal judges treat the amendment’s purpose as irrelevant

In 1873, Justice Samuel F. Miller, joined by four other justices, held that the Fourteenth Amendment protected the privileges and immunities of national, not state, citizenship. The case involved state regulation of slaughterhouses intended to address the public health emergencies caused by animal blood seeping into the water supply. Justice Miller took the view that the Fourteenth Amendment was meant to address racial discrimination against former slaves rather than the regulation of butchers:

When the [Civil] War ended, those who had succeeded in re-establishing the authority of the federal government were not content to let this great act of emancipation rest on the actual results of the contest or on the proclamation of the Executive [the Emancipation Proclamation], both of which might be questioned in later times, and they determined to place this main and most valuable result in the Constitution of the restored Union as one of its fundamental articles.

Justice Miller’s point is that the meaning and purpose of the Fourteenth Amendment (to protect and preserve the rights of freed slaves) is cheapened when the amendment is used to justify federal intervention in the everyday economic affairs of a particular state industry. State regulation of animal slaughterhouses is not oppression of the same kind or degree as the enslavement of people on the basis of their race. To argue otherwise is to trivialize the gravity of racist ideology.

Justice Miller acknowledged that the state regulation at issue was “denounced not only as creating a monopoly and conferring odious and exclusive privileges upon a small number of persons at the expense of the great body of the community of New Orleans,” the city affected by the slaughterhouses in question, but also as a deprivation of the butchers’ right to exercise their trade. Justice Miller did not believe, however, that the federal government had the right under the Constitution to interfere with an authority that had always been conceded to state and local governments.

Having established the limited scope of the Privileges or Immunities Clause in the Slaughter-House Cases, the Supreme Court later turned to the Equal Protection Clause and the Due Process Clause to strike down laws under the Fourteenth Amendment. But the Supreme Court has not stopped at state laws: it has used the Equal Protection Clause and the Due Process Clause as a pretext for regulating private citizens and businesses. The Fourteenth Amendment, which was intended to reduce discrimination, has paradoxically been used to uphold affirmative action programs that discriminate against certain classes of people.

Ceding power to federal judges does not predispose them toward liberty. Because Section Five of the Fourteenth Amendment permits Congress to pass amendments or laws addressing state infringements of individual liberty, it is neither necessary nor constitutionally sound for the federal judiciary to assume that role. Members of Congress, unlike federal judges who enjoy life tenure, are accountable to the voters in their states and are therefore more likely to suffer for their infidelity to the Constitution.

At the conceptual level, moreover, it seems strange for libertarians to defend at home what they condemn in foreign affairs, namely, the paternalistic doctrine that a more powerful central government should use its muscle to force smaller political units into compliance.

The amendment’s legacy

Has the Fourteenth Amendment produced constructive results? In many areas, yes. Are some of the ideologies it has been directed against deplorable? In many cases, yes. Were anti-miscegenation laws, school segregation laws, and laws barring African-Americans from serving on juries bad? Yes, of course. But it does not follow that because some cases under the Fourteenth Amendment have struck down such bad laws, the amendment is necessarily or unconditionally good, especially in view of the slippery slope of precedent that over time carries rules away from their intended application. “If courts started using the Fourteenth Amendment to enforce libertarian natural rights,” Jacob Huebert warns in Libertarianism Today, “it would be only a small step for them to start using it to enforce non-libertarian positive rights.”

Intellectuals on the left such as Erwin Chemerinsky, Charles Black, Peter Edelman, and Frank Michelman have advocated the protection and enforcement of “subsistence rights” under the Fourteenth Amendment. These would include rights to government-provided food, health care, and a minimum wage. State laws that denied these rights (that failed to provide these welfare benefits) would be deemed unconstitutional; the federal executive would thereby ensure that every citizen of the offending states received health care, food, and a basic income, all subsidized by taxpayers.

I am willing to admit not only that, in practice, I would litigate under the provisions of the Fourteenth Amendment in order to represent my client competently and ethically (imagining a system in which federal power is not so entrenched is useless to litigants in a real system in which federal power is deeply entrenched), but also that, in a more ideal world, there could be other, less deleterious ways of fighting discrimination and rights violations than the Fourteenth Amendment. The workshop of daily activity does not attend to hopeful abstractions. A system cannot be undone overnight: lawyers must work with the laws available to them; they cannot invent new ones for their cases or rest on mere policy. Not if they want to succeed.

In the absence of the Fourteenth Amendment, many people and businesses with valid grievances might have no constitutional remedies. That does not mean, however, that the terms and effects of the Fourteenth Amendment are unquestionably desirable or categorically good. One can celebrate the victories achieved through the Fourteenth Amendment while recognizing that there must be a better way.

The Fourteenth Amendment is not in itself a positive good but a dangerous animal to be handled with care. Libertarians as a class display an unseemly devotion to its workings. What we need instead is an open, honest, and collegial debate about the merits and function of this amendment, lest other creatures like it loom in the future at the expense of our cherished liberties.

 

The Classical Liberalism of Ralph Waldo Emerson

In America, American History, American Literature, Arts & Letters, Austrian Economics, Books, Economics, Emerson, Essays, Ethics, Historicism, History, Humane Economy, Humanities, Liberalism, Libertarianism, Literary Theory & Criticism, Literature, Nineteenth-Century America, Philosophy, Poetry, Politics, Property, Western Philosophy on January 7, 2015 at 8:45 am


“The less government we have, the better.”[1] So declared Ralph Waldo Emerson, a man not usually treated as a classical liberal. Yet this man—the Sage of Concord—held views that cannot be described as anything but classical liberal or libertarian. His is a pastoral libertarianism that glorifies nature as a source of insight and inspiration for those with a poetical sense and a prophetic vision.

None other than Cornel West, no friend of the free market, has said that “Emerson is neither a liberal nor a conservative and certainly not a socialist or even a civic republican. Rather he is a petit bourgeois libertarian, with at times anarchist tendencies and limited yet genuine democratic sentiments.”[2] “Throughout his career,” Neal Dolan adds, “Emerson remained fully committed to the Scottish-inflected Lockean-libertarian liberalism whose influence we have traced to his earliest notebooks.”[3] An abundance of evidence supports this view. Dolan himself has written an entire book on the subject: Emerson’s Liberalism (University of Wisconsin Press, 2009). Emerson extolled the “infinitude of the private man”[4] and projected a “strong libertarian-liberal emphasis”[5] in his essays and speeches. He was not an anarchist: he believed that “[p]ersonal rights, universally the same, demand a government framed on the ratio of the census” because “property demands a government framed on the ratio of owners and of owning.”[6] Nevertheless, he opined that “[e]very actual State is corrupt”[7] and that, if the people in a given territory were wise, no government would be necessary: “[W]ith the appearance of the wise man, the State expires. The appearance of character makes the State unnecessary.”[8] One need only look to one of Emerson’s most famous essays, “Self Reliance,” for proof of his libertarianism.

“Self‑Reliance” is perhaps the most exhilarating expression of individualism ever written, premised as it is on the idea that each of us possesses a degree of genius that can be realized through confidence, intuition, and nonconformity. “To believe your own thought, to believe that what is true for you in your private heart is true for all men,” Emerson proclaims, “that is genius.”[9]

Genius, then, is a belief in the awesome power of the human mind and in its ability to divine truths that, although comprehended differently by each individual, are common to everyone. Not all genius, on this view, is necessarily or universally right, since genius is, by definition, a belief only, not a definite reality. Yet it is a belief that leads individuals to “trust thyself”[10] and thereby to realize their fullest potential and to energize their most creative faculties. Such self‑realization has a spiritual component insofar as “nothing is at last sacred but the integrity of your own mind”[11] and “no law can be sacred to me but that of my nature.”[12]

According to Emerson, genius precedes society and the State, which corrupt rather than clarify reasoning and which thwart rather than generate productivity. “Wild liberty develops iron conscience” whereas a “[w]ant of liberty […] stupefies conscience.”[13] History shows that great minds have challenged the conventions and authority of society and the State and that “great works of art have no more affecting lesson for us than this. They teach us to abide by our spontaneous impression with good‑humored inflexibility then most when the whole cry of voices is on the other side.”[14] Accordingly, we ought to refuse to “capitulate to badges and names, to large societies and dead institutions.”[15] We ought, that is, to be deliberate, nonconformist pursuers of truth rather than of mere apprehensions of truth prescribed for us by others. “Whoso would be a man,” Emerson says, “must be a nonconformist.”[16]

Self‑Interest and Conviction

For Emerson as for Ayn Rand, rational agents act morally by pursuing their self‑interests, including self‑interests in the well‑being of family, friends, and neighbors, who are known and tangible companions rather than abstract political concepts. In Emerson’s words, “The only right is what is after my constitution, the only wrong what is against it.”[17] Or: “Few and mean as my gifts may be, I actually am, and do not need for my own assurance or the assurance of my fellows any secondary testimony.”[18] It is in everyone’s best interest that each individual resides in his own truth without selling off his liberty.[19] “It is,” in other words, “easy to see that a greater self-reliance must work a revolution in all the offices and relations of men.”[20]

It is not that self‑assurance equates with rightness or that stubbornness is a virtue; it is that confidence in what one knows and believes is a condition precedent to achieving one’s goals. Failures are inevitable, as are setbacks; only by exerting one’s will may one overcome the failures and setbacks that are needed to achieve success. Because “man’s nature is a sufficient advertisement to him of the character of his fellows,”[21] self-reliance enables cooperative enterprise: “Whilst I do what is fit for me, and abstain from what is unfit, my neighbor and I shall often agree in our means, and work together for a time to one end.”[22] Counterintuitively, only in total isolation and autonomy does “all mean egotism vanish.”[23]

If, as Emerson suggests, a “man is to carry himself in the presence of all opposition, as if everything were titular and ephemeral but he,”[24] how should he treat the poor? Emerson supplies this answer:

Do not tell me, as a good man did to‑day, of my obligation to put all poor men in good situations. Are they my poor? I tell thee, thou foolish philanthropist, that I grudge the dollar, the dime, the cent, I give to such men as do not belong to me and to whom I do not belong. There is a class of persons to whom by all spiritual affinity I am bought and sold; for them I will go to prison, if need be; but your miscellaneous popular charities; the education at college of fools; the building of meeting‑houses to the vain end to which many now stand; alms to sots; and the thousandfold Relief Societies;—though I confess with shame I sometimes succumb and give the dollar, it is a wicked dollar which by and by I shall have the manhood to withhold.[25]

These lines require qualification. Emerson is not damning philanthropy or charity categorically or unconditionally; after all, he will, he says, go to prison for certain individuals with whom he shares a special relationship. “I shall endeavor to nourish my parents, to support my family, to be the chaste husband of one wife,” he elaborates.[26] Emerson is, instead, pointing out, with much exhibition, that one does not act morally simply by giving away money without conviction or to subsidize irresponsible, unsustainable, or exploitative business activities.

It is not moral to give away a little money that you do not care to part with or to fund an abstract cause when you lack knowledge of, and have no stake in, its outcome. Only when you give money to people or causes with which you are familiar,[27] and with whom or which you have something at stake, is your gift meaningful; and it is never moral to give for show or merely to please society. To give morally, you must mean to give morally—and have something to lose. The best thing one can do for the poor is to help them to empower themselves to achieve their own ends and to utilize their own skills—to put “them once more in communication with their own reason.”[28] “A man is fed,” Emerson says, “not that he may be fed, but that he may work.”[29] Emerson’s work ethic does not demean the poor; it builds up the poor. It is good and right to enable a poor man to overcome his conditions and to elevate his station in life, but there is no point in trying to establish absolute equality among people, for only the “foolish […] suppose every man is as every other man.”[30] The wise man, by contrast, “shows his wisdom in separation, in gradation, and his scale of creatures and of merits as wide as nature.”[31] Such separation and gradation are elements of the beautiful variety and complexity of the natural, phenomenal world in which man pursues his aims and accomplishes what he wills.

Dissent

Emerson famously remarks that a “foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.”[32] Much ink has been spilled to explain (or explain away) these lines. I take them to mean, in context, that although servile flattery and showy sycophancy may gain a person recognition and popularity, they will not make that person moral or great but, instead, weak and dependent. There is no goodness or greatness in a consistency imposed from the outside and against one’s better judgment; many ideas and practices have been consistently bad and made worse by their very consistency. “With consistency,” therefore, as Emerson warns, “a great soul has simply nothing to do.”[33]

Ludwig von Mises seems to have adopted the animating, affirming individualism of Emerson, and even, perhaps, Emerson’s dictum of nonconformity. Troping Emerson, Mises remarks that “literature is not conformism, but dissent.”[34] “Those authors,” he adds, “who merely repeat what everybody approves and wants to hear are of no importance. What counts alone is the innovator, the dissenter, the harbinger of things unheard of, the man who rejects the traditional standards and aims at substituting new values and ideas for old ones.”[35] This man does not mindlessly stand for society and the State and their compulsive institutions; he is “by necessity anti‑authoritarian and anti‑governmental, irreconcilably opposed to the immense majority of his contemporaries. He is precisely the author whose books the greater part of the public does not buy.”[36] He is, in short, an Emersonian, as Mises himself was.

The Marketplace of Ideas

To be truly Emersonian you may not accept the endorsements and propositions here as unconditional truth, but must, instead, read Emerson and Mises and Rand for yourself to see whether their individualism is alike in its affirmation of human agency resulting from inspirational nonconformity. If you do so with an inquiring seriousness, while trusting the integrity of your own impressions, you will, I suspect, arrive at the same conclusion I have reached.

There is an understandable and powerful tendency among libertarians to consider themselves part of a unit, a movement, a party, or a coalition, and of course it is fine and necessary to celebrate the ways in which economic freedom facilitates cooperation and harmony among groups or communities; nevertheless, there is also a danger in shutting down debate and in eliminating competition among different ideas, which is to say, a danger in groupthink or compromise, in treating the market as an undifferentiated mass divorced from the innumerable transactions of voluntarily acting agents. There is, too, the tendency to become what Emerson called a “retained attorney”[37] who is able to recite talking points and to argue the predictable “airs of the bench”[38] without engaging the opposition in a meaningful debate.

Emerson teaches not only to follow your convictions but to engage and interact with others lest your convictions be kept to yourself and deprived of any utility. It is the free play of competing ideas that filters the good from the bad; your ideas aren’t worth a lick until you’ve submitted them to the test of the marketplace.

“It is easy in the world,” Emerson reminds us, “to live after the world’s opinion; it is easy in solitude to live after our own; but the great man is he who in the midst of the crowd keeps with perfect sweetness the independence of solitude.”[39] We can stand together only by first standing alone. Thus, “[w]e must go alone.”[40] You must “[i]nsist on yourself”[41] and “[s]peak the truth.”[42] You must channel your knowledge and originality to enable others to empower themselves. All collectives are made up of constituent parts; the unit benefits from the aggregate constructive action of motivated individuals. Emerson teaches us that if we all, each one of us, endeavor to excel at our favorite preoccupations and to expand the reach of our talent and industry, we will better the lives of those around us and pass along our prosperity to our posterity.

[1] Ralph Waldo Emerson, “Politics,” in Emerson: Essays & Poems (The Library of America, 1996), p. 567.

[2] Cornel West, The American Evasion of Philosophy (University of Wisconsin Press, 1989), p. 40.

[3] Neal Dolan, “Property in Being,” in A Political Companion to Ralph Waldo Emerson, edited by Alan M. Levine and Daniel S. Malachuk (The University Press of Kentucky, 2011), p. 371.

[4] Ralph Waldo Emerson, correspondence in The Journals and Miscellaneous Notebooks of Ralph Waldo Emerson, 16 vols., ed. William H. Gilman, Ralph H. Orth, et al. (Cambridge: Harvard University Press, 1960-1982). This quote comes from Vol. 7, p. 342.

[5] Neal Dolan, Emerson’s Liberalism (University of Wisconsin Press, 2009), p. 182.

[6] Emerson, “Politics,” at 560.

[7] Emerson, “Politics,” at 563.

[8] Emerson, “Politics,” at 568.

[9] Ralph Waldo Emerson, “Self-Reliance,” in Emerson: Essays & Poems (The Library of America, 1996), p. 259.

[10] Emerson, “Self-Reliance,” at 260.

[11] Emerson, “Self-Reliance,” at 261.

[12] Emerson, “Self-Reliance,” at 262.

[13] Emerson, “Politics,” at 565-566.

[14] Emerson, “Self-Reliance,” at 259.

[15] Emerson, “Self-Reliance,” at 262.

[16] Emerson, “Self-Reliance,” at 261.

[17] Emerson, “Self-Reliance,” at 262.

[18] Emerson, “Self-Reliance,” at 263.

[19] Emerson, “Self-Reliance,” at 274.

[20] Emerson, “Self-Reliance,” at 275.

[21] Emerson, “Politics,” at 566.

[22] Emerson, “Politics,” at 567.

[23] Emerson, “Nature,” in Emerson: Essays and Poems, p. 10. The original reads “all mean egotism vanishes” rather than “vanish.”

[24] Emerson, “Self-Reliance,” at 262.

[25] Emerson, “Self-Reliance,” at 262-63.

[26] Emerson, “Self-Reliance,” at 273.

[27] “Consider whether you have satisfied your relations to father, mother, cousin, neighbor, town, cat, and dog,” Emerson says. Emerson, “Self-Reliance,” at 274.

[28] Emerson, “Self-Reliance,” at 276.

[29] Emerson, “Nature,” at 13.

[30] Emerson, “Nature,” at 27.

[31] Emerson, “Nature,” at 27.

[32] Emerson, “Self-Reliance,” at 265.

[33] Emerson, “Self-Reliance,” at 265.

[34] Ludwig von Mises, The Anti-Capitalistic Mentality (Auburn: The Ludwig von Mises Institute, 2008), p. 51.

[35] Mises, The Anti-Capitalistic Mentality, at 51.

[36] Mises, The Anti-Capitalistic Mentality, at 51.

[37] Emerson, “Self-Reliance,” at 264.

[38] Emerson, “Self-Reliance,” at 264.

[39] Emerson, “Self-Reliance,” at 263.

[40] Emerson, “Self-Reliance,” at 272.

[41] Emerson, “Self-Reliance,” at 278.

[42] Ralph Waldo Emerson, “The Divinity School Address,” in Emerson: Essays & Poems (The Library of America, 1996), p. 77.

Interview with Robert J. Ernst, author of “The Inside War”

In America, American History, American Literature, Arts & Letters, Books, Creative Writing, Fiction, History, Humanities, Nineteenth-Century America, Novels, Southern History, Southern Literary Review, Southern Literature, The Novel, The South, Writing on December 4, 2014 at 8:45 am

This interview originally appeared here in Southern Literary Review.


APM: Thanks for taking the time to sit down for this interview, Bob. Your novel The Inside War is about an Appalachian mountain family during the Civil War. How long have you been interested in the Civil War?

RJE: I have had an interest in the Civil War for many years. Specifically, the effect of the war on Appalachia became an interest as I researched family history, now more than a decade ago. I realized that not much had been written, outside of academic treatises, on this aspect of the war. Bushwhacking ambushes, bands of roving deserters, intensely opposed partisan factions, and a breakdown in civil society befell western North Carolina. Of course, much study had been given to the poverty of the area during the twentieth century, but not much, save bluegrass music, about its culture. What I discovered was a vibrant pre-war society thoroughly rent by the war. And, the area did not recover.

APM: The story of Will Roberts, your novel’s protagonist, is similar to that of many actual soldiers who fought for the Confederacy. How much historical research went into this book? It seems as if there are a number of events in your story—Sammy Palmer’s shooting of the sheriff, for instance—that track historical occurrences.

RJE: Much of the story is based on historical events. In fact, Will Roberts was a real person, as was his brother, Edwin. I traced their wartime adventures, researched the battles and conditions of their captivity and wove a fictional story around them. Likewise their wives, as portrayed in the story, were based on real people, although their story is more fictionalized. The novel does incorporate many historical characters and events that occurred in the vicinity of Marshall, North Carolina, by which I attempt to portray a picture of the character of the area and the severe impact of the war on it.

APM: There are some themes in the book that cover an aspect of the Civil War that is not often covered. Tell us about those.

RJE: The tactic of bushwhacking, or ambushing mountain patrols, is one. Guerrilla warfare as a matter of accepted tactics was new and was a terrifying degradation of the morality of warfare. There was a real cultural divide among the citizens of western North Carolina between those who supported the North, the “tories,” and those who supported the Confederacy. These divisions played out in many ways, most notably in atrocities like the Shelton Laurel massacre, but more subtly in familial and neighbor relationships. I doubt many women suffered as did those in Appalachia, from the depredations, theft and physical threat of the men who populated the mountains during the war. I was surprised to learn of the inhumane prison conditions at Ft. Delaware. Everyone knows of Andersonville, but not many are aware of Ft. Delaware. We know of the great Civil War battles, but there were scores of skirmishes every week that terrified the contestants and shaped their perceptions. Certainly, Roberts’s family suffered greatly, even though their war happened in the background to better known events.

APM: You seem careful not to glorify war but to present it as the complex tragedy that it is. The book’s epigraph states, “For those who have suffered war.” I wonder if the process of writing this book taught you anything about war itself. What do you think?


RJE: The grand histories of the conflicts, eulogizing the fallen and celebrating the victorious, are all necessary parts of our remembrance of a terrible, national conflict. What I found in researching this story was intense personal suffering, unnoted except at the basic unit of society, the family, and rippling out to the church, neighborhood and town. Why would a woman abandon her children? What would drive a member of the home guard to massacre captives – mere boys? How could people, so crushed, hope? And, of course, the main theme of The Inside War is hope; hope after, and despite, the loss and suffering. As we deal with the veterans of the conflict with radical Islamists, we need to surround them with a culture of hope.

APM: From one attorney to another, do you think being a lawyer affects your writing in any way—from the preparation to the organization to the style?

RJE: That’s interesting. Certainly the actual practice of law involves clear writing. I have a hard time reading novels written in stream of consciousness or in rambling, shuffling styles. So, hopefully this book will be understandable and clear to the reader. I like the process of legal research and enjoyed the process of researching this book. However, the characters, though based on historical figures, came about from my imagination, which is why the book is a novel and not a history.

APM: It’s been said that the Revolutionary War produced political philosophy in America whereas the Civil War produced literature. Do you agree with this, and if so, why?

RJE: Perhaps the truth in that statement devolves from the Revolutionary War defining the creation of a nation, the Civil War defining its character. The revolution tested the theories of individual liberty and melded them, free of sovereign control, imperfectly into a new nation. The Civil War represents a gigantic challenge to the notion that a nation of citizens can be free. Millions were intimately involved in the latter conflict, and the upheaval and changes were intensely felt and recorded in innumerable books. But the fundamental story of both wars is ongoing, in my view, and that is that America must re-experience “a new birth of freedom” with regularity if she is to retain her vibrancy and hope.

APM: Thanks, Bob, for taking the time. I appreciate it, and I know our readers do, too.

Causation and Criminal Law

In America, Criminal Law, Humanities, Jurisprudence, Justice, Law, Philosophy on October 29, 2014 at 8:45 am


Actus reus, which is shorthand for the opening words in the Latin phrase actus non facit reum nisi mens sit rea (“an act does not make a person guilty unless his mind is also guilty”), is one element of a crime that a prosecutor must prove to establish criminal liability. A prosecutor must prove, in particular, that the defendant’s actus reus caused the harmful result at issue in the case. To do so, the prosecutor must show not only that the act was the “actual cause” of the harm (i.e., the “factual cause” or the “but for” cause) but also that the act was the “proximate cause” of the harm (i.e., the “legal cause”).

The so-called “but for” test, also known as the sine qua non test, asks whether a particular act brought about the particular harm to the alleged victim. If the harm would not have happened but for the defendant’s act, actual causation is established; conversely, if the harm would have happened notwithstanding the defendant’s act, then the act is not a “cause in fact.” Proximate causation, by contrast, turns on foreseeability: if the harm was not a foreseeable result of the defendant’s act, then the defendant may have been the actual cause of the harm but cannot be said to be its proximate, or legal, cause.

Determining causation is difficult when two people perform different acts, each of which would have been sufficient by itself to bring about the harm at the time the harm occurred. The two acts constitute concurrent sufficient causes. Because the harm would have occurred even without either actor’s conduct (the other act being sufficient on its own), a literal application of the “but for” test absolves both actors and thus fails to establish causation against either.

There are two tests that courts may apply when there are multiple sufficient causes under the facts. The first is the substantial factor test, according to which a defendant is criminally liable if his acts are shown to be a substantial factor in bringing about the harm to the alleged victim. This test is not commonly used because it can be arbitrary and subjective. The better test is a modified form of the “but for” test, formulated this way: but for the defendant’s voluntary act, the harm would not have occurred when it did and as it did. Even this revised test falls short of ideal. For instance, it is not clear how the test applies when two non-lethal acts combine to cause the death of one victim.
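For readers who like to see the structure of these tests laid out schematically, here is a minimal sketch, in Python, of the decision logic described above. It is only an illustration under simplifying assumptions, not a statement of doctrine; the inputs harm_without_act, foreseeable, and substantial_factor are hypothetical findings of fact that a judge or jury would have to resolve in a real case.

```python
# A toy sketch of the causation analysis described above -- an illustration only,
# not legal doctrine. The boolean inputs are hypothetical findings of fact.

def actual_cause(harm_without_act: bool) -> bool:
    """'But for' (sine qua non) test: the act is a cause in fact only if
    the harm would NOT have happened without it."""
    return not harm_without_act

def proximate_cause(cause_in_fact: bool, foreseeable: bool) -> bool:
    """Legal cause: a cause in fact is the proximate cause only if the
    harm was a foreseeable result of the act."""
    return cause_in_fact and foreseeable

def causation_element(harm_without_act: bool, foreseeable: bool,
                      substantial_factor: bool = False) -> bool:
    """Overall causation element: where concurrent sufficient causes make the
    plain 'but for' test fail, fall back on the substantial factor test."""
    cause_in_fact = actual_cause(harm_without_act) or substantial_factor
    return proximate_cause(cause_in_fact, foreseeable)

# Two independently sufficient acts: the plain 'but for' test fails for each actor
# (the harm happens anyway), but the substantial factor test can still
# establish causation against both.
print(causation_element(harm_without_act=True, foreseeable=True,
                        substantial_factor=True))  # True
```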

Regardless of which tests for causation obtain or prevail in a particular case, a prosecutor must establish each element of a crime beyond a reasonable doubt. That standard, at least, is a legal certainty.

Free Not to Vote

In America, Arts & Letters, Austrian Economics, Libertarianism, News and Current Events, Politics on October 22, 2014 at 8:45 am


This piece first appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

The 2014 U.S. midterm elections are coming up, and I don’t intend to vote. A vote is like virginity: you don’t give it away to the first flower-bearing suitor. I haven’t been given a good reason, let alone flowers, to vote for any candidate, so I will stay home, as well I should.

This month, my wife, a Brazilian citizen, drove from Auburn, Alabama, to Atlanta, Georgia, on a Sunday morning to cast her vote for the presidential election in Brazil. She arrived at the Brazilian consulate and waited in a long line of expatriates only to be faced with a cruel choice: vote for the incumbent socialist Dilma Rousseff of the Workers’ Party, for the socialist Aécio Neves of the Brazilian Social Democracy Party who is billed as a center-right politician, for the environmentalist socialist Marina Silva of the Socialist Party, or for any of the other socialist candidates who were polling so low that they had no chance of victory. Brazil maintains a system of compulsory voting in addition to other compulsory schemes such as conscription for all males aged 18.

Logan Albright recently wrote about the folly of compulsory voting, support for which is apparently growing in Canada. He criticized the hypocrisy of an allegedly democratic society mandating a vote and then fining or jailing those who do not follow the mandate. He also pointed out the dangers of forcing uneducated and uninformed citizens to vote against their will. This problem is particularly acute in Brazil, where illiterate candidates have exploited election laws to run absurd commercials and to assume the personas of silly characters such as a clown, Wonder Woman, Rambo, Crazy Dick, and Hamburger Face, each of whom is worth googling for a chuckle. The incumbent clown, by the way, was just reelected on the campaign slogan “it can’t get any worse.” Multiple Barack Obamas and Osama bin Ladens were also running for office, as was, apparently, Jesus. The ballot in Brazil has become goofier than a middle-school election for class president.

Even in the United States, as the election of Barack Obama demonstrates, voting has become more about identity politics, fads, and personalities than about principle or platform. Just over a decade ago, Arnold Schwarzenegger became the Governor of California amid a field of second-rate celebrities while a former professional wrestler (the fake and not the Olympic kind of wrestling), Jesse “the Body” Ventura, was winding up his term as the Governor of Minnesota. Today, comedian Al Franken holds a seat in the United States Senate. It turns out that Brazil isn’t the only country that can boast having a clown in office.

No serious thinker believes that a Republican or Democratic politician has what it takes to boost the economy, facilitate peace, or generate liberty. The very function of a career politician is antithetical to market freedom; no foolish professional vote-getter ought to have the power he or she enjoys under the current managerial state system, but voting legitimates that power.

It is often said, “If you don’t vote, you can’t complain.” The counterpoint is that voting ensures your complicity with the policies that elected politicians will enact. If you don’t vote, you lack complicity. You are not morally blameworthy for resisting the system that infringes basic rights or that offends your sense of justice and reason. You have not bestowed credibility on the government with your formal participation in its most sacred ritual. The higher the number of voters who participate in an election, the more legitimacy there is for the favored projects of the elected politicians, and the more likely those politicians are to impose their will on the populace by way of legislation or other legal means.

Refusing to vote can send a message: get your act together or we won’t turn out at the polling stations. Low voter turnout undermines the validity of the entire political system. Abstention also demonstrates your power: just watch how the politicians grovel and scramble for your vote, promise you more than they can deliver, beg for your support. This is how it ought to be: politicians need to work for your vote and to earn it. They need to prove that they are who they purport to be and that they stand for what they purport to stand for. If they can’t do this, they don’t deserve your vote.

Abstention is not apathy; it is the exercise of free expression, a voluntary act of legitimate and peaceful defiance, the realization of a right.

There are reasonable alternatives to absolute abstention: one is to vote for the rare candidate who does, in fact, seek out liberty, true liberty; another is to cast a protest vote for a candidate outside the mainstream. Regardless, your vote is a representation of your person, the indicia of your moral and ethical beliefs. It should not be dispensed with lightly.

If you have the freedom not to vote, congratulations: you still live in a society with a modicum of liberty. Your decision to exercise your liberty is yours alone. Choose wisely.

Review of “Cheating Lessons,” by James M. Lang

In Academia, America, Arts & Letters, Book Reviews, Books, Humanities, Pedagogy, Teaching on September 24, 2014 at 8:45 am


This review originally appeared in Academic Questions (2014).

A few years ago, when I was teaching composition courses at Auburn University, I had a freshman from Harlem in my class. He had traveled from New York to Alabama to accept a scholarship and become the first person in his family to attend college. He was kind and thoughtful, and I liked him very much, but he was woefully unprepared for higher education; he had trouble comprehending more than a few paragraphs and could not write basic sentences. The university, however, was proud of this recruit, who contributed both geographic and racial diversity to the otherwise (relatively) non-diverse student body.

Encouraged by his tenacity, I met with this student regularly to teach him sentence structure and to help him turn his spoken words into written sentences. Although he improved by degrees over the course of the semester, he was never able to write a complete coherent paragraph.

During the last weeks of class, I informed him that he needed to earn at least a C+ on his final paper to avoid repeating the course. He was conspicuously absent from class whenever preliminary drafts were due, and he never responded to my prodding emails. Shortly before the due date, he materialized in my office and presented a piece of paper that contained several sentences. He asked me questions and attempted to record my responses on his paper. I reminded him that although I was happy to offer guidance, he needed to submit original work. He nodded and left my office. When, at last, he submitted his final paper, it consisted of roughly four intelligible paragraphs that regrettably had nothing to do with the assignment. I inserted these paragraphs into a Google search and discovered that they were lifted, verbatim, from a Wikipedia article unrelated to the assignment. I failed the student but showed him mercy—and spared the university embarrassment—by not reporting him to the administration for disciplinary action.

To this day I wonder if there was something I could have done differently to prevent this student from plagiarizing, or whether his cheating was the inevitable consequence of being unprepared for university study. Many teachers have similar stories.

Academic dishonesty, a topic now admirably undertaken by James M. Lang, has received more scholarly treatment than I was aware of before reading Cheating Lessons: Learning from Academic Dishonesty. Like many of us, Lang grew interested in the subject because of his experiences with students who cheated in his classes. The more research he did on academic dishonesty, the more frustrated he became with “the same basic prescriptions” that were either quixotic or impracticable for one faculty member to undertake alone. One day, Lang realized that if he “looked through the lens of cognitive theory and tried to understand cheating as an inappropriate response to a learning environment that wasn’t working for the student,” he could “empower individual faculty members to respond more effectively to academic dishonesty by modifying the learning environments they constructed.”

Lang’s goal is not to score points or court confrontation, but simply to help teachers and administrators to reduce cheating by restructuring the content and configuration of their courses and classrooms.

Lang divides Cheating Lessons into three parts. The first is a synthesis of the existing scholarly literature on academic dishonesty that concludes with four case studies, about which little needs to be said here. The second part consists of practical guidance to teachers who wish to structure their classrooms to minimize cheating and to cultivate the exchange of ideas. And the third, which is an extension of the second, considers speculations about potential changes to curricula and pedagogy to promote academic integrity not just in the classroom, but across campus.

Most original are parts two and three, which are premised on the structuralist assumption that systems shape and inform the production of knowledge. The treatment of academic dishonesty as a symptom of deterministic models and paradigms makes this book unique. If the models and paradigms can be changed, Lang’s argument runs, then academic dishonesty might decline: the shift needs to be away from the “dispositional factors that influence cheating—such as the student’s gender, or membership in a fraternity or sorority, and so on”—toward “contextual factors,” the most significant of which is “the classroom environment in which students engage in a cheating behavior” (emphases in original). What’s exciting about the structuralist paradigm—if it’s accurate—is that teachers and administrators have the power and agency to facilitate constructive change.

But what if the structuralist paradigm isn’t correct? What if dispositional factors are more determinative than contextual factors in generating academic dishonesty? Lang’s argument depends upon a profound assumption that he expects his readers to share. It’s most likely that dispositional and contextual factors are interactive, not mutually exclusive: consider the student who is not as intelligent as his peers and who resorts to cheating because of his insecurity and the pressure on him to succeed. Lang is onto something, though: students are less likely to learn in an environment that compels them “to complete a difficult task with the promise of an extrinsic reward or the threat of punishment” than they are in an environment that inspires them “with appeals to the intrinsic joy or beauty or utility of the task itself” (emphasis in original). In other words, “in an environment characterized by extrinsic motivation, the learners or competitors care about what happens after the performance rather than relishing or enjoying the performance itself” (emphasis in original).

How does Lang propose that teachers and administrators structure their courses and curricula to foster what he calls “intrinsic motivation” (as against “extrinsic rewards”) among students? For starters, he urges professors to help students learn for mastery and not for grades, to lower the stakes per assignment by multiplying the options for students to earn points or credit, and to instill self-efficacy by challenging students and by affording them increased opportunities to demonstrate their knowledge. In the abstract, these suggestions seem obvious and unhelpful, so Lang backs them up with interviews with accomplished teachers as well as anecdotes about successful classroom experiments: the improvising by Andy Kaufman as he taught Russian literature to prison inmates, for instance, or the unique grading system implemented by John Boyer at Virginia Tech. All the tactics and approaches discussed and promoted by Lang can be traced back to the premise that “the best means we have to reduce cheating is to increase motivation and learning.”

Teachers and administrators are forever trying to motivate their students to learn. It’s easier to conceive of this goal, however, than to achieve it. Teachers everywhere seek to inspire their students to love and pursue knowledge, and despite a plethora of opinions about how best to do so, no general consensus has arisen to establish a definitive course of action for all students and disciplines. Many teachers chose their profession and discipline because they relished their own education and wanted to pass on their knowledge and love of learning to others. Lang’s insistence that teachers inspire a passion for learning is hardly novel; rather, it is the touchstone and stands in contradistinction to the utilitarian, standardized, test-centered, and results-oriented educational strategies that politicians, bureaucrats, and policy wonks now sponsor and defend. In this respect, Cheating Lessons is a refreshing alternative; it’s written by an educator for educators and not, thank goodness, for semiliterate politicians and their sycophantic advisers.

One thing this book is not: a template or checklist that you can follow to construct your own productive learning environment for students. Each learning environment is contextual; one model will not suit every setting and purpose. Because Lang cannot and does not provide step-by-step how-to instructions, Cheating Lessons borders on the self-help genre and is more inspirational and aspirational than it is informational. And Lang’s meandering style—for example, his digressions about Robert Burns and coaching youth sports teams—is disarming enough not only to charm but also to contribute to the impression that Cheating Lessons is “light” reading.

Lang can overdo the playfulness and make exaggerated claims. Early on he quotes a Harvard administrator complaining in 1928 about the problem of cheating among students, an example that’s meant to refute the assumption that “we are in the midst of a cheating epidemic, and that the problem is much worse now than it was in the idyllic past.” Lang adds that he hopes to convince us that “cheating and higher education in America have enjoyed a long and robust history together.” But it’s not as if 1928 is ancient history. Data about academic dishonesty since that time will not convince most readers that there were as many cheating students in the one-room schoolhouses of the nineteenth century, when fewer people had access to formal education, as there are today. Perhaps anticipating such criticism, Lang invites us to “hop in our time machine and leap across centuries” to consider the cheating cultures of the ancient Greeks and of Imperial China “over the course of [a] fourteen-hundred-year history.” But surely the substantial data we have gathered on the twentieth- and twenty-first-century academy cannot be compared to the limited and circumstantial data garnered about these early cultures; surely “illicit communication” by “cell phones” is not comparable to the use of cheat sheets in nineteenth-century China. It seems preposterous to suggest that academic dishonesty in contemporary America exists to the same extent it did centuries ago on different continents and among different peoples with different principles and priorities.

Nevertheless, even readers skeptical of Lang’s structuralist premise and apparent optimism will find much in Cheating Lessons to contemplate and to amuse. Unfortunately, however, even after having read the book I’m still not sure what I could have done differently to prevent my student from cheating.

 

 

 

The Immunity Community

In America, American History, Arts & Letters, Britain, History, Humanities, Jurisprudence, Justice, Law, Libertarianism, Philosophy on September 10, 2014 at 8:45 am


This piece first appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

The doctrine of sovereign immunity derives from the English notion that “the king can do no wrong” and hence cannot be sued without his consent. The purpose of this doctrine was, in England, from at least the Middle Ages until the eighteenth century, to bar certain lawsuits against the monarch and his or her ministers and servants. With the rise of the English Parliament after the death of Elizabeth I, government officers and politicians sought to gain the power of immunity that the monarch and his or her agents had enjoyed.

In practice, however, English subjects were not totally deprived of remedies against the monarch or the government. The doctrine of sovereign immunity was not an absolute prohibition on actions against the crown or against other branches of government;[1] subjects could avail themselves of petitions of right or writs of mandamus, for instance, and monarchs fearful of losing the support of the people would often consent to be sued.

It was not until the monarchy had been demonstrably weakened that the doctrine of sovereign immunity began to be espoused with added urgency and enforced with added zeal. In the late eighteenth century, Sir William Blackstone intoned in his Commentaries on the Laws of England that the king “is not only incapable of doing wrong, but even of thinking wrong: he can never mean to do an improper thing: in him is no folly or weakness.” These lines convert sovereign immunity into sovereign infallibility, a more ominous yet more dubious pretension.

Once the monarchy had been abolished altogether, the idea that the sovereign had to consent to be sued no longer held credence. As Louis L. Jaffe explains, “Because the King had been abolished, the courts concluded that where in the past the procedure had been by petition of right there was now no one authorized to consent to suit! If there was any successor to the King qua sovereign it was the legislature,” which, having many members subject to differing constituencies, was not as accountable as the monarch had been to the parties seeking to sue.[2]

The principle of sovereign immunity carried over from England to the United States, where most states have enshrined in their constitution an absolute bar against suing the State or its agencies and officers whose actions fall within the scope of official duties. The Eleventh Amendment to the U.S. Constitution likewise states that “the Judicial power of the United States shall not be construed to extend to any suit in law or equity, commenced or prosecuted against one of the United States by Citizens of another State, or by Citizens or Subjects of any Foreign State.” This provision, which applies only in federal courts and which does not on its face prohibit a lawsuit against a state by a citizen of that same state, was adopted in response to the ruling in Chisholm v. Georgia (1793), a case that held sovereign immunity to have been abrogated and that vested in federal courts the authority to preside over disputes between private citizens and state governments.

Notwithstanding the complex issues of federalism at play in the Chisholm decision and in the Eleventh Amendment, the fact remains that the doctrine of sovereign immunity has been applied with widening scope and frequency since the states ratified the Eleventh Amendment in 1795. The U.S. Supreme Court has contributed to the doctrine’s flourishing. “The Supreme Court’s acceptance of sovereign immunity as constitutional principle,” explains one commentator, “depends on its determination of the intent of the Framers, which ignores a great deal of historical evidence from the time of the founding and relies primarily on a discredited account of the Eleventh Amendment first articulated in the 1890 case of Hans v. Louisiana.”[3]

State and federal courts have now built an impregnable wall of immunity around certain state and federal officers. The sovereign immunity that is enshrined in state constitutions is, in theory, not absolute because it is conferred only to certain agents and officers and does not prohibit lawsuits to enjoin such agents and officers from performing unconstitutional or other bad acts. In practice, however, the growth of qualified immunities, which is in keeping with the growth of government itself, has caused more and more agents of the State to cloak themselves in immunity.

Bus drivers, teachers, coroners, constables, high school coaches, doctors and nurses at university hospitals, security guards, justices of the peace, government attorneys, legislators, mayors, boards of education and health, university administrators, Indian reservations, prison guards and wardens, police officers and detectives, janitors in government facilities, licensing boards, tax assessors, librarians, railroad workers, government engineers, judges and justices, school superintendents and principals, towing companies, health inspectors, probation officers, game wardens, museum docents and curators, social workers, court clerks, dog catchers, contractors for public utilities, public notaries, tollbooth attendants, airport traffic controllers, park rangers, ambulance drivers, firefighters, telephone operators, subway workers, city council members, state auditors, agricultural commissioners—all have sought to establish for themselves, with mixed degrees of success, the legal invincibility that comes with being an arm of the state.

Yet the idea that “the king can do no wrong” makes no sense in a governmental system that has lacked a king from its inception. Its application as law has left ordinary citizens with limited recourse against governments (or against people claiming governmental status for the purpose of immunity) that have committed actual wrongs. When the government, even at the state level, consists of vast bureaucracies of the kind that exist today, the doctrine of sovereign immunity becomes absurd. If it is true that in nine states and in the District of Columbia the government employs more than 20% of all workers, imagine how many people are eligible to claim immunity from liability for their tortious conduct and bad acts committed on the job.

Local news reports are full of stories about government employees invoking the doctrine of sovereign immunity; few such stories find their way into the national media. Judge Wade McCree of Michigan, for instance, recently carried out an affair with a woman who was a party in a child-support case on his docket, having sexual intercourse with her in his chambers and “sexting” her even on the day she appeared as a witness in his courtroom. Although McCree was removed from office, he was immune from civil liability. An airport in Charleston, West Virginia, is invoking the doctrine of immunity to shield itself from claims that it contributed to a chemical spill that contaminated the water supply. Officer Darren Wilson may be entitled to immunity for the shooting of Michael Brown, depending on how the facts unfold in that investigation.

The U.S. Supreme Court once famously declared that the doctrine of sovereign immunity “has never been discussed or the reasons for it given, but it has always been treated as an established doctrine.”[4] A disestablishment is now in order. The size and scope of government are simply too massive at the state and national levels to sustain a doctrine that undermines the widely held belief of the American Founders that State power must be limited and that the State itself must be held accountable for its wrongs. Friedrich Hayek pointed out that the ideal of the rule of law requires the government to “act under the same law” and to “be limited in the same manner as any private person.”[5] The doctrine of sovereign immunity stands in contradistinction to this ideal: it places an increasing number of individuals above the law.

If the law is to be meaningful and just, it must apply equally to all persons and must bind those who enforce it. It must not recognize and condone privileges bestowed upon those with government connections or incentivize bad behavior within government ranks. Sovereign immunity is a problem that will only worsen if it is not addressed soon. The king can do wrong, and so can modern governments. It’s time for these governments to be held accountable for the harms they produce and to stop hiding behind a fiction that was long ago discredited.

________

[1] See generally Louis L. Jaffe, “Suits Against Governments and Officers: Sovereign Immunity,” 77 Harvard Law Review 1 (1963).

[2] Jaffe at 2.

[3] Susan Randall, “Sovereign Immunity and the Uses of History,” 81 Nebraska Law Review 1, 4 (2002-03).

[4] United States v. Lee, 106 U.S. 196, 207 (1882).

[5] F. A. Hayek, The Constitution of Liberty, Vol. 17 of The Collected Works of F. A. Hayek, ed. Ronald Hamowy (Routledge, 2011), p. 318.

The Lawyers’ Guild

In America, American History, History, Law, Legal Education & Pedagogy, Nineteenth-Century America on August 27, 2014 at 8:45 am


This piece originally appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

Last month, thousands of recent law school graduates sat for a bar examination in their chosen state of practice. They were not undertaking a harmless rite of passage but overcoming a malicious obstacle: an artificial barrier to entry in the form of occupational licensure.

Barriers to entry are restrictions on access to, or participation in, markets or vocations. Occupational licensure is a type of barrier to entry that regulates professions by requiring certification and licensing in the manner of medieval guilds. Medicine and law are perhaps the most recognizable professions to require their practitioners to obtain and maintain licenses.

The purpose of occupational licensure is to reduce competition by using government power to restrict membership eligibility in a profession. The criteria for membership are often prohibitively expensive for low-income earners. To be admitted to the bar in nearly every state in the United States, you must not only pass a bar examination but also earn a law degree from an accredited law school, admission to which requires a bachelor’s degree from an accredited university.

The average student-loan debt for graduates of American colleges is around $29,400. The average student-loan debt for graduates of American law schools is between $75,700 and $125,000, depending on whether the school is public or private. The American Bar Association imposes heavy burdens on law schools, such as inefficient accreditation standards that drive up costs; over time, the high price of legal education is passed on to the public in the form of attorneys’ fees and costs. Having already saddled themselves with student-loan debt, recent law-school graduates pay thousands of dollars for bar-preparation courses to study for an examination that, if passed, will open the door to a job market that is the worst in recent memory. Nobody struggling financially should attempt to leap over each of these expensive hurdles.

Before the rise of bar examinations and professional licensure during the Progressive Era in the United States, aspiring attorneys simply “read law” as apprentices for practicing attorneys or as clerks for local law firms. Once they achieved a certain level of competence, apprentices were released from their tutelage and became eligible to accept clients. Those jurisdictions that did require examinations allowed judges to conduct informal interviews with candidates to determine the candidates’ moral and intellectual fitness for practice. Such examinations were typically mere formalities: few candidates failed, and few careers hung on the outcome. Newly admitted attorneys had to demonstrate their excellence in order to gain clients. They launched their careers by charging low fees that even the poorest in society could pay. Attorneys who did not prove fit for practice never gained enough clients to sustain their business and were forced to turn to other professions.

In the late nineteenth and early twentieth centuries, energetic and entrepreneurial members of the middle and lower classes in cities such as New York and Chicago began to threaten the legal establishment, which had until then been composed of a mostly wealthy and elite fraternity. This fraternity simply could not compete with low-cost providers of legal services because, for example, the most elite attorneys considered it unseemly and degrading to advertise their services or to offer contingency fees. Bar associations that were once voluntary organizations of upper-class professionals therefore began to use their political clout and government connections to obtain powers conferred by legislatures. They wanted to keep the lower classes out of their profession and to preserve a highbrow reputation for lawyers. They began to exercise monopolistic control over the practice of law within their respective jurisdictions. Today they constitute authorized arms of the State.

In most jurisdictions, bar associations determine who may be admitted as members and who must be excluded, whether and to what extent lawyers may advertise their services, what constitutes the “authorized” practice of law, whether a law firm must have a physical office with a non-residential mailing address, and under what conditions contingency fees are permissible. These anti-competitive practices hit the communities most in need the hardest by raising the cost of legal services beyond the ordinary person’s ability to pay.

The bar examination is the most hyped precondition for membership in a state bar association. Like hazing, it is more ritual than training; it does not help one learn to be an attorney or indicate any requisite skills for practice. It tests how well someone can memorize arcane and esoteric rules and their trivial exceptions, many of which have no bearing on actual practice. Few if any lawyers spend their days memorizing rules for courts or clients, and no one who intends to practice, say, corporate law in a big city needs to memorize obscure criminal law rules that were long ago superseded by statute.

Despite reciprocity among some states, the bar examination restricts the free flow of qualified attorneys across state lines, forcing even the best attorneys to limit their services to certain jurisdictions. The bar examination also creates racial disparities among practicing attorneys as minority passage rates tend to be lower, a fact that flies in the face of nearly every bar association’s purported commitment to diversity.

Keeping the number of lawyers low ensures that lawyers may charge higher fees. Keeping the barriers to entry high ensures that the number of lawyers remains low. It’s a popular fallacy to complain that there are too many lawyers. We don’t need fewer lawyers; we need more, so long as we gain them through competitive forces on a free market.

We need to unleash capitalism in the legal system for the benefit of everyone. We could start by eliminating the bar examination. Doing so would have no marked effect on the quality of lawyers. It would drive down the high costs of legal services by injecting the legal system with some much-needed competition. It would make practitioners out of the able and intelligent people who wanted to attend law school but were simply too prudent to waste three years of their lives and to take on tens of thousands of dollars of student-loan debt while entry-level legal jobs were scarce and entry-level legal salaries were low. Justifications for the bar examination are invariably predicated on paternalistic assumptions about the ability of ordinary people to choose qualified attorneys; such arguments ignore the number of ordinary people who, today, cannot afford qualified attorneys at all under the current anticompetitive system.

Abolishing the bar examination would benefit the very community it is supposed to protect: the lay public.

Troy Camplin Reviews “Napoleon in America,” a Novel by Shannon Selin

In America, American History, Arts & Letters, Book Reviews, Books, Creative Writing, History, Humanities, Novels, The Novel, Writing on August 20, 2014 at 8:45 am

Shannon Selin

Napoleon in America is a “what-if” historical novel that combines a variety of styles – epistolary, newspaper article, and regular novelistic narrative – to create a work that reads like a very well-written narrative of history. Given that the author is necessarily working with an entirely fictional world – one in which Napoleon escapes from St. Helena to the United States – the fact that she can create such an effect is quite remarkable. The reader is made to feel as if he or she is reading about actual historical events. Of particular note is the fact that Selin creates the impression that we are reading a Great Men History book, which makes it rather distinctive. As such, it is going against the direction in which historical studies have, themselves, gone.

Much contemporary history deals with everyday life, local histories, etc. But given that the protagonist of this novel, Napoleon, is the kind of person who is distinctly bored with everyday life – is too big for everyday life – we should not be surprised to find a story dominated by the overwhelming presence of the personality of Napoleon. It is perhaps for this very reason that the novel becomes involved in the great movements of Napoleon rather than the intimate details of his life. These aspects are touched on here and there, of course, but in the end, we remember Napoleon the Conqueror, not Napoleon the almost-died-when-he-got-to-America. Napoleon quickly recovers to dominate the novel with his personality. But this personality is not one changed by circumstances. He is the Napoleon we all love and loathe. He cannot settle down. He has to conquer.

Thus, with Selin’s novel, we have a complete inversion. The novel has, historically, dealt with everyday people in their everyday lives. The actions of most novelistic characters do not have a major impact on historical events. If we look at the way histories were written over the same period as the rise of the European novel (which includes American and Canadian literature and, stylistically, much literature written in the rest of the world during the 20th century), we primarily see the complete opposite: an interest in major figures and their major effects on history dominates most historical narratives of that era. However, we now see a shift within history toward the same kinds of concerns we see in novels: everyday people, the histories of institutions, local histories, etc. Thus, we should not be surprised to find novels picking up the kinds of narratives we once found in histories.

Along with the Big Men of the time, Selin deals with the Big Ideas of the time; of course, the Big Men are often the Big Men precisely because they discuss and try to enact the Big Ideas of their time. Liberalism and dictatorship and whether Napoleon is really a liberal or little better than the kings he likes to depose are discussed – as no doubt they were, in fact, discussed historically. We see some of the conflicts within French Liberalism – and some of the contradictions. Was it a mere coincidence that French Liberalism led to the Terror and to the Empire under Napoleon? Or was it simply bad luck? Pro- and anti-Napoleon liberals are unified in their opposition to the Bourbons, but the question is raised as to whether replacing one monarch with another is really an improvement. Yet, there seems a willingness, even among those who oppose Napoleon, to support revolution against the Bourbons, even if it results in another Napoleon (literally or figuratively). Along these lines, Selin does a magnificent job of showing how blinding the opposition to the Bourbons is in the decision by the French government to invade Spain. The King in fact opposes the invasion, but ends up being talked into it; the liberals believe the invasion is a Bourbon plot and evidence of his being a cruel dictator. The reality is more humdrum than the conspiracy theory the liberals are desperate to believe.

Overall, Selin’s book goes beyond what we would expect to find in a historical novel whose main character is a major historical figure. A traditional historical novel would have the characters doing all the major, public actions the history books tell us happened. Selin has to do something quite different. She has to first know what did in fact happen during the historical period in question; she then has to understand Napoleon well enough to understand what he might do in circumstances other than those in which he did, in fact, find himself; and then she has to create a realistic alternative to what did in fact happen, understanding the butterfly effects of a Napoleon in America. It is a garden of forking paths, and one can go in any number of directions. To this end, Selin is certainly effective in her choice of direction. The great uncertainty created by Napoleon’s presence in America is well demonstrated. The U.S. government does not seem to know what to do with him. We are, after all, talking about a young country still learning where it fits in the world. It has the benefit of being separated from Europe – where all the action lies – by a large ocean. But the action has come to America’s shores when Napoleon escapes St. Helena. The uncertainty that leaves Napoleon free to raise an army and wander into Texas is well within the realm of possibilities. As is the naïve belief by some – such as James Bowie – that Napoleon can be “handled.”

The majority of the novel is dominated by the spirit of uncertainty and worry. All the action comes in at the end of the novel, when Napoleon finally does invade Texas. And even then, we are left with a great deal of uncertainty. Napoleon has won a battle and established himself in San Antonio; however, we are left with the question of what will happen next. Napoleon in America has the feel of the first novel in a series. It would not surprise me if Napoleon in Texas were to follow. There is a great deal more to this story that could be explored. Will Napoleon be able to create a long-term presence in Texas? What will be the response of Mexico? What will be the response of the American government? What will be the response of the American settlers? Will the people of Kentucky and Tennessee volunteer to fight for Texas independence under Napoleon as they did for its independence under Austin? Is Napoleon just preparing the way for the Americans to take over, making it a bit easier than it was historically? Or is he perhaps making it a bit harder, since the Mexican government may regard Napoleon as a much more serious threat than those who wanted only an independent Texas?

For those who enjoy the What-If History genre, these are fun questions to consider. I find it hard to imagine that anyone who reads Napoleon in America – which should include most of those who enjoy historical fiction – would fail to want these questions answered in a sequel.

Troy Camplin holds a Ph.D. in humanities from the University of Texas at Dallas. He has taught English in middle school, high school, and college, and is currently taking care of his children at home. He is the author of Diaphysics, an interdisciplinary work on systems philosophy; other projects include the application of F.A. Hayek’s spontaneous order theory to ethics, the arts, and literature. His play “Almost Ithacad” won the PIA Award from the Cyberfest at Dallas Hub Theater.

Abolish the Bar Exam

In America, American History, History, Law on July 23, 2014 at 8:45 am


This article originally appeared here at LewRockwell.com and was reposted on this blog last year in July. I repost it here again this year for all those who are taking the bar exam this week and next week.

Every year in July, thousands of anxious men and women, in different states across America, take a bar exam in hopes that they will become licensed attorneys. Having memorized hundreds if not thousands of rules and counter-rules — also known as black letter law — these men and women come to the exam equipped with their pens, laptops, and government-issued forms of identification. Nothing could be further from their minds than the thought that the ideological currents that brought about this horrifying ritual were fundamentally statist and unquestionably bad for the American economy.

The bar exam is a barrier to entry, as are all forms of professional licensure. Today the federal government regulates thousands of occupations and excludes millions of capable workers from the workforce by means of expensive tests and certifications; likewise various state governments restrict upward mobility and economic progress by mandating that workers obtain costly degrees and undergo routinized assessments that have little to do with the practical, everyday dealings of the professional world.

As a practicing attorney, I can say with confidence that many paralegals I know can do the job of an attorney better than some attorneys, and that is because the practice of law is perfected not by abstract education but by lived experience.

So why does our society require bar exams that bear little relation to a person’s ability to understand legal technicalities, manage caseloads, and satisfy clients? The answer harks back to the Progressive Era, when elites pulled government strings and used their influence to prevent hardworking and entrepreneurial individuals from climbing the social ladder.

Lawyers were part of two important groups that Murray Rothbard blamed for spreading statism during the Progressive Era: the first was “a growing legion of educated (and often overeducated) intellectuals, technocrats, and the ‘helping professions’ who sought power, prestige, subsidies, contracts, cushy jobs from the welfare state, and restrictions of entry into their field via forms of licensing,” and the second was “groups of businessmen who, after failing to achieve monopoly power on the free market, turned to government — local, state, and federal — to gain it for them.”

The bar exam was merely one aspect of the growth of the legal system and its concomitant centralization in the early twentieth century. Bar associations began cropping up in the 1870s, but they were, at first, more like professional societies than state-sponsored machines. By 1900, all of that changed, and bar associations became a fraternity of elites opposed to any economic development that might threaten their social status. The elites who formed the American Bar Association (ABA), concerned that smart and savvy yet poor and entrepreneurial men might gain control of the legal system, sought to establish a monopoly on the field by forbidding advertising, regulating the “unauthorized” practice of law, restricting legal fees to a designated minimum or maximum, and scaling back contingency fees. The elitist progressives pushing these reforms also forbade qualified women from joining their ranks.

The American Bar Association was far from the only body of elites driving this trend. State bars began to rise and spread, but only small percentages of lawyers in any given state were members. The elites were straining to squeeze some justification out of their blatant discrimination and to strike a delicate balance between exclusivity on the one hand and an appearance of propriety on the other. They gave short shrift to the American Dream and began to require expensive degrees and education as a prerequisite for bar admission. It was at this time that American law schools proliferated and the Association of American Law Schools (AALS) was created to evaluate the quality of new law schools and to hold them to uniform standards.

At one time lawyers learned on the job; now law schools were tasked with training new lawyers, but the result was that lawyers’ real training was merely delayed until the date they could practice, and aspiring attorneys had to be wealthy enough to afford this delay if they wanted to practice at all.

Entrepreneurial forces attempted to fight back by establishing night schools to ensure a more competitive market, but the various bar associations, backed by the power of the government, simply dictated that law school was not enough: one had to first earn a college degree before entering law school if one were to be admitted to practice. Then two degrees were not enough: one had to pass a restructured, formalized bar exam as well.

Bar exams have been around in America since the eighteenth century, but before the twentieth century they were relaxed and informal and could be as simple as an interview with a judge. At the zenith of the Progressive Era, however, the bar exam had become an exclusive licensing mechanism for the government. It is not surprising that at this time bar associations became, in some respects, as powerful as the states themselves. That’s because bar associations were seen, as they are still seen today, as agents and instrumentalities of the state, even though their members were not, and are not, elected by the so-called public.

In our present era, hardly anyone thinks twice about the magnificent powers exercised and enjoyed by state bar associations, which are unquestionably the most unquestioned monopolies in American history. What profession other than law can claim to be entirely self-regulated? What profession other than law goes to such lengths to exclude new members and to regulate the industry standards of other professions?

Bar associations remain, on the whole, as progressive today as they were at their inception. Their calls for pro bono work and their bias against creditors’ attorneys, to name just two examples, are wittingly or unwittingly part of a greater movement to consolidate state power and to spread ideologies that increase dependence upon the state and “the public welfare.” It is rare indeed to find the rhetoric of personal responsibility or accountability in a bar journal. Instead, lawyers are reminded of their privileged and dignified station in life, and of their unique position in relation to “members of the public.”

The thousands of men and women who will sit for the bar exam this month are no doubt wishing they didn’t have to take the test. I wish they didn’t have to either; there should be no bar exam because such a test presupposes the validity of an authoritative entity to administer it. There is nothing magical about the practice of law; all who are capable of doing it ought to have a chance to do it. That will never happen, of course, if bar associations continue to maintain total control of the legal profession. Perhaps it’s not just the exam that should go.
