
British Origins of American Estate and Land Law

In American History, Britain, Economics, History, Humanities, Law, Property on September 17, 2014 at 8:45 am


Estates and land are the foundation of property law in England and the United States.  After the Battle of Hastings in 1066, when William the Conqueror, or William I, became the King of England, he recognized that land ownership was essential to the governance of his kingdom.  Announcing himself owner of all English lands, he distributed property to those loyal to him. The recipients became “tenants”; the rent was called “services”; knights performed “services” on behalf of the king, thus earning their honorific title and their rights to certain lands.

William bestowed a special designation, tenant in chief, to those who were offered more land than they could use.  Everything necessary to survive and flourish at this time came from the land: food, water, shelter, building supplies and other equipment, and mineral resources.  Tenants, therefore, would parcel out tracts of their land to others in exchange for fees and services.  Recipients of the parceled tracts would parcel out smaller tracts of land, and this process of parceling would continue until the people living on the land had no rights to the land.

The result was that ownership in tracts of land became known as freehold or nonfreehold.  Interests in freehold tracts included fee simple estates, fee tail estates, and life estates; interests in nonfreehold tracts included periodic tenancies, terms of years, and tenancies at will.  These six categories of land ownership and title remain with us today.

The Immunity Community

In America, American History, Arts & Letters, Britain, History, Humanities, Jurisprudence, Justice, Law, Libertarianism, Philosophy on September 10, 2014 at 8:45 am


This piece first appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

The doctrine of sovereign immunity derives from the English notion that “the king can do no wrong” and hence cannot be sued without his consent. The purpose of this doctrine was, in England, from at least the Middle Ages until the eighteenth century, to bar certain lawsuits against the monarch and his or her ministers and servants. With the rise of the English Parliament after the death of Elizabeth I, government officers and politicians sought to gain the power of immunity that the monarch and his or her agents had enjoyed.

In practice, however, English subjects were not totally deprived of remedies against the monarch or the government. The doctrine of sovereign immunity was not an absolute prohibition on actions against the crown or against other branches of government;[1] subjects could avail themselves of petitions of right or writs of mandamus, for instance, and monarchs fearful of losing the support of the people would often consent to be sued.

It was not until the monarchy had been demonstrably weakened that the doctrine of sovereign immunity began to be espoused with added urgency and enforced with added zeal. In the late eighteenth century, Sir William Blackstone intoned in his Commentaries on the Laws of England that the king “is not only incapable of doing wrong, but even of thinking wrong: he can never mean to do an improper thing: in him is no folly or weakness.” These lines convert sovereign immunity into sovereign infallibility, a more ominous yet more dubious pretension.

Once the monarchy had been abolished altogether, the idea that the sovereign had to consent to be sued no longer held credence. As Louis L. Jaffe explains, “Because the King had been abolished, the courts concluded that where in the past the procedure had been by petition of right there was now no one authorized to consent to suit! If there was any successor to the King qua sovereign it was the legislature,” which, having many members subject to differing constituencies, was not as accountable as the monarch had been to the parties seeking to sue.[2]

The principle of sovereign immunity carried over from England to the United States, where most states have enshrined in their constitutions an absolute bar against suing the State or its agencies and officers whose actions fall within the scope of official duties. The Eleventh Amendment to the U.S. Constitution likewise states that “the Judicial power of the United States shall not be construed to extend to any suit in law or equity, commenced or prosecuted against one of the United States by Citizens of another State, or by Citizens or Subjects of any Foreign State.” This provision, which applies only in federal courts and which does not on its face prohibit a lawsuit against a state by a citizen of that same state, was adopted in response to the ruling in Chisholm v. Georgia (1793), a case that held sovereign immunity to have been abrogated and that vested in federal courts the authority to preside over disputes between private citizens and state governments.

Notwithstanding the complex issues of federalism at play in the Chisholm decision and in the Eleventh Amendment, the fact remains that the doctrine of sovereign immunity has been applied with widening scope and frequency since the states ratified the Eleventh Amendment in 1795. The U.S. Supreme Court has contributed to the doctrine’s flourishing. “The Supreme Court’s acceptance of sovereign immunity as constitutional principle,” explains one commentator, “depends on its determination of the intent of the Framers, which ignores a great deal of historical evidence from the time of the founding and relies primarily on a discredited account of the Eleventh Amendment first articulated in the 1890 case of Hans v. Louisiana.”[3]

State and federal courts have now built an impregnable wall of immunity around certain state and federal officers. The sovereign immunity that is enshrined in state constitutions is, in theory, not absolute because it is conferred only on certain agents and officers and does not prohibit lawsuits to enjoin such agents and officers from performing unconstitutional or other bad acts. In practice, however, the growth of qualified immunities, which is in keeping with the growth of government itself, has caused more and more agents of the State to cloak themselves in immunity.

Bus drivers, teachers, coroners, constables, high school coaches, doctors and nurses at university hospitals, security guards, justices of the peace, government attorneys, legislators, mayors, boards of education and health, university administrators, Indian reservations, prison guards and wardens, police officers and detectives, janitors in government facilities, licensing boards, tax assessors, librarians, railroad workers, government engineers, judges and justices, school superintendents and principals, towing companies, health inspectors, probation officers, game wardens, museum docents and curators, social workers, court clerks, dog catchers, contractors for public utilities, public notaries, tollbooth attendants, airport traffic controllers, park rangers, ambulance drivers, firefighters, telephone operators, subway workers, city council members, state auditors, agricultural commissioners—all have sought to establish for themselves, with mixed degrees of success, the legal invincibility that comes with being an arm of the state.

Yet the idea that “the king can do no wrong” makes no sense in a governmental system that has lacked a king from its inception. Its application as law has left ordinary citizens with limited recourse against governments (or against people claiming governmental status for the purpose of immunity) that have committed actual wrongs. When the government, even at the state level, consists of vast bureaucracies of the kind that exist today, the doctrine of sovereign immunity becomes absurd. If it is true that in nine states and in the District of Columbia the government employs more than 20% of all workers, imagine how many people are eligible to claim immunity from liability for their tortious conduct and bad acts committed on the job.

Local news reports are full of stories about government employees invoking the doctrine of sovereign immunity; few such stories find their way into the national media. Judge Wade McCree of Michigan, for instance, recently carried out an affair with a woman who was a party in a child-support case on his docket, having sexual intercourse with her in his chambers and “sexting” her even on the day she appeared as a witness in his courtroom. Although McCree was removed from office, he was immune from civil liability. An airport in Charleston, West Virginia, is invoking the doctrine of immunity to shield itself from claims that it contributed to a chemical spill that contaminated the water supply. Officer Darren Wilson may be entitled to immunity for the shooting of Michael Brown, depending on how the facts unfold in that investigation.

The U.S. Supreme Court once famously declared that the doctrine of sovereign immunity “has never been discussed or the reasons for it given, but it has always been treated as an established doctrine.”[4] A disestablishment is now in order. The size and scope of government is simply too massive on the state and national level to sustain this doctrine that undermines the widely held belief of the American Founders that State power must be limited and that the State itself must be held accountable for its wrongs. Friedrich Hayek pointed out that the ideal of the rule of law requires the government to “act under the same law” and to “be limited in the same manner as any private person.”[5] The doctrine of sovereign immunity stands in contradistinction to this ideal: it places an increasing number of individuals above the law.

If the law is to be meaningful and just, it must apply equally to all persons and must bind those who enforce it. It must not recognize and condone privileges bestowed upon those with government connections or incentivize bad behavior within government ranks. Sovereign immunity is a problem that will only worsen if it is not addressed soon. The king can do wrong, and so can modern governments. It’s time for these governments to be held accountable for the harms they produce and to stop hiding behind a fiction that was long ago discredited.

________

[1] See generally Louis L. Jaffe, “Suits Against Governments and Officers: Sovereign Immunity,” 77 Harvard Law Review 1 (1963).

[2] Jaffe at 2.

[3] Susan Randall, “Sovereign Immunity and the Uses of History,” 81 Nebraska Law Review 1, 4 (2002–03).

[4] U.S. v. Lee, 106 U.S. 196, 207 (1882).

[5] F. A. Hayek, The Constitution of Liberty, Vol. 17 of The Collected Works of F. A. Hayek, ed. Ronald Hamowy (Routledge, 2011), p. 318.

Are Lawyers Illiterate?

In Arts & Letters, Books, Essays, History, Humanities, Imagination, Law, Literature, Philosophy, Western Civilization, Western Philosophy on September 3, 2014 at 8:45 am


This piece originally appeared here in The Imaginative Conservative.

Webster’s defines “intelligent” as “endowed with intelligence or intellect; possessed of, or exhibiting, a high or fitting degree of intelligence or understanding.” This modern understanding of “intelligence” as an innate disposition or propensity differs from earlier understandings of the word as meaning “versed” or “skilled.” Milton, for instance, in Paradise Lost, calls the eagle and the stork “intelligent of seasons,” by which he meant that these birds, because of their experience, were cognizant of the seasons.

The older meaning of “intelligent” has less to do with native endowment than it does with gradual understanding. The older meaning, in other words, is that intelligence is acquired by effort and exposure rather than fixed by biological inheritance or natural capacity: one may become intelligent and is not just born that way; intelligence is a cultivated faculty, not an intrinsic feature.

Because of the altered signification of “intelligent,” we use today different words to describe the older meaning: erudite, knowledgeable, informed, traveled, educated. These words seem to us more palatable than their once-favored predecessors: civilized, polished, cultured, genteel, refined. I myself prefer words like “lettered” or “versed” that imply a knowledge of important books and the humanities generally.

The most apt term in this regard is also the most butchered in the current lexicon: “literate.” Contrary to what appears to be the prevailing assumption, “literate” does not simply refer to an ability to read. According to Webster’s, “literate” means “instructed in letters, educated; pertaining to, or learned in, literature.”

Not just to read, but to read well and widely—that is how you become “literate.” Accepting this traditional meaning, I question how many lawyers are or can become literate.

In the 1980s, Ithiel de Sola Pool, a professor of communications and media, determined that the average American adult reads approximately 240 words per minute. At that rate, it would take a person around 2,268.36 minutes (or 37 hours, 48 minutes, and 21.6 seconds) to read War and Peace, which comes in at 544,406 words. If that sounds encouraging—ever wanted to read War and Peace in a day-and-a-half?—consider these offsetting variables: reading at one sitting slows over time; attention span and memory recall are limited; the mind can be exercised only so much before it requires rest; people cannot constantly read for 2,268.36 minutes without going to the restroom or eating or daydreaming, among other things; a healthy lifestyle entails seven to nine hours of sleep per day; large portions of the day are spent carrying out quotidian operations, including showering, cooking, brushing teeth, commuting to and from work, getting dressed and undressed, answering phone calls, reading emails, cleaning, filling out paperwork, paying bills, and so on. Pool, moreover, was not using a text like War and Peace to gather his data, and his subjects were not writing in the margins of their books, taking notes on their laptops, or pausing to engage others in critical conversations about some narrative.

The National Association for Legal Career Professionals has estimated that lawyers at large firms bill on average 1,859 hours per year and work 2,208 hours per year. These numbers are more troubling in view of the fact that large law firms require their attorneys to attend functions with clients and potential clients, time that is neither billable nor considered “working hours.”

If there are around 8,760 hours in a year, and if a healthy person spends about 2,920 of those sleeping, there remain only around 5,840 hours per year for everything else. If “everything else” consisted of nothing—nothing at all—except reading War and Peace, then a lawyer at a large law firm could read that book about 154 times a year. But of course this is not possible, because no person can function as a machine functions. Once the offsetting variables are accounted for—and I have listed only a few that immediately spring to mind, and these for people with no families—it becomes apparent that it is nearly impossible for a lawyer to read more than about four lengthy or difficult books each month, and only the most diligent and disciplined can accomplish that.
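The arithmetic behind these two paragraphs can be verified with a short back-of-the-envelope script; the inputs (Pool’s 240 words per minute, the 544,406-word count for War and Peace, and eight hours of sleep per night) are simply the figures quoted above:

```python
# Figures quoted in the essay.
WORDS_IN_WAR_AND_PEACE = 544_406
WORDS_PER_MINUTE = 240            # Pool's average adult reading rate

minutes_to_read = WORDS_IN_WAR_AND_PEACE / WORDS_PER_MINUTE
hours_to_read = minutes_to_read / 60
print(f"{minutes_to_read:.2f} minutes ({hours_to_read:.1f} hours) to read the novel")

HOURS_PER_YEAR = 24 * 365         # 8,760
SLEEP_HOURS = 8 * 365             # 2,920, at eight hours per night
waking_hours = HOURS_PER_YEAR - SLEEP_HOURS   # 5,840

readings_per_year = waking_hours / hours_to_read
print(f"about {readings_per_year:.0f} complete readings per year")
```

Running it confirms the essay’s 2,268.36 minutes and roughly 154 theoretical readings per year, before any of the offsetting variables are subtracted.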

Numbers can lead us astray, so let us consider some anecdotal evidence—my own testimony—which suggests that most lawyers are illiterate, or perhaps that lawyers have to try really hard to become literate or to avoid losing their literacy.

I am a lawyer, one who considers himself literate but increasingly in danger of becoming illiterate the longer I remain in my chosen profession. My hope is that literacy stays with you, that if you “frontload,” as it were, you can build a wide enough base to allow for slack in later years.

In 2013, I made an effort to overcome the time restrictions of my job to read through several canonical texts of Western Civilization. For the most part I undertook a book a week, although, because of scheduling constraints, I read what I took to be the most important or most famous sections of the lengthier books and volumes such as Aquinas’s Summa Theologica, a work that would require years of study to fully appreciate. I found myself, on many Thursday evenings, reading so rapidly to finish the text at hand that I could not enjoy myself or absorb the nuances and complexities established by the author.

Reading only one book a week when you are intelligent enough to read more is shameful and disgraceful, the sacrifice of a gift. During graduate school, I could read five or six books a week and can recall more than one week when I read a book a day. But each day I spend working as a lawyer, I am less able to digest the books I consume and to consume the books necessary for intellectual nourishment.

Economists use the term “opportunity cost” to refer to the value of what one forgoes by pursuing one course of action rather than another. The cost of becoming a lawyer is giving up literacy, or making its attainment more difficult; the gain, in theory, is a higher salary and financial stability. Whether the gain neutralizes the loss depends on one’s preferences. I myself would not trade for a million dollars the opportunity to read Tolstoy or Shakespeare or Aristotle or Santayana.

To achieve the admiration enjoyed by lawyers, other professionals must do their jobs several times better. Happily, this is not a high bar. That is why people prefer the company of doctors. It is not that lawyers are incompetent or unskilled; it is that they do not put their faculties to good use. All people think, but it is only by degree and by the object of their thought that the literate are distinguished from the illiterate. To put their minds to humane use would improve lawyers’ reputations considerably and call into question that axiom popularized by one of Dickens’s characters: “If there were no bad people, there would be no good lawyers.”

The way I see it, you can spend all your life billing clients and pushing paper under great stress, by investing your talents and resources in prospects that yield no intellectual returns, or you can spend your life establishing high standards of reason, understanding, and creativity by studying the most important and influential works that humans have produced through the ages. You can spend all your time transacting business, prosecuting and defending lawsuits, and preparing briefs and memoranda, or you can cultivate discernment and understanding. The options are not mutually exclusive: I have overstated to draw a sharp contrast, but the point remains.

Do not misunderstand me: working hard and earning profits are not only good and healthy activities but personally fulfilling. Yet they must be supplemented with humane contemplation and the private study of important ideas. Industry and innovation are requisite to a high quality of life, a robust economy, and human flourishing—and they make possible the time and leisure that enable some people to create great art and literature. Not everyone can be literate, and that is a good thing.

It is just that many lawyers never learn to live well and wisely, to place their seemingly urgent matters into perspective, or to appreciate, as Aristotle did, the virtues of moderation. This failure is directly related to lawyers’ neglect of history and philosophy and to their suppression of the moral imagination that works of good literature can awaken. This failure, as well, puts lawyers at a distinct disadvantage when it comes to spiritual, moral, and intellectual pursuits. As Mark Twain quipped, “The man who does not read good books has no advantage over the man who cannot read them.”

Lawyers are illiterate, most of them anyway. Trust them to handle your real estate closings or to manage your negligence claims, to finalize your divorce or to dash off angry letters to your competitors, but do not trust them to instruct you on plain living and high thinking. There are exceptions—Gerald Russello and Daniel Kornstein are two—but generally lawyers are not to be consulted on matters of importance to the soul. For those, we have good books, and with luck, the people who write and read them.

The Lawyers’ Guild

In America, American History, History, Law, Legal Education & Pedagogy, Nineteenth-Century America on August 27, 2014 at 8:45 am


This piece originally appeared here as a Mises Emerging Scholar article for the Ludwig von Mises Institute Canada.

Last month, thousands of recent law school graduates sat for a bar examination in their chosen state of practice. They were not undertaking a harmless rite of passage but overcoming a malicious obstacle: an artificial barrier to entry in the form of occupational licensure.

Barriers to entry are restrictions on access to, or participation in, markets or vocations. Occupational licensure is a type of barrier to entry that regulates professions by requiring certification and licensing in the manner of medieval guilds. Medicine and law are perhaps the most recognizable professions to require their practitioners to obtain and maintain licenses.

The purpose of occupational licensure is to reduce competition by using government power to restrict membership eligibility in a profession. The criteria for membership are often prohibitively expensive for low-income earners. To be admitted to the law in nearly every state in the United States, you must not only pass a bar examination but also earn a law degree from an accredited law school, admission to which requires a bachelor’s degree from an accredited university.

The average student-loan debt for graduates of American colleges is around $29,400. The average student-loan debt for graduates of American law schools is between $75,700 and $125,000, depending on whether the school is public or private. The American Bar Association imposes heavy burdens on law schools, such as inefficient accreditation standards that drive up costs; over time, the high price of legal education is passed on to the public in the form of attorneys’ fees and costs. Having already saddled themselves with student-loan debt, recent law-school graduates pay thousands of dollars for bar-preparation courses to study for an examination that, if passed, will open the door to a job market that is the worst in recent memory. Nobody struggling financially should attempt to leap over each of these expensive hurdles.

Before the rise of bar examinations and professional licensure during the Progressive Era in the United States, aspiring attorneys simply “read law” as apprentices for practicing attorneys or as clerks for local law firms. Once they achieved a certain level of competence, apprentices were released from their tutelage and became eligible to accept clients. Those jurisdictions that did require examinations allowed judges to conduct informal interviews with candidates to determine the candidates’ moral and intellectual fitness for practice. Such examinations were typically mere formalities: few candidates failed, and no career hung in the balance during the interview. Newly admitted attorneys had to demonstrate their excellence in order to gain clients. They launched their careers by charging low fees that even the poorest in society could pay. Attorneys who did not prove fit for practice never gained enough clients to sustain their business and were forced to embark on other professions.

In the late nineteenth and early twentieth centuries, energetic and entrepreneurial members of the middle and lower classes in cities such as New York and Chicago began to threaten a legal establishment that had previously been a mostly wealthy and elite fraternity. This fraternity simply could not compete with low-cost providers of legal services because, for example, the most elite attorneys considered it unseemly and degrading to advertise their services or to offer contingency fees. Bar associations that were once voluntary organizations of upper-class professionals therefore began to use their political clout and government connections to obtain powers conferred by legislatures. They wanted to keep the lower classes out of their profession and to preserve a highbrow reputation for lawyers. They began to exercise monopolistic control over the practice of law within their respective jurisdictions. Today they constitute authorized arms of the State.

In most jurisdictions, bar associations determine who may be admitted as members and who must be excluded, whether and to what extent lawyers may advertise their services, what constitutes the “authorized” practice of law, whether a law firm must have a physical office with a non-residential mailing address, and under what conditions contingency fees are permissible. These anti-competitive practices hit communities most in need the hardest by increasing the costs of legal services beyond the ordinary person’s ability to pay.

The bar examination is the most hyped precondition for membership in a state bar association. Like hazing, it is more ritual than training; it does not help one learn to be an attorney or indicate any requisite skills for practice. It tests how well someone can memorize arcane and esoteric rules and their trivial exceptions, many of which have no bearing on actual practice. Few if any lawyers spend their days memorizing rules for courts or clients, and no one who intends to practice, say, corporate law in a big city needs to memorize obscure criminal law rules that were long ago superseded by statute.

Despite reciprocity among some states, the bar examination restricts the free flow of qualified attorneys across state lines, forcing even the best attorneys to limit their services to certain jurisdictions. The bar examination also creates racial disparities among practicing attorneys as minority passage rates tend to be lower, a fact that flies in the face of nearly every bar association’s purported commitment to diversity.

Keeping the number of lawyers low ensures that lawyers may charge higher fees. Keeping the barriers to entry high ensures that the number of lawyers remains low. It’s a popular fallacy to complain that there are too many lawyers. We don’t need fewer lawyers; we need more, so long as we gain them through competitive forces on a free market.

We need to unleash capitalism in the legal system for the benefit of everyone. We could start by eliminating the bar examination. Doing so would have no marked effect on the quality of lawyers. It would drive down the high costs of legal services by injecting the legal system with some much-needed competition. It would make practitioners out of the able and intelligent people who wanted to attend law school but were simply too prudent to waste three years of their lives and to take on tens of thousands of dollars of student-loan debt while entry-level legal jobs were scarce and entry-level legal salaries were low. Justifications for the bar examination are invariably predicated on paternalistic assumptions about the ability of ordinary people to choose qualified attorneys; such arguments ignore the number of ordinary people who, today, cannot afford qualified attorneys at all under the current anticompetitive system.

Abolishing the bar examination would benefit the very community it is supposed to protect: the lay public.

Troy Camplin Reviews “Napoleon in America,” a Novel by Shannon Selin

In America, American History, Arts & Letters, Book Reviews, Books, Creative Writing, History, Humanities, Novels, The Novel, Writing on August 20, 2014 at 8:45 am

Shannon Selin

Napoleon in America is a “what-if” historical novel that combines a variety of styles – epistolary, newspaper article, and regular novelistic narrative – to create a work that reads like a very well-written narrative of history. Given that the author is necessarily working with an entirely fictional world – one in which Napoleon escapes from St. Helena to the United States – the fact that she can create such an effect is quite remarkable. The reader is made to feel as if he or she is reading about actual historical events. Of particular note is the fact that Selin creates the impression that we are reading a Great Men History book, which makes it rather distinctive. As such, it is going against the direction in which historical studies have, themselves, gone.

Much contemporary history deals with everyday life, local histories, etc. But given that the protagonist of this novel, Napoleon, is the kind of person who is distinctly bored with everyday life – is too big for everyday life – we should not be surprised to find a story dominated by the overwhelming presence of the personality of Napoleon. It is perhaps for this very reason that the novel becomes involved in the great movements of Napoleon rather than the intimate details of his life. These aspects are touched on here and there, of course, but in the end, we remember Napoleon the Conqueror, not Napoleon the almost-died-when-he-got-to-America. Napoleon quickly recovers to dominate the novel with his personality. But this personality is not one changed by circumstances. He is the Napoleon we all love and loathe. He cannot settle down. He has to conquer.

Thus, with Selin’s novel, we have a complete inversion. The novel has, historically, dealt with everyday people in their everyday lives; the actions of most novelistic characters do not have a major impact on historical events. Histories written over the same period as the rise of the European novel (which includes American and Canadian literature and, stylistically, much literature written in the rest of the world during the twentieth century) show the complete opposite: an interest in major figures and their major effects dominates most historical narratives. More recently, however, history has shifted toward the same kinds of concerns we see in novels: everyday people, the histories of institutions, local histories, etc. Thus, we should not be surprised to find novels picking up the kinds of narratives we once found in histories.

Along with the Big Men of the time, Selin deals with the Big Ideas of the time; of course, the Big Men are often the Big Men precisely because they discuss and try to enact the Big Ideas of their time. Liberalism and dictatorship and whether Napoleon is really a liberal or little better than the kings he likes to depose are discussed – as no doubt they were, in fact, discussed historically. We see some of the conflicts within French Liberalism – and some of the contradictions. Was it a mere coincidence that French Liberalism led to the Terror and to the Empire under Napoleon? Or was it simply bad luck? Pro- and anti-Napoleon liberals are unified in their opposition to the Bourbons, but the question is raised as to whether replacing one monarch with another is really an improvement. Yet, there seems a willingness, even among those who oppose Napoleon, to support revolution against the Bourbons, even if it results in another Napoleon (literally or figuratively). Along these lines, Selin does a magnificent job of showing how blinding the opposition to the Bourbons is in the decision by the French government to invade Spain. The King in fact opposes the invasion, but ends up being talked into it; the liberals believe the invasion is a Bourbon plot and evidence of his being a cruel dictator. The reality is more humdrum than the conspiracy theory the liberals are desperate to believe.

Overall, Selin’s book goes beyond what we would expect to find in a historical novel whose main character is a major historical figure. A traditional historical novel would have the characters doing all the major, public actions the history books tell us happened. Selin has to do something quite different. She has to first know what did in fact happen during the historical period in question; she then has to understand Napoleon well enough to understand what he might do in circumstances other than those in which he did, in fact, find himself; and then she has to create a realistic alternative to what did in fact happen, understanding the butterfly effects of a Napoleon in America. It is a garden of forking paths, and one can go in any number of directions. To this end, Selin is certainly effective in her choice of direction. The great uncertainty created by Napoleon’s presence in America is well demonstrated. The U.S. government does not seem to know what to do with him. We are, after all, talking about a young country still learning where it fits in the world. It has the benefit of being separated from Europe – where all the action lies – by a large ocean. But the action has come to America’s shores when Napoleon escapes St. Helena. The uncertainty that leaves Napoleon free to raise an army and wander into Texas is well within the realm of possibilities. As is the naïve belief by some – such as James Bowie – that Napoleon can be “handled.”

The majority of the novel is dominated by the spirit of uncertainty and worry. All the action comes at the end of the novel, when Napoleon finally does invade Texas. And even then, we are left with a great deal of uncertainty. Napoleon has won a battle and established himself in San Antonio; however, we are left with the question of what will happen next. Napoleon in America has the feeling of the first novel in a series. It would not surprise me if Napoleon in Texas were to follow. There is a great deal more to this story that could be explored. Will Napoleon be able to create a long-term presence in Texas? What will be the response of Mexico? What will be the response of the American government? What will be the response of the American settlers? Will the people of Kentucky and Tennessee volunteer to fight for Texas independence under Napoleon as they did for its independence under Austin? Is Napoleon just preparing the way for the Americans to take over, making it a bit easier than it was historically? Or is he perhaps making it a bit harder, since a Mexican government may take Napoleon as a much more serious threat to the government of Mexico than those who only wanted an independent Texas?

For those who enjoy the What-If History genre, these are fun questions to consider. I find it hard to imagine that anyone who reads Napoleon in America – which should include most of those who enjoy historical fiction – would fail to want these questions answered in a sequel.

Troy Camplin holds a Ph.D. in humanities from the University of Texas at Dallas. He has taught English in middle school, high school, and college, and is currently taking care of his children at home. He is the author of Diaphysics, an interdisciplinary work on systems philosophy; other projects include the application of F.A. Hayek’s spontaneous order theory to ethics, the arts, and literature. His play “Almost Ithacad” won the PIA Award from the Cyberfest at Dallas Hub Theater.

The Life of Julius Porter Farish

In American History, American Literature, Arts & Letters, History, Southern History, The South on August 13, 2014 at 8:45 am

Paul H. Fry on “Linguistics and Literature”

In Academia, Arts & Letters, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Rhetoric, Rhetoric & Communication, Semiotics, Teaching, The Academy on August 6, 2014 at 8:45 am

Below is the seventh installment in the lecture series on literary theory and criticism by Paul H. Fry.  The previous lectures are here, here, here, here, here, here, here, and here.

Pantry, 1982

In Arts & Letters, Creative Writing, Humanities, Poetry, Writing on July 30, 2014 at 8:45 am

Allen 2

 

This poem first appeared in The Echo.

 

A box of cereal, stale, ants running

Up the side, two brown bananas that

 

He says cleanse the pores

(If rubbed thoroughly),

 

An unwrapped chocolate bar

And a plethora of cans, unopened:

 

In a locked pantry, Little Maddy sits

Plucking the stems

 

Off Granny-Smiths. Just ten more

Minutes. Maddy, weary, wondering

 

Just when daddy would come home.

Time: the pantry is unlocked

 

And out comes light

And apples and, lastly, Maddy.

 

Daddy reaches

For the two rotting bananas,

 

Notes can upon unopened can,

Unwraps the chocolate bar,

 

Smears chocolate on his fingers,

Stops, thinks how unlikely it is

 

For apples to lose their stems.



Abolish the Bar Exam

In America, American History, History, Law on July 23, 2014 at 8:45 am

Allen 2

This article originally appeared here at LewRockwell.com and was reposted on this blog in July of last year.  I repost it here again this year for all those who are taking the bar exam this week and next week.

Every year in July, thousands of anxious men and women, in different states across America, take a bar exam in hopes that they will become licensed attorneys. Having memorized hundreds if not thousands of rules and counter-rules — also known as black letter law — these men and women come to the exam equipped with their pens, laptops, and government-issued forms of identification. Nothing is further from their minds than the fact that the ideological currents that brought about this horrifying ritual were fundamentally statist and unquestionably bad for the American economy.

The bar exam is a barrier to entry, as are all forms of professional licensure. Today the federal government regulates thousands of occupations and excludes millions of capable workers from the workforce by means of expensive tests and certifications; likewise, various state governments restrict upward mobility and economic progress by mandating that workers obtain costly degrees and undergo routinized assessments that have little to do with the practical, everyday dealings of the professional world.

As a practicing attorney, I can say with confidence that many paralegals I know can do the job of an attorney better than some attorneys, and that is because the practice of law is perfected not by abstract education but by lived experience.

So why does our society require bar exams that bear little relation to a person’s ability to understand legal technicalities, manage caseloads, and satisfy clients? The answer harkens back to the Progressive Era, when elites used government strings and influence to prevent hardworking and entrepreneurial individuals from climbing the social ladder.

Lawyers were part of two important groups that Murray Rothbard blamed for spreading statism during the Progressive Era: the first was “a growing legion of educated (and often overeducated) intellectuals, technocrats, and the ‘helping professions’ who sought power, prestige, subsidies, contracts, cushy jobs from the welfare state, and restrictions of entry into their field via forms of licensing,” and the second was “groups of businessmen who, after failing to achieve monopoly power on the free market, turned to government — local, state, and federal — to gain it for them.”

The bar exam was merely one aspect of the growth of the legal system and its concomitant centralization in the early twentieth century. Bar associations began cropping up in the 1870s, but they were, at first, more like professional societies than state-sponsored machines. By 1900, all of that changed, and bar associations became a fraternity of elites opposed to any economic development that might threaten their social status. The elites who formed the American Bar Association (ABA), concerned that smart and savvy yet poor and entrepreneurial men might gain control of the legal system, sought to establish a monopoly on the field by forbidding advertising, regulating the “unauthorized” practice of law, restricting legal fees to a designated minimum or maximum, and scaling back contingency fees. The elitist progressives pushing these reforms also forbade qualified women from joining their ranks.

The American Bar Association was far from the only body of elites driving this trend. State bars began to rise and spread, but only small percentages of lawyers in any given state were members. The elites strained to squeeze some justification out of their blatant discrimination and to strike a delicate balance between exclusivity on the one hand and an appearance of propriety on the other. They gave short shrift to the American Dream and began to require expensive degrees and education as a prerequisite for bar admission. It was at this time that American law schools proliferated and the Association of American Law Schools (AALS) was created to evaluate the quality of new law schools as well as to hold them to uniform standards.

At one time lawyers learned on the job; now law schools were tasked with training new lawyers, but the result was that lawyers’ real training was merely delayed until the date they could practice, and aspiring attorneys had to be wealthy enough to afford this delay if they wanted to practice at all.

Entrepreneurial forces attempted to fight back by establishing night schools to ensure a more competitive market, but the various bar associations, backed by the power of the government, simply dictated that law school was not enough: one had to first earn a college degree before entering law school if one were to be admitted to practice. Then two degrees were not enough: one had to pass a restructured, formalized bar exam as well.

Bar exams have been around in America since the eighteenth century, but before the twentieth century they were relaxed and informal and could be as simple as an interview with a judge. By the zenith of the Progressive Era, however, they had become exclusionary licensing instruments of the government. It is not surprising that at this time bar associations became, in some respects, as powerful as the states themselves. That is because bar associations were seen, as they still are today, as agents and instrumentalities of the state, even though their members were not, and are not, elected by the so-called public.

In our present era, hardly anyone thinks twice about the magnificent powers exercised and enjoyed by state bar associations, which are unquestionably the most unquestioned monopolies in American history. What profession other than law can claim to be entirely self-regulated? What profession other than law can go to such lengths to exclude new membership and to regulate the industry standards of other professions?

Bar associations remain, on the whole, as progressive today as they were at their inception. Their calls for pro bono work and their bias against creditors’ attorneys, to name just two examples, are wittingly or unwittingly part of a greater movement to consolidate state power and to spread ideologies that increase dependence upon the state and “the public welfare.” It is rare indeed to find the rhetoric of personal responsibility or accountability in a bar journal. Instead, lawyers are reminded of their privileged and dignified station in life, and of their unique position in relation to “members of the public.”

The thousands of men and women who will sit for the bar exam this month are no doubt wishing they didn’t have to take the test. I wish they didn’t have to either; there should be no bar exam because such a test presupposes the validity of an authoritative entity to administer it. There is nothing magical about the practice of law; all who are capable of doing it ought to have a chance to do it. That will never happen, of course, if bar associations continue to maintain total control of the legal profession. Perhaps it’s not just the exam that should go.

Paul H. Fry on “Semiotics and Structuralism”

In Arts & Letters, Books, Communication, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Rhetoric, Scholarship, Semiotics, Teaching, The Academy, Western Philosophy, Writing on July 16, 2014 at 8:45 am

Below is the seventh installment in the lecture series on literary theory and criticism by Paul H. Fry.  The previous lectures are here, here, here, here, here, here, and here.