

Thoughts on ‘The Road to Serfdom’: Chapter 1, “The Abandoned Road”

In Arts & Letters, Austrian Economics, Book Reviews, Books, Britain, Economics, Epistemology, Essays, Ethics, Historicism, History, Humane Economy, Humanities, Liberalism, Libertarianism, Literary Theory & Criticism, Literature, Modernism, Philosophy, Politics, Pragmatism, Western Civilization, Western Philosophy on September 11, 2013 at 7:45 am

Slade Mendenhall

Slade Mendenhall is an M.Sc. candidate in Comparative Politics at the London School of Economics, with specializations in conflict and Middle Eastern affairs. He holds degrees in Economics and Mass Media Arts from the University of Georgia and writes for The Objective Standard and themendenhall.com, where he is also editor.

This analysis is the second installment in a series of chapter analyses of Friedrich Hayek’s The Road to Serfdom. The previous analysis of Hayek’s introduction can be found here.

If Hayek’s introduction gave us a brief summary of the ideas and practices he sets out to oppose and contextualized the progression toward a socialist political culture over the last half century of Europe’s history, his first chapter, “The Abandoned Road”, firmly roots his grievances in the present, in the problems facing England at the time of his writing, and seeks to explain how England (and the West more generally) arrived there. He describes the intellectual evasions, distortions, and faulty epistemology—often consisting of poorly defined key concepts—that led to and, in his time, were perpetuating the state of affairs he observes. He then proceeds to the subject of liberalism, arguing that socialists who misconceive their own system misconceive its antithesis at least as much. In the process, Hayek makes many excellent observations, but he also succumbs to several dangerous philosophical errors and unsubstantiated claims against laissez-faire capitalism that tarnish what might otherwise be an outstanding defense against government controls.

Hayek begins the chapter with one of the most argumentatively powerful, poignant approaches that one can take in opposing socialist ideas: illustrating to those who support more moderate, tempered versions of statist controls that though they may differ in degree from those statists they oppose, the philosophical fundamentals they advocate are the same. “We all are, or at least were until recently, certain of one thing,” he writes,

“that the leading ideas which during the last generation have become common to most people of goodwill and have determined the major changes in our social life cannot have been wrong. We are ready to accept almost any explanation of the present crisis of our civilisation except one:  that the present state of the world may be the result of genuine error on our own part, and that the pursuit of some of our most cherished ideals have apparently produced utterly different results from those which we expected” (8).

Hayek’s point is well made and was much needed at a time when such widespread, utter contradictions were even more severe than they are today. He was writing to Britons in the 1940s, but he had as much truth to offer Americans who stumbled over the same contradictions in the 1960s and 1970s, when the platitude “we are all socialists now” manifested on Nixon’s lips as “we are all Keynesians now” (and with less fundamental difference between the two than Keynesians would have you believe). He asks us to recognize that “the tendencies which have culminated in the creation of the totalitarian systems were not confined to the countries which have succumbed to them” (8-9). Nor, for that matter, are they confined to those times, and Hayek’s message to this effect—the importance of recognizing the same fundamental ideas across contexts—is as much needed today as it was then.

He goes on to recognize that the conflict between the Axis and Allied powers in World War II is fundamentally a conflict of ideas: “The external conflict is a result of a transformation of European thought in which others have moved so much faster as to bring them into irreconcilable conflict with our ideals, but which has not left us unaffected.” He is quick to point out, though, that “the history of these countries in the years before the rise of the totalitarian system showed few features with which we are not familiar” (9).

Such an appreciation for the motive power of ideas in human conflict was not so unique in Hayek’s time. In fact, the Allied leaders explicitly acknowledged the enemy they faced as “fascism” and condemned it as such (though the economic and social policies of FDR, along with his earlier overt flirtations with such ideas, may have made the condemnation somewhat ironic). If Hayek has a lesson to teach to this effect, it is most needed in today’s world, when the significance of philosophy is so frequently cast aside by the influences of multiculturalist nihilism and the failure, even in academia, to appreciate the role of broadly held cultural ideas in deciding man’s fate. At a time when mention of a “clash of civilizations” invites accusations of oppressive Western chauvinism, Hayek’s acknowledgement that conflicting fundamental ideas may lead to actual conflict is a welcome reminder.

Much of the chapter appropriately looks to fundamental ideology as the cause for the rise of Nazism, seeing the rejection of individualism in favor of collectivism as a necessary prerequisite to the “National-Socialist revolution” and a “decisive step in the destruction of that civilisation which modern man had built up from the age of the Renaissance.” The spirit of this argument is undoubtedly sound. However, the method by which he proceeds to argue it leaves much to be desired. Hayek proceeds down a path of questionable historical interpretations, a half-cocked swipe at moral philosophy (that, as we shall see, is flawed but not unfamiliar to readers of this site), and ultimately an incomplete defense of the liberal policies he hopes to defend—showing the consequences of that brief glimpse of skepticism we witnessed in the introduction.

In his historical contextualization of the trends he observes, Hayek writes,

“How sharp a break not only with the recent past but with the whole evolution of Western civilisation the modern trend towards socialism means, becomes clear if we consider it not merely against the background of the nineteenth century, but in a longer historical perspective. We are rapidly abandoning not the views merely of Cobden and Bright, of Adam Smith and Hume, or even of Locke and Milton… Not merely the nineteenth- and eighteenth-century liberalism, but the basic individualism inherited by us from Erasmus and Montaigne, from Cicero and Tacitus, Pericles and Thucydides is progressively relinquished” (10).

Hayek’s invocation of these great names in the history of liberal thought is, in most instances, not misplaced. It is true that all emerged from Western civilization and that to varying extents they all fit well into the liberal, individualist tradition he means to illustrate. One would be wise to regard the inclusion of Hume and Montaigne, paragons of skepticism, as only conditional points on such a list, though Hayek’s own skepticism and that of many libertarians in his tradition would certainly allow them.

More broadly, however, it must be said that the individuals mentioned, no matter how great their contributions to political and social thought, were not often the rule in their place and time, but the exception. One can admire the works of Pericles, but should bear in mind the fickle reception he received among the Athenians. Likewise, Cicero may deserve praise above any in his time, but for those very virtues he was slaughtered without trial by a dictator who faced no consequences.

Thus, as admirable as Hayek’s examples may be, to suggest that they were the norm throughout most of Western civilization is unsubstantiated. They may have embodied those qualities that most distinguished Western civilization and have been most responsible for its progress, but it was a progress often achieved by much-abused minorities. The Renaissance, Enlightenment, and nineteenth century were the high-points of individualism and Western ideals, and Hayek is right in singling them out. However, he also runs the risk of obscuring the philosophical roots of National Socialism, itself the product of contrary trends in Western thought, by engaging in careless generalization from those high-points and distinguished individuals to Western history in general.

Departing from this somewhat problematic historical interpretation, Hayek moves through a favorable discussion of the effects of economic and political freedom on scientific innovation. His recognition and argument that “[w]herever the barriers to the free exercise of human ingenuity were removed man became rapidly able to satisfy ever-widening ranges of desire” is incontestable (12). He also anticipates the common objection of socialist apologists today who characterize the Industrial Revolution as a period of oppression by citing the difficult living conditions of the urban poor. He rightly rejects this by contextualizing the period in the experiences and expectations of those who lived through it, writing that

“[w]e cannot do justice to this astonishing growth if we measure it by our present standards, which themselves result from this growth and now make many defects obvious. To appreciate what it meant to those who took part in it we must measure it by the hopes and wishes men held when it began… that by the beginning of the twentieth century the working man in the Western world had reached a degree of material comfort, security, and personal independence which a hundred years before had seemed scarcely possible” (12-13).

What follows from there is where Hayek finds himself on unsteady footing, as he briefly undertakes the task of explaining what ideas diverted man from the individualist course set from the Renaissance through the nineteenth century. Inexplicably, Hayek credits an excess of ambition as responsible for the turn toward socialism. He writes,

“What in the future will probably appear the most significant and far-reaching effect of this success is the new sense of power over their own fate, the belief in the unbounded possibilities of improving their own lot, which the success already achieved created among men. With success grew ambition—and man had every right to be ambitious” (13).

He returns to the idea again later, writing that,

“Because of the growing impatience with the slow advance of liberal policy, the just irritation with those who used liberal phraseology in defence of anti-social privileges, and the boundless ambition seemingly justified by the material improvements already achieved, it came to pass that toward the turn of the century the belief in the basic tenets of liberalism was more and more relinquished” (14-15).

It is here that Hayek’s inadequacy in analyzing philosophical ideas, and perhaps an economist’s bias toward treating matters purely as a function of supply and demand, begins to show. The notion that an inadequate or insufficiently rapid improvement in living standards under capitalism is to blame for the introduction and spread of socialism is baseless. It not only commits the philosophical error of attributing a total change in fundamental beliefs to external conditions, but also ignores the fact that the introduction of socialist policies preceded the slowdown in the improvement of living standards in the Western world—and, furthermore, that the slowdown still wasn’t all that slow, as anyone who looks at world history from 1870 to 1928 will readily observe.

Thus, Hayek’s notion that “ambition” is somehow to blame is irrational. If we accept the notion that capitalism was responsible for man’s improved quality of living, then the only function that ambition should serve in this context is to drive men back toward capitalism and its fundamental values—not toward socialism. To the contrary, it is not an excess of ambition that drove men away from capitalism, but the fact that the philosophical principles that underlie and empower capitalism were not consistently established in the minds of its practitioners in the first place. That is: those who lived under capitalism had not explicitly embraced reason as man’s means of acquiring knowledge, nor rational egoism as his proper ethical system, and thus lacked the fundamentals on which individualism rests. Thus, ultimately, the individualism that Hayek admires was present in the West, but not firmly rooted enough to survive the philosophical revival of Plato in the forms of Kant and Hegel. Undercut by their philosophies, in the face of Marx and Engels the West was a pushover.

Hayek’s invocation of excess ambition as an explanation for socialism shows that though he understands the role of political ideology in man’s fate, his ability to explain how that ideology stems from deeper levels of philosophy is severely lacking. Unfortunately, he does not allow this lack of expertise to stop him from making such baseless speculations as to the roots of socialism being in man’s ambition, nor from making a similarly arbitrary and more dangerous conjecture: that the essential quality that animated the Renaissance and Western civilization’s embrace of individual man was “tolerance.”

“Tolerance,” he writes, “is, perhaps, the only word which still preserves the full meaning of the principle which during the whole of this period was in the ascendant and which only in recent times has again been in decline, to disappear completely with the rise of the totalitarian state” (3). Hayek offers no further explanation to support this statement or the implication that tolerance was the animating virtue of these times, or at the very least played some crucial role in it. Nor does he illustrate the point with citations or examples. The claim stands alone.

We are thus left to speculate as to his actual beliefs on this point. However, a look at a somewhat younger contemporary libertarian economist who dabbled in political writings such as this and who shares certain philosophical fundamentals—namely a skepticist epistemology—may shed some light on the claim. Milton Friedman similarly cited ‘tolerance’ and, more specific to Friedman’s case, “tolerance based on humility” as the fundamental basis of his libertarianism. That is: the rejection of statism based not on the rights of individuals but based on the fact that no one can rightly initiate force against another since the initiator has no basis by which to know whether the cause in whose name he would initiate that force is right or wrong. Put simply, it establishes a social system in which peaceable relations between men depend upon the impossibility of establishing objective principles. In which ignorance, not knowledge, is man’s saving grace. In which moral certainty is perceived to be the root of all tyranny.

(I will not go further into Friedman’s confused moral philosophy here, though the reader is encouraged to reference my article “The Failures of Milton Friedman” for a fuller explanation of his views and the dangers they entail.)

Whether Hayek’s implication in citing “tolerance” as the great virtue lost to the rise of collectivism is in line with Friedman’s connection of “tolerance” and libertarianism is unknown. But the fact that the two men share a skepticist epistemology and ultimately land on the same word to describe the virtue they see animating their ideals cannot be ignored, and it provides a possible explanation for Hayek’s unsupported statement.

Where skepticist epistemology and haphazard forays into moral philosophy are found, an incomplete defense of freedom usually follows. So it is here with Hayek, who shows us precisely his conception of freedom and how it should be fought for, writing, “There is nothing in the basic principles of liberalism to make it a stationary creed, there are no hard and fast rules fixed once and for all. The fundamental principle that in the ordering of our affairs we should make as much use as possible of the spontaneous forces of society, and resort as little as possible to coercion, is capable of an infinite variety of applications” (13).

I will not engage with this statement directly, as it has been soundly argued elsewhere in other essays from this publication such as “The Philosophy of Capitalism” and Brian Underwood’s “Political Capitalism”, as well as in Ayn Rand’s essays “Man’s Rights”, “The Objectivist Ethics”, and “The Nature of Government.” I will observe simply that for a man accepted by many to be symbolic of twentieth century liberalism to take such a pragmatic, unprincipled approach to the defense of freedom stands as much as a symbol of the unsteadiness and lack of a moral basis in that movement as it does a condemnation of the man himself. What’s more, it shows that no sound defense of liberty can be based on a skepticist epistemology. A defense of man begins with an admiration for man and his nature as a rational, efficacious being. Whoever hopes to undertake a task so daunting and so crucial as a defense of man’s rights against oppression cannot enter the fray with a puttering “Who knows?!” as his battle cry.

It is the inevitable fate of such pragmatists that they should ultimately abandon a strict conception of liberty and that they should shrink principles down to the level of momentarily expedient guidelines to be cast aside at the first sign of opposition. We must be immensely grateful that the Founding Fathers of the United States had the moral basis to recognize and firmly assert the rights of “life, liberty, and the pursuit of happiness”, yoking future statesmen to these principles rather than settling for such a shrugging recommendation that they “make as much use as possible of the spontaneous forces of society.” We must be proud that Jefferson swore “an oath upon the altar of God eternal hostility against every form of tyranny over the mind of man”, and not merely an oath to “resort as little as possible to coercion.”

The distortions, sadly, do not end there. Hayek confounds our expectations further by seeking to balance his critique of socialism with a contrary charge against advocates of full individual rights, writing that “[p]robably nothing has done so much harm to the liberal cause as the wooden insistence of some liberals on certain rough rules of thumb, above all the principle of laissez faire” [emphasis mine] (13).

Hayek’s ambiguous accusation against advocates of laissez-faire, that they are somehow partly responsible for the rise of socialist policies, apparently rests on the capitalists having viewed the principle as a “hard and fast… rule which knew no exceptions” (13).  He goes on to explain that the downfall of liberalism is explainable by reference to the liberal’s strict adherence to the laissez-faire principle, finding it “inevitable that, once their position was penetrated at some points, it should soon collapse as a whole” (13).

At this point, Hayek quickly reveals several key implications: that advocates of laissez-faire are partly responsible for the rise of socialism, that laissez-faire is a flawed system, and that its legitimacy has indeed “collapse[d]” through being disproven. He continues, “No sensible person should have doubted that the crude rules in which the principles of economic policy of the nineteenth century were expressed were only a beginning, that we had yet much to learn, and that there were still immense possibilities of advancement on the lines on which we had moved” (14).

To be clear: Hayek is not referring to changes in application or translation of the existing principles, but a shift in principles as such. ‘What’, one must ask, ‘could have fundamentally changed so drastically in the period in question, to make the basic principles of economic freedom no longer relevant or applicable in one period as they had been in the previous one?’ According to Hayek, it was the inevitable result of having

“gained increasing intellectual mastery of the forces of which we had to make use. There were many obvious tasks, such as our handling of the monetary system, and the prevention or control of monopoly, and an even greater number of less obvious but hardly less important tasks to be undertaken in other fields, where there could be no doubt that the governments possessed enormous powers for good and evil;” (14)

Thus, Hayek posits that our “increasing intellectual mastery” (though I can think of a century of economic instability, brought on primarily by government controls, that would refute this alleged “mastery”) is to credit for government intervention in the economy. He implies that the belief that governments could regulate the economy by force somehow translates into the presumption that they should do so—a significant leap that Hayek does not and cannot, without reference to philosophy, explain. Not only does this misconceive the problem; it carelessly implies that the statesmen of earlier times did not intervene in the economy because they could not conceive of how to do so. To the contrary: earlier liberal thinkers did not plead ignorance in the face of proposed interventionism—they opposed it on principle, and to suggest otherwise is a discredit to their defenses of liberty.

Hayek’s passing statements apparently endorsing the “control of monopoly” and his suggestion that “the governments possessed enormous powers for good and evil”—that is, that good could be achieved by force just as surely as evil—only add layers to the disappointing picture established thus far. He goes on to make an unconvincing argument that the slow pace of economic progress under liberalism was to blame for people having turned away from it—a confounding claim to make about a century that witnessed the most rapid and dramatic rise in quality of life in the history of humankind, and one that even Marx himself would likely have disputed as unsubstantiated.

Finally, he ends the chapter on an agreeable note with a brief description of how the geographical flow of ideas—from Britain and the US east to continental Europe—reversed at this period in history and the prevailing current turned westward, exporting German socialist ideas to the Atlantic. He astutely summarizes how the ideas of Marx, Hegel, List, Schmoller, Sombart, and Mannheim overtook the intellectual tone set by the English after 1870. He ends on the essential point that it was ultimately the lack of confidence in their own convictions by Western thinkers that made this shift possible. In this effort—narrating the history of philosophical and cultural trade balances—Hayek is excellent and displays the power of which he is capable when he remains in his purview, capitalizing on his unique perspective.

After a promising introduction, the first chapter of Hayek’s book has proven shaky at best. The flaws are numerous and fatal: a questionable interpretation of the histories of both liberalism’s origins and socialism’s ascendance, a dangerously inadequate grasp of the role of moral philosophy in the histories he details, a desire to blame liberalism for its own destruction with insufficient substantiation, a skepticist rejection of principles that leads to a pragmatist’s approach to policy, and, finally, a rejection of laissez-faire capitalism.

To his credit, Hayek is overall favorable on matters of economic history, arguing effectively for the role of capitalism in promoting scientific progress and advances in standards of living. However, his suggestion that advancement in the nineteenth and early twentieth centuries was slow, and that this slowness of progress is to blame for the West’s acceptance of socialism, is largely without a supporting argument, is contrary to the unrivaled history of economic progress that we know to have characterized that period, and, incidentally, indulges a determinist philosophy that, in the introduction, he seemed likely to avoid—a serious point of inconsistency.

Overall, Hayek’s first chapter is a dramatic step down from the introduction and a disappointment considering the book’s reputation. It is, in its own way, an abandonment of the road, if in a slightly different direction from those whom Hayek criticizes. Though future chapters may redeem the work to some extent, the fact that so much ground is lost in the first few pages is a severe blow, and one in keeping with the suspicions we noted in assessing the introduction and against which we warned readers to stay on guard. It illustrates well the consequences of even small cracks in one’s intellectual foundation and confirms the value of applying careful philosophical detective work in reading works such as this, no matter their reputation.

Thoughts on ‘The Road to Serfdom’: Introduction

In America, Arts & Letters, Austrian Economics, Book Reviews, Books, Britain, Economics, Historicism, History, Humane Economy, Humanities, Liberalism, Libertarianism, Literature, Philosophy, Politics, Western Civilization, Western Philosophy on September 9, 2013 at 7:45 am

Slade Mendenhall


This piece commences a series of analyses of Friedrich Hayek’s The Road to Serfdom. For those unfamiliar with the work, first published in 1944, it details the famed Austrian economist’s observations, drawn first from having lived in Austria in the years after World War I, where he witnessed firsthand the culture of political ideas that preceded and led to the rise of Nazism, and then, some decades later, from living in England, teaching at the London School of Economics, and observing similar ideas at work in English political culture at the onset of her own period of experimentation with socialism.

Britain was, at the time, feeling the onset of what would become a set of devastating postwar economic ailments: the loss of many overseas assets, sold off one by one to finance the war; severe physical destruction (though not as bad as on the Continent); a trade imbalance that sent the prices of much-needed American goods skyrocketing; and an economy of permits and privation in basic commodities. The end of the war would bring the sweeping 1945 victory of Labour and greater troubles with the onset of the Brain Drain, a period of bitter class resentment, and nationalizations of industry. Shortly after the second edition of The Road to Serfdom was printed in 1946, England was facing strikes, falling exports, and losses of almost £200m every week when dollar convertibility was introduced in 1947.

In the midst of it all was a growing culture of socialism in both major parties. As Hayek wrote, “the socialism of which we speak is not a party matter, and the questions which we are discussing have little to do with the questions at dispute between political parties” (3). Though Labour would be its more avowed exponent, the fundamentals of socialist ideology were well enough embedded so as not to be challenged at any basic moral or systematic level by either side. What’s more, many Britons would come to see this as a proud new political and economic identity for a Britain without an empire. Historian Norman Stone writes,

“the British were pleased with themselves, supposing also that their example was one to be widely followed as some sort of ‘third way’ between American capitalism and Soviet Communism… combining the ‘economic democracy’ of Communism and the ‘political democracy’ of the West: socialism without labour camps…. People who argued to the contrary [such as Hayek—ed.] were in a small minority… but even in the later 1940s these supposedly half-demented figures were starting to have reality on their side. It struck with a ferocious blow, in the second post-war winter. The money began to run out, and the government became quite badly divided as to priorities.”

It is easy to imagine how remorsefully vindicated Hayek must have felt in those first few years after the publication of The Road to Serfdom—affirmed and disappointed in the way that all those who warn of impending danger are wont to feel.

Though the book would be praised by proponents of liberalism from the time of its publication to the present and cause a stir among his peers in academia, policymakers would be, as they ever are, roughly a generation late in feeling the aftershocks of this groundbreaking statement. By the time its message began its creep into the political lexicon, Hayek had moved on from the LSE, going on to teach at the University of Chicago (in its Committee on Social Thought, as the School of Economics vehemently opposed his hiring under their banner), the University of Freiburg, the University of California, and the University of Salzburg, where in 1974 he was awarded the Nobel Prize in Economics.

Since the onset of the 2007 recession, sales of The Road to Serfdom, along with other works that challenge the fabric and assumptions of modern Western philosophy, political culture, and economics such as Ayn Rand’s Atlas Shrugged, have skyrocketed. In 2010, 66 years after its publication, The Road to Serfdom became a #1 bestseller on Amazon.

As this and other such works grow in popularity, it is important to take a second look at them, assessing both their virtues and faults, their accomplishments and their shortcomings. The analysis that follows sets out to do just that. It is an overall favorable assessment, as this author agrees with many of Hayek’s basic political premises. However, for that reason, it will also more scrupulously critique and highlight perceived flaws, ambiguous wording, platitudes, and those floating abstractions common in political treatises that, though they seem plausible at first glance, prove deeply flawed when translated into concrete practice. Though these analyses will strive to give an adequate overall summary of what Hayek himself writes, the reader is encouraged to read Hayek’s words along with these critiques and to judge for himself their validity.

It is broadly understood that those concerned with the cause of liberty must be vigilant in our criticisms of its destroyers, but it is no less essential—if not more so—that we be judicious toward those authors and works on which we base our own beliefs, as every philosophy is a structure and every flaw in that structure a weakness. The closer our faults are to our foundations, the greater our vulnerability. As more and more libertarians and capitalists turn to works such as Hayek’s to form understandings and shape their beliefs, let us look carefully to what ideas we are resting upon. We have nothing to lose but our contradictions.

Note on citations: all page references, unless otherwise stated, are based on the February 1946 edition published by George Routledge & Sons LTD.

Introduction

Hayek’s introduction effectively sets the tone for the rest of the work by illustrating his own unique perspective, having come “as near as possible to twice living through the same period—or at least twice watching a very similar evolution of ideas,” (1) then giving us a brief summary of what wisdom that twice-lived experience has offered him: an understanding of the linkages between the spread of socialist ideas, the various debates it engenders in countries operating on similar philosophical premises, and the eventual rise of dictatorship.

The summary of events transpiring in the half-century leading up to World War II that Hayek describes is perhaps most powerful and most distinctive for its recognition of the role of ideas in man’s life. Hayek superbly recognizes the consequential nature of ideas in human life, writing “If in the long run we are the makers of our own fate, in the short run we are the captives of the ideas we have created. Only if we recognise the danger in time can we hope to avoid it” (2).

In this short passage, just a few paragraphs in, Hayek has already distinguished himself from the long and destructive philosophical and political tradition of determinism. More subtly and implicitly, by tracing the connection between man’s ideas and his actions, he has also rejected the mind-body dichotomy, which has long divided philosophers and intellectuals into two camps: those concerned with the workings of man’s mind, who dismiss his physical actions as inconsequential marginalia, and those concerned with man’s physical nature, who view the content of his mind as meaningless.

These abstract philosophical notes are crucial, allowing us to establish several inferences as to what misguided political camps and ideologies Hayek will successfully avoid being mired in. By denying the metaphysical premise of determinism (whether in its environmental or genetic forms), Hayek embraces the concept of free will and the essential premise that ideas matter, inviting us to commence his work with the presumption that what wisdom we glean from it individually might be actionable and applicable in our own lives and experiences. This quickly separates him from the philosophical premises of the Left (or, to indulge a common but unbearably ironic label, “progressivism”), whose policies largely rest upon some variant of determinist metaphysics, leading them perpetually to the conclusion that man, left to his own free will, is doomed to irrationality, but that the ideal society is achievable through the right amount of systematic tweaking and statist controls. It already begins to become clear what premises lead Hayek to become the symbol of liberalism he is today.

In embracing the importance of the mind and the function of ideas, however, he does not assume a mysticist rejection of reality. To the contrary, he presents to us the implicit proposition that the “ideas we have created” will have very real consequences, and that to change our fates we must scrutinize and perhaps alter our ideas and those of our culture. It rests on the recognition that man is not immune from his own illogic and that, to paraphrase Rand, while the practice of reason may be evaded, the consequences of evading reason cannot be. This acknowledgment separates him from the premises that underlie much of conservative political thought, also concerned with the perfection of man, but oriented toward controlling his thoughts and beliefs, viewing the force of government as a means of instilling values in the minds of its people to produce a more moral citizenry.

Hayek’s Road to Serfdom is a warning, and all warnings are fundamentally rejections of the determinist premise.  What’s more: it is an intellectual warning connecting certain ideas and beliefs to their metaphysical consequences. While common logic, particularly among those who recognize the practical benefits of liberty, would suggest that that which one values should be left free to flourish, to the contrary, both progressives and conservatives seek to control those aspects of man which they most value—progressives, man’s body; conservatives, man’s mind—relegating its opposite to a status of expendability.

If all philosophy can be thought of as the great duel between two men—Plato and Aristotle—both sides of the political spectrum in Hayek’s time, as in our own, operate on a fundamentally Platonic premise that divides man’s physical and spiritual nature. True liberalism is fundamentally a departure from this view in favor of the Aristotelian view of man as a unified entity, to be treated and thought of as such, his life and fate his own, and his right to dispose of them as he sees fit unchallenged. Thus Hayek, as an exponent of such liberalism, whether or not he recognizes and describes it as such himself, begins with this philosophical framework. Whether he maintains it in the chapters to come is a separate question, but his grounding is thus far solid.

Wasting no time, Hayek soon enters the fundamental comparison of his book: that of the ideological roots of Nazism and the rise of socialist thought in Britain precisely at a time when the two nations are at war.

Much equivocating in classrooms, editorial pages, and student coffee shops has transpired in the last seventy-plus years as to the differences between Nazism and true socialism, with socialist apologists quibbling over how the Nazis abused what was, in socialism, a noble ideal. Most engage in such momentous evasions and distortions as to treat socialism and fascism as somehow opposites, portraying what is in fact a genus-type distinction as a fundamental enmity, when the two are merely different applications of the same basic premises.

Hayek tolerates none of this, observing,

“Few are ready to recognize that the rise of Fascism and Nazism was not a reaction against the socialist trends of the preceding period, but a necessary outcome of those tendencies… As a result, many who think themselves infinitely superior to the aberrations of Nazism and sincerely hate all its manifestations, work at the same time for ideals whose realization would lead straight to the abhorred tyranny” (3).

Indeed, one cannot help but feel that little has yet changed in Western intellectualism when Hayek describes the parallels between Germany after World War I and England during World War II: “There is the same contempt for nineteenth-century liberalism, the same spurious ‘realism’, and even cynicism, the same fatalistic acceptance of ‘inevitable trends’… It does not affect our problem that some groups may want less socialism than others, that some want socialism mainly in the interest of one group and others in that of another. The important point is that, if we take the people whose views influence developments, they are now in this country in some measure all socialists” (2-3).

More familiarity ensues when Hayek notes how Germany was once held in England and other Western countries as an ideal to be pursued and how that idealized conception has since been transferred elsewhere: “Although one does not like to be reminded, it is not so many years since the socialist policy of [Germany] was generally held up by progressives as an example to be imitated, just as in more recent years Sweden has been the model country to which progressive eyes were directed” (2). One so often sees the case of Swedish socialism invoked as a statist ideal in today’s world, since the recession of 2008, but it is often forgotten how old this example is—mentioned here by Hayek in the 1940s, discredited for its proclaimed cultural superiority by Ayn Rand in the 1960s, but still going strong as part of statist mythology today.

In support of his parallel, Hayek rightly declines to tie the broader abstraction of ‘fascism’ to the superficial concrete details of German National Socialism, to which it is so unproductively and irrationally married in the minds of most who refer to and write of it. More than any other ideology, fascism bears a name whose pejorative quality has overcome its literal meaning and distorted the popular understanding of it to such an extent that most today will readily proclaim that they reject it, but remain utterly incapable of defining it. Modern dictionaries and encyclopedias are similarly unhelpful, as much victims of the disintegrated epistemology of their times as those who reference them.

(This is not the place to go into a fuller explanation of the meaning of fascism, but those interested would do well to refer to my previous essay on the subject, “Understanding Fascism”.)

Thus, Hayek’s understanding of National Socialism admits no deterministic German racial explanations; he recognizes both the influence of German fascist thought on the English and the early role played by Thomas Carlyle and Houston Stewart Chamberlain, a Scot and an Englishman, in the formation of fascist ideas.

A cautious approach is wise here, as while no racial explanation to the effect that some innate German-ness led to National Socialism can be held as rational, the role of culture and philosophy in German society is indispensable to understanding its rise. Hayek goes on to write, “It would be a mistake to believe that the specific German rather than the socialist element produced totalitarianism. It was the prevalence of socialist views and not Prussianism that Germany had in common with Italy and Russia—and it was from the masses and not from the classes steeped in the Prussian tradition, and favored by it, that National-Socialism arose” (7).

True as much of that is, to say that “the socialist element produced totalitarianism” is perhaps only to scratch the surface, acknowledging that one political idea was connected to another. It does not explain why the socialist element was accepted in the first place. For that, one must look to German culture. To that end, Leonard Peikoff’s The Ominous Parallels offers an incomparable philosophical genealogy of Nazism that would serve as a necessary complement to Hayek’s work, assuming Hayek continues down the path he is setting out here.

Perhaps the most detrimental statement in Hayek’s introduction is said rather in passing. After having written that “by moving from one country to another, one may sometimes watch similar phases of intellectual development… They suggest, if not the necessity, at least the probability, that developments will take a similar course” (1), “some of the forces which have destroyed freedom in Germany are also at work here” (2), and “our chance of averting a similar fate depends on our facing the danger and on our being prepared to revise even our most cherished hopes and ambitions if they should prove to be the source of the danger” (2-3), Hayek betrays the premise upon which he has built up his whole work by conceding, “All parallels between developments in different countries are, of course, deceptive; but I am not basing my argument mainly on such parallels” (3).

Certainly it must be admitted that parallels between such developments are not deterministic, not without mitigating factors, and not immune to changes in trajectory. But to suggest that they “are, of course, deceptive” is perilously close to a skepticist rejection of the principle of causality and of Hayek’s own earlier recognition of the role of ideas. Hayek would do well to apply to politics the same social-scientific rigor he applies to economics: just as the effects of supply and demand on prices are assessed by holding certain variables constant, so the effect of ideas presumes a measure of ceteris paribus; but this does not negate the principle demonstrated by such models or demand of the author some token measure of self-doubt.

In all, Hayek’s introduction is strong and offers much to think about, hope for, and consider as we proceed into his analyses. His overall support for the importance of ideas, his propensity (if a somewhat unconfident one) toward conceptual integration and a comparative approach to political ideologies, and his positive view of individual man and political freedom make for a promising start. Hayek even provides sound reasoning for why England should be interested in engaging in such self-critical analysis, arguing,

“[T]his will enable us to understand our enemy and the issue at stake between us. It cannot be denied that there is yet little recognition of the positive ideals for which we are fighting. We know that we are fighting for freedom to shape our life according to our own ideas. That is a great deal, but not enough. It is not enough to give us the firm beliefs which we need to resist an enemy who uses propaganda as one of his main weapons not only in the most blatant but also in the most subtle forms. It is still more insufficient when we have to counter this propaganda among the people under his control and elsewhere, where the effect of this propaganda will not disappear with the defeat of the Axis powers… It is a lamentable fact that the English in their dealings with the dictators before the war, not less than in their attempts at propaganda and in the discussion of their war aims, have shown an inner insecurity and uncertainty of aim which can be explained only by confusion about their own ideals and the nature of the differences which separated them from the enemy. We have been misled as much because we have refused to believe that the enemy was sincere in the profession of some beliefs we shared as because we believed in the sincerity of some of his other claims” (4).

Likewise, we begin to see his potential faults: a propensity to begin at the level of politics without looking more deeply toward philosophical and cultural ideas, and a creeping skepticism that may lead him to an unconfident defense of his comparative approach and, thus, the warning he seeks to achieve with it. Whether these virtues and potential faults continue, only time and further reading will reveal, but as for the introduction, Hayek hits all of his marks: providing context, provoking questions and challenges, establishing a conceptual framework, and enticing our curiosity. A solid start to a modern defense of classical liberalism.

10 Things to Know About Mortgages

In Law, Legal Education & Pedagogy, Property on September 4, 2013 at 7:45 am


1. A mortgagee (the lender) obtains a security interest (a note) from the mortgagor (the borrower), who mortgages the asset (the house) to secure the loan for the mortgagee.

2. The note is a security that, under the Uniform Commercial Code (UCC), is a negotiable instrument that may be bought and sold.

3. The mortgage is recorded in the land records, thus allowing the mortgagee, in the event of the default of the mortgagor, to enforce the security on the note by foreclosing the rights of the mortgagor to the mortgaged property.

4. In most cases, a defaulting mortgagor will have the opportunity to redeem the property after the foreclosure of its rights, although a cloud will remain on the title to the property. A foreclosure is the elimination of the equitable right of redemption to take both legal and equitable title to the property in fee simple. When a mortgagor defaults by, say, failing to make payments, the mortgagee (or its assigns) files a lien on the property and can eventually eliminate the aforementioned rights and exercise a power of sale. The power of sale is also known as a non-judicial foreclosure and is authorized and provided for by the mortgage itself or the deed of trust. In a “power of sale” foreclosure, if the debtor does not cure his or her default or file bankruptcy to stop the sale, the mortgagee will conduct a public auction much like a sheriff’s auction, and the mortgagee can itself bid on the property; unlike other bidders, the mortgagee can bid on credit.

5. Usually a mortgagee transfers the note to a special purpose vehicle (SPV) that pools the note with other such notes, all of which are sold/assigned (in the form of a “security”) on what’s called the “secondary mortgage market.” Proceeds generated by the assigned security are paid back to the original mortgagee to “compensate” the mortgagee for selling a “security” in the first place.

6. The mortgagee collects payments from the mortgagor (i.e., the monthly mortgage payments) and passes them along to the investors who have an assigned interest (i.e., a “security”) in the note.

7. In theory, the buying and selling of notes on the secondary mortgage market lowers the interest rates of all mortgagors and gives investors a low-risk “security” to invest in so that all parties are better off. The investors hold only an assignment of the note (i.e., a “security”), not the actual mortgage recorded in the land records. Therefore, they have an interest in the mortgage by way of the assigned note, but as assignees, they are protected from certain liabilities that could befall the originating mortgagee and are free from requirements such as recording their interests in the land records. Mortgagees like investors (who, again, enter into the process by way of SPVs) because they essentially pay off the risks associated with being the originating interest and then take on that interest with little risk involved.

8. The process of pooling notes and issuing securities to investors (thereby lowering interest rates over time) is called “securitization.” The investors, through SPVs, pay off the original mortgagees and thereby relieve the mortgagees of their burden of risk in being the lender to the mortgagor. In return, the investors gain an interest in the mortgage and, as assignees, do not have the risk that the mortgagee would have had in its relationship with the mortgagor. Therefore, the mortgagee and the investors have entered into a mutually beneficial relationship whereby each offsets the risks of the other. All of this happens without any effect on the mortgagor and his/her mortgage payments, except possibly to the extent that the mortgagor may, if anything, benefit from lower interest rates. However, mortgagors may not understand that SPVs and investors have been assigned the rights enjoyed by the mortgagee (namely, the right to foreclose), although the mortgagees should have apprised mortgagors that the security in the note has passed along to other parties.

9. The investors become what are called “trustees” that hold the securities (i.e., the assignments of the rights to the note). The rights and obligations of the mortgagees, SPVs, and trustees/investors are spelled out in a “Pooling and Servicing Agreement,” which includes provisions about allowing the SPVs or trustees/investors to foreclose on the mortgagor’s rights by virtue of the chain of interest that extended from the mortgagee’s original transfer to the SPV to the investors (i.e., the “trustees”).

10. This whole process has created difficulties with recording, which, although not required of assignees, often had to take place in order to have a record tracking the passing of interest from the SPV to the investor/trustees and potentially to other investor/trustees, etc. Accordingly, companies came into being that handled the transfer process and maintained electronic databases for investors and the mortgage industry generally to track the passing of interests in notes/securities. These companies are called “nominees.” When drafting their original note, the mortgagor and mortgagee designate a company to serve as a “nominee” for the mortgagee’s successor and assigns. The nominee does not own or fund the mortgage loan, but instead tracks the transfer of interest in the loan and becomes the mortgagee of record. A nominee makes its money by collecting on membership fees that are required of anyone wanting access to the database. There is an overlap between nominees and mortgagees because nominees serve the function and do the work of the mortgagee. In fact, banks and mortgage companies hire and certify people who can work with and for nominees.

Why the Union Soldiers Fought

In America, American History, Arts & Letters, Book Reviews, Books, Historicism, History, Humanities, Nineteenth-Century America, Politics, Southern History, The South on August 28, 2013 at 8:45 am

This review originally appeared here in The University Bookman.

Allen Mendenhall

Nearly every Southerner was raised studying the Civil War, or, as some here call it, the War Between the States. By the time I entered the public school system in Marietta, Georgia, in the 1980s, the War had long been a cornerstone of the curriculum, although Lost Cause mythology had dissipated and the Confederacy was hardly treated with tones of admiration. It became clear, however, that the War was more complicated than my teachers let on, that the events leading to and following this great conflict represented more than a morality play between competing forces of good and evil. There was, for example, the case of the Roswell Mill. Decades and decades ago, at this mill, the wives, mothers, sisters, daughters, and young sons of Confederate soldiers labored while the soldiers were off at war. One day Sherman’s Army showed up at the mill and absconded with the women and children. When the Confederate soldiers returned home, their women and children were gone. No one knows exactly what happened to the women and children of the mill, which is why they are still, to this day, called “The Lost Women and Children of Roswell.”

Recent trends in scholarship about the War have been uncritical in their assessments (or lack of assessments) of Union ideology as a contributing factor to the War. Gary Gallagher’s recent The Union War, a companion to his earlier book The Confederate War (Harvard University Press, 1997), corrects this trend.

This book is a restorative history, and a timely one at that. The year 2011 marks the 150th anniversary of the War, and for the last four decades, Gallagher notes, scholarship on the War has neglected to emphasize the ideology of Unionism.

Unionism is central to any understanding of the War. As Gallagher explains, “[T]he focus on emancipation and race sometimes suggests the War had scant meaning apart from these issues—and especially that Union victory had little or no value without emancipation.” Although Union soldiers may have understood that issues related to slavery precipitated fighting in 1861, for them that is not what the war was “about.” Gallagher adds that a “portrait of the nation that is dominated by racism, exclusion, and oppression obscures more than it reveals,” not least of all because it ignores the vast influx of immigrants and the relative receptivity toward different cultures that Americans championed to varying degrees, even at that time.

Gallagher’s goal in this book is to disabuse readers of the notion that the War was, for the typical Union citizen-soldier, “about slavery.” The book asks three fundamental questions: “What did the war for Union mean in mid-nineteenth century America? How and why did emancipation come to be part of the war for Union? How did armies of citizen-soldiers figure in conceptions of the war, the process of emancipation, and the shaping of national sentiment?” In answering these questions, Gallagher’s focus is on “one part of the population in the United States—citizens in the free states and four loyal slaveholding states who opposed secession and supported a war to restore the Union.” Gallagher concludes that the War was, for the aforementioned citizens, one for Union, and that it only happened to bring about the emancipation of slaves. Emancipation was never the goal; it was a result.

“From the perspective of loyal Americans,” Gallagher explains, “their republic stood as the only hope for democracy in a western world that had fallen more deeply into the stifling embrace of oligarchy since the failed European revolutions of the 1840s.” According to this reading, Southern slaveholders of the planter classes represented the aristocracy that was responsible for the creation of the Confederacy. The Southern elite seemed like a throwback to monarchy. Citizen-soldiers of the Union Army believed that by taking on the Confederacy, they were restoring democratic principles and preserving the “Union,” a term that contemporary readers who lack historical perspective will have trouble understanding. Miseducated by Hollywood fantasies and adorations—consider the films Glory and Gettysburg—the average American today has lost all constructive sense of Unionism as it was understood by mid-nineteenth-century Americans, especially in the North.

In five short chapters totaling 162 pages—notes excluded—Gallagher repeatedly identifies problems in the recent historical record, and then reworks and revises those problems, improving the record. He criticizes the tunnel-vision of scholars who write about The Grand Review as an exercise in racial exclusion, for instance, and he suggests that instead nineteenth-century descriptions of this procession indicate that “Unionism” meant something like “nation” and “America,” signifiers that stood in contradistinction to oligarchy and that were only tangentially related to racial ideology. By systematically picking apart various histories while summarizing and synthesizing a wealth of recent scholarship, Gallagher has produced what could be called a prolonged bibliographical or historiographical essay with extended asides about what is wrong in his field.

What is wrong, he suggests, is imposing contemporary preoccupations with race onto the mindsets of nineteenth-century Americans. Against this tendency, Gallagher reminds us of forgotten facts—for instance, that the passage of the Thirteenth Amendment had more to do with political unity than racial enlightenment, or that, over the course of the War, concerted military action by ordinary individuals (not the acts of rebel slaves, Abraham Lincoln, or congressmen) determined which black populations in the South became free. Gallagher interrogates the difference between Lincoln the “Savior of the Union” and Lincoln “The Great Emancipator.” He supports the study of military history, which other academics have scorned. All of this plays into Gallagher’s claim that although “almost all white northerners would have responded in prejudiced terms if asked about African Americans, they were not consumed with race as much of the recent literature would suggest.”

The take-home point from this book is that devotion to Union had greater currency for most Americans than did any contemporary understanding of a commitment to race. “Recapturing how the concept of Union resonated and reverberated throughout the loyal states in the Civil War era,” Gallagher submits, “is critical to grasping northern motivation.” This motivation was rooted in the belief that Union would preserve rather than jeopardize liberty, and had little to do with slavery, except insofar as an important side result was liberty for all.

Gallagher has reminded us of the importance of Unionism to the War and to the psychology of the average Northerner. He has reminded us that race was hardly a chief concern to the typical Northern soldier, and that retrospective imposition of our concerns onto theirs is poor scholarship and bad history.

What was Gomillion v. Lightfoot?

In America, American History, Arts & Letters, History, Humanities, Law, Politics, Southern History, The South on August 21, 2013 at 8:45 am

Allen Mendenhall

This piece originally appeared here in the Encyclopedia of Alabama.

In Gomillion v. Lightfoot, the U.S. Supreme Court ruled in 1960 that Tuskegee city officials had redrawn the city’s boundaries unconstitutionally to ensure the election of white candidates in the city’s political races. The case was one of several events that laid the foundation for the 1965 Voting Rights Act, which prohibited discriminatory voting practices. The case was named for the lead plaintiff, Tuskegee Normal and Industrial Institute (present-day Tuskegee University) professor Charles A. Gomillion, and the lead defendant, Tuskegee mayor Philip M. Lightfoot, who was sued along with other city officials.

Gomillion, dean of students and chair of the social sciences division at Tuskegee, for years had facilitated voter registration movements for blacks in Tuskegee. He learned in 1957 that several white citizens were promoting a bill in the state legislature to redefine the boundaries of the city to ensure election victories by whites in 1960. Resisting these efforts and urging others to oppose any referenda meant to disfranchise black voters, Gomillion and other activists appealed to the City Council, wrote to the County Commission, lobbied the state legislature, and published an open letter in the Montgomery Advertiser. Despite these efforts, Local Act No. 140, introduced by Samuel M. Engelhardt Jr., passed in the state legislature in 1957. It reconfigured the boundaries of the city from a simple square shape to a figure with 28 sides, removing from the city Tuskegee Institute and all but four or five of its nearly 400 black voters, while removing not one of its more than 1,300 white residents. Gomillion and the Tuskegee Civic Association treated this initial setback as an opportunity to institute legal proceedings and thereby to mobilize concerted political action.

Gomillion and other petitioners, black citizens of Alabama and residents (or former residents) of Tuskegee, alleged that the act violated the “due process” and “equal protection” clauses of the Fourteenth Amendment to the Constitution. They claimed that the redrawn city boundaries disfranchised black voters; therefore, they alleged, the act had a discriminatory purpose. In fact, the act’s author, Engelhardt, was executive secretary of the White Citizens’ Council of Alabama.

Tuskegee’s white citizens were trying to change the city’s boundaries to head off the rise in African Americans registering to vote. After World War II, local African Americans wanted to play a more active role in the city’s civic life, and whites became more determined to deny them that right. Redrawing the city’s boundaries had the unintended effect of uniting Tuskegee Institute’s African American intellectuals with the less educated African Americans living outside the sphere of the school. Some members of the school’s faculty realized that possessing advanced degrees ultimately provided them no different status among the city’s white establishment.

Initially, the U.S. District Court for the Middle District of Alabama, in Montgomery, headed by Judge Frank M. Johnson, dismissed the case, ruling that the state had the right to draw boundaries, a ruling that was upheld by the Court of Appeals for the Fifth Circuit in New Orleans. The case was appealed before the Supreme Court on October 18 and 19, 1960. Gomillion did not travel to Washington, D.C., with the lawyers handling his side of the case. Veteran Alabama civil rights attorney Fred Gray and Robert L. Carter, lead counsel for the National Association for the Advancement of Colored People (NAACP), argued the case, with assistance from Arthur D. Shores, who provided additional legal counsel. They claimed that the state’s intent in the redistricting had been to discriminate covertly against African Americans.

On November 14, the Supreme Court rendered a unanimous decision in favor of the petitioners. Justice Felix Frankfurter, writing for the majority, held that the act violated the Fifteenth Amendment, which prohibits states from passing laws depriving citizens of the right to vote, and thus reversed the lower courts’ rulings. Frankfurter likewise dismissed the city’s appeal to generalities about state authority. He conceded that states retain extensive powers but held that they may not do whatever they please with municipalities. The case showed that all state powers were subject to limitations imposed by the U.S. Constitution; therefore, states were not insulated from federal judicial review when they jeopardized federally protected rights. In 1961, the results of the decision went into effect; under the direction of Judge Johnson, the gerrymandering was reversed and the original map was reinstituted.

Additional Resources

Elwood, William A. “An Interview with Charles G. Gomillion.” Callaloo 40 (Summer 1989): 576-99.

Gomillion, C. G. “The Negro Voter in the South.” Journal of Negro Education 26(3): 281-86.

Gomillion v. Lightfoot, 364 U.S. 339 (1960).

Norrell, Robert J. Reaping the Whirlwind: The Civil Rights Movement in Tuskegee. New York: Alfred A. Knopf, 1985.

Taper, Bernard. Gomillion versus Lightfoot: The Tuskegee Gerrymander Case. New York: McGraw-Hill, 1962.

Unmasking

In Arts & Letters, Creative Writing, Essays, Humanities, The South, Writing on August 14, 2013 at 8:45 am

This essay first appeared here in Kestrel: A Journal of Literature and Art.

Allen Mendenhall

There is no remembrance of former things; neither shall there be any remembrance of things that are to come with those that shall come after.

                                                                                           Qoheleth 1:11

Southerners are particular about the way they preserve their loved ones; they encourage embalming, for instance, although at one time they shunned it as unconsented-to tampering with the body.  Eventually someone decided, rather wisely, that the deceased, had they a choice, would like a genteel display of their “shell.”  This meant more than sanitization: it meant dressing the dead like ladies or gentlemen on their way to church.  Which is precisely where they were going—just before they were buried in the ground.  For the most part, Southerners don’t cremate.  (A preacher once told me that the Bible discourages cremation.)

In the South—more than in other regions—funerals are hierarchical affairs: one’s nearness to the deceased signifies one’s importance to the family.  This holds for the church and burial service and is especially true if the departed was popular in life.  Being closest to the deceased, pallbearers shoulder the weightiest burden.

Nowhere is decorum more important than at a funeral procession.  It’s unseemly for one who’s not party to the procession to fail to bow his head and arrange a grave face as the procession passes.  If you’re in a vehicle, you pull over to the curb and, so long as it isn’t dangerous to do so, take up the sidewalk as if on foot.  Quitting the vehicle is, in general, inadvisable if by the time you encounter the procession the hearse is no longer in sight.  Or if, alternatively, the weather doesn’t permit.  If you’re in line, the modus operandi is ecclesiastic—ordered from clergy, to immediate kin, to next-of-kin, to distant family, to friends, to the rest.  Losing your place in line is, accordingly, like losing your intimacy with the family, for whom these rituals are carried out.

I was eight when Great-Granddaddy died.  Mom piloted me before his open-casket and whispered, “That’s not Great-Granddaddy.  That’s just a shell.  Great-Granddaddy’s gone to heaven.”

I looked down at the thing, the shell, the facsimile that seemed uncannily human, and said to myself—perhaps out loud—“That’s not Great-Granddaddy.  That’s something else.”  But the thing appeared real, strange, so nearly alive that it repulsed me.  Its eyes, thank God, were closed, but its mannequin face, vacant and plastic, nauseated me.

Mom prodded me away, hollering at my cousin to take me outside.  My first brush with death, while necessary, had not imparted a healthy understanding of mortality.

My grandmother, Nina, tried to familiarize me with the inescapable while I was still a boy.  Instead of taking me to playgrounds, she took me to cemeteries for what she called “Southern preparations.”  These outings usually occurred on warm spring afternoons, when azaleas bloomed bright white and pink, when yellow jessamine vines crawled up walls and fences, when dogwoods yawned inflorescent, and when tulips, still un-beheaded, stretched with impeccable posture.  When, in short, nature was doing anything but dying.

Nina shared facts about various grave plots, giving the lowdown on so-and-so’s passing—“he died in Korea,” “he of AIDS,” “she during pregnancy,” and so forth.  When she finished, we fed the swans.

Which attacked me once.  I was standing on the riverbank, feeding the once-ugly ducklings by hand just as Nina had taught me, when, like Leda, I was enveloped by a feathered glory of beating white wings.  Traumatized, I no longer stood on shore but sat on the roof of the car.  To make me feel less sissy, Nina sat on the hood and pretended that she, too, was afraid.  It wasn’t their size exactly.  Nor the way they tussled with graceful wrath.  Maybe it was the mask about their swan eyes.  I’m sure it was that: the concealment, secret identity, veiled feelings.

Just before I got married, my fiancée, Giuliana, flew in from São Paulo to meet my family.  After supper, Nina insisted that I drive her through the cemetery.  I hadn’t been in years but instantly recognized the wrought-iron gates that once seemed so colossal.  There was the river.  The ducks.  The swans.  In the distance, a family, their heads bowed, stood under a high green tent.

Giuliana was not disturbed by this detour.  Quite the contrary:  she felt in some way moved.  It was as if Nina had invited her into a private, intimate space: one that contradicted this modern world of medical science in which everyone tries to postpone or avert death.  In a cemetery one couldn’t help but think of decomposition, permanence, the soul.  One couldn’t help but track the beat of one’s heart, measure the inhales and exhales of one’s breathing.  One couldn’t help, that is, but cherish the fact that one’s alive.

My cell phone buzzed.  An unknown number flashed across the screen.  I answered, “Hello?”

“Mr. Mendenhall?”

“Yes.”

“Are you in the car?”

“No.”

“This is the cancer center at St. Joseph’s Hospital.  We need you to come in.”

I was twenty-four, and about to hear, “You have cancer.”

Nothing—not even a Southern upbringing—can prepare you for those three words.

The odd thing about preachers is that, depending on time and place, their company is either most welcome or most unwelcome.  When I got the call, the cancer call, my uncle, a preacher, was beside me, and for that I was glad.  He made me feel the power of presence, to say nothing of companionship:  I was not alone.

My uncle—Uncle Steve—preaches in the only Southern Baptist church in Chicago.  Unlike most Southern Baptist preachers down South, he eschews the noisy and spectacular, preferring, instead, politesse and restraint.  Bookish and professorial, his voice nasal, his nose suitably sloped to hold up his saucer-sized spectacles, he loves theology and will tell you as much at the drop of a hat.  What with his general softness, he might, with a bit more age, have been mistaken for Truman Capote, with whom, incidentally, his father—my grandfather—had grown up in Monroeville, Alabama.

A man of custom, a student of Latin and Greek, fluent in Russian and French, a former lawyer and journalist, Uncle Steve is uncommonly qualified to carry on the sanctifying traditions of Western Civilization.  He is, in short, a gentleman and a scholar.  And he was in Atlanta that day, standing in the Varsity parking lot, his belly stuffed full of chili dogs, his ketchup-smudged face like an advertisement for this, the world’s largest drive-in restaurant.

I could feel his gaze moving over me and spared him the discomfort of asking what was the matter.

“I have cancer,” I said.

As the words issued from my mouth, my chest felt as though someone were driving a stake into it.  Cancer.  That thing other people got.  Old people.  Not young and healthy people.  Not me.

I tried to act normal, but in doing so betrayed what I really felt—terror.

Uncle Steve put his arm around me.  “Come on.  Let’s get to the hospital.”

Every hour on the hour, the employees of St. Joseph’s Hospital pray together.  These moments, though heavily orchestrated, bring peace to the ill and dying, the sick and suffering.  The nurses and doctors who wander the hallways pause while a disembodied, female voice recites the Lord’s Prayer, first in English, then in Spanish.  “Our Father, who art in heaven…”—the words echo off the cold, linoleum tiles—“hallowed be thy name.”

This was happening when I walked into the waiting room.  A nurse, a heavyset black woman with the softest eyes I’d ever seen, was behind the counter, her necklace, weighed down by a tiny crucified Jesus, dangling at her pillow-like breasts.  She whispered, again and again, amen, amen, and then, looking up, took me in with those deep knowing eyes, spoke without speaking.  Sunlight streamed through the cool, trapezoid panes of glass in the ceiling, falling across her face and hair at a low angle.

At last the prayer ended.  She unfolded her hands and smiled formally.  “Good afternoon, how may I help you?”

Responding with “I have cancer” didn’t feel right, so I said, “I’m here to see Dr. Danaker.”

That was all she needed to know.

“Bless your heart, child,” she said.  And, for the first time, I got emotional.  She hugged me, calling me child again; then, right then, I wanted to be a child, wanted her to scoop me into her arms and cradle me, wanted her thick, strong body wrapped around me; but there, too, was Uncle Steve, dignified and collected.  I couldn’t break down in front of him.

The nurse ushered me into a white, windowless room with expansive tile walls and sat me on a tissue-papered chair, which swished and crackled whenever I readjusted my derrière.

There I was.  Conscious.  Being, yet trying to fathom not being.  I imagined myself in a coffin, like that horrid shell, Great-Granddaddy.  Which only made things worse, for I knew that, once in the coffin, I would have no notion of being there.  The problem was thinking itself.  I couldn’t imagine being dead because I couldn’t imagine not imagining.

On Sunday mornings, before church, Dad had always made my siblings and me read from the obituaries.  This, he said, would acquaint us with the fragility of life.  He also thought the best way to learn was from experience.  But he’d known only one person who’d experienced death and, almost impossibly, lived to tell about it—Martin, a friend of the family, who’d apparently died three times and, on the operating table, been revived.  Martin loved cigarettes, which he called the backbone of Southern economy and which, he readily admitted, had brought about his three near-fatalities.

Except Martin didn’t put it in those terms.  To him, cigarettes had allowed him to float outside his body for a while, to see what death was like.  For better or worse, Martin didn’t tease a tunnel of light, greet a golden angel, or feel a fluffy cloud:  he simply “left” himself and, in a state of utter weightlessness, peered down on his body as would an outside observer.  Maybe that’s why Dad didn’t like us talking to Martin about death: he wanted us to hear about St. Peter and heaven and departed relatives.

The trouble with Martin was that one never knew when to believe him.  Heck, we barely knew who he was.  Ephemerally, at least, he’d been my aunt’s boyfriend; then, when she dumped him, he never went away: he moved in with my other aunt, a single mother, and helped care for my young cousin.  Martin was present every Thanksgiving and Christmas, but neither got nor gave gifts.  A transplant from North Carolina, he had daughters somewhere—either the Carolinas or Virginia—and had graduated from the University of North Carolina at Chapel Hill, an achievement he was quite proud of.  He didn’t work.  Didn’t own a car.  And didn’t seem to have money.  His singular ability to access death could’ve been, for all we knew, lifted from a sci-fi novel.  Nevertheless, I believed him.

Ten.  That’s how old I was when I saw a dead body I wasn’t supposed to see.  A right turn on I-85, heading north, highway stretching to where sky and land sandwiched together.  I was in my school outfit, backpack in my lap.  Mom was in her tennis getup, checking the rearview mirror.  Traffic was slowing and stopping.  To my left was a vast gray sheet held up by blank-faced men.  Behind it, a woman.  Or what was left of a woman.  Arms and legs bent at impossible angles; head sagging, possibly unattached; a bloodied skirt lifted by the breeze.  Someone’s mom.  Or sister.  Or wife.  Or girlfriend.  Or daughter.  Here one minute, gone the next.  This wasn’t dignity.  This was mean and messy.

Death, they say, is not only universal but also the great leveler: it befalls kings and paupers, rich and poor, wise and foolish.  Solomon, Caesar, Constantine, Charlemagne, Napoleon: all died despite their glory in life.  What I never understood, and, frankly, still don’t, is why folks pretend death doesn’t happen.  The person who ignores death is delusional at best, narcissistic at worst.  Death is our sole commonality, the thing in this world we all await, about which we may commiserate.  It’s what makes us human.  I daresay one can’t fully love a person without knowing that person is temporary.

Francis Bacon once declared, “The contemplation of death, as the wages of sin, and passage to another world, is holy and religious; but the fear of it, as a tribute due unto nature, is weak.”  Weak it may be to the healthy and fit, but to the ill and ailing it seems only natural.  The person who claims he doesn’t fear death is either a liar or an incorrigible maniac—or else a coward, too faint-of-heart to face the facts.  Bacon himself had the good fortune of dying in two to three days, having contracted pneumonia while conducting an experiment in the snow.  Willfully blind to his fate, lying on his deathbed, he penned a letter to his friend, Thomas Howard, expressing relief that he hadn’t suffered the fortune of Caius Plinius, “who lost his life by trying an experiment about the burning of Mount Vesuvius.”

After surgery, I, like Bacon, was bedridden.  Soon a phone call would tell me one of two things: that I was okay, my cancer hadn’t metastasized, or else I wasn’t okay, I needed chemotherapy and my chances of living another two years were below fifteen percent.  A glued-together wound, resembling fat, blue, puckered-up lips, took up the length of my chest.  Visitors asked to see it and then regretted their request when I rolled up my shirt, revealing a moon-shaped, smurfy smile.  When the visitors left, and I was alone again, alone and quiet, I imagined what the malignancy would look like as it spread through my body, which I conceived of as a mini mine field: tunneled with small explosive cancer clusters about to be detonated.  How could this shell—which once ran a mile in under four-and-a-half minutes—expire?

I’m not in my brain but somewhere lower: near the chest, maybe, or the gut.  I couldn’t, for instance, stop a dream even if I wanted to.  Which is odd, because it’s my brain that’s dreaming—not someone else’s.  The brain works independently of me, or, to be precise, of what I perceive to be me: it’s like an unmanned motor boat zipping on the water.  Occasionally one of my siblings, or an old friend, will recall some long-ago event, which I’d otherwise forgotten, and then, suddenly, I’ll remember.  The brain has stored this memory somewhere—somewhere not readily accessible—but I, wherever I am in this shell, never felt compelled to find it.  The thought just exists up there, waiting.

It’s the soul, I suppose, that’s me.  When I lie awake at night and contemplate this interim body, which I inhabit the way a renter inhabits an apartment, I locate my self—that subjective knowing ego—whole and center, as though the brain, convenient as it is, has a mind of its own.  To be sure, I can borrow this organ when I study or otherwise require deep reflection; but when I tire of thinking, when I want a break, when I lean back from my desk, I’m very aware that I, my self, am moving from the head to just above the torso, where I belong.  And when I experience joy, compassion, anguish, despair—when, that is, I feel—it’s never with my head but with something deep within my bosom.  How does one explain this?  Perhaps we’re all antecedent to the body: little floating things confined to this definite, corporeal form we didn’t choose, waiting, like thoughts, to be accessed—or released.

Opossums, more commonly known, in the South, as “possums,” are, I’m told, a delicacy.  Nina’s got a cookbook that says so, though she claims she’s never cooked or eaten one.  I have my doubts, since my dad grew up eating squirrel, which, I think, is more revolting because squirrels are cute and handsome, whereas possums have that eerie look I associate with demons and devils—and masks.

At seven, I persuaded my brother to take a life.  A possum’s life.  It was a horrible affair, really.  One that, even today, is difficult to own up to.  Brett, being the gullible little brother he was—I convinced him once that the shadow-puppet giant who lived on the ceiling would kill him in his sleep—stomped on a squeaking pile of pine-straw while I looked on, presumably to punish him if he disobeyed.  Of course, the squeaking didn’t belong to the pine-straw, but to a tiny nest of baby possums underneath.

For some reason, I was initially proud of what I’d done, and, hours later, said as much to my mom.  Horrified, she made me show her the nest, since I’d “cried wolf” before.  Sure enough, there, in the pine-straw, lay a bloody baby possum, whimpering, dying.

My first defense was I hadn’t done anything.  Brett had.  I’d simply stood by and watched.  Mom was smarter than that.  I don’t remember what she said—only that, once she said it, I began to cry.  And couldn’t stop.

It was this event, this murder of an innocent, that brought about my general appreciation for original sin, or at least for the idea of innate human depravity.  Humans, you might say, are born rotten—so much so that most of us, in our youth, could stomp infant possums to death without understanding the wrongness of our action.  No doubt I regretted this behavior—this actus reus—but not because I felt guilty: it was, in effect, because I feared punishment—some combination of Mom’s wrath and her spank-happiness.  A parent’s role is, among other things, to tame a child’s destructive impulses.  That’s what Mom did—without succumbing to her own elemental aggressions.

She called the Chattahoochee Nature Center, a local environmental organization, and a worker there explained how to save the baby possum.  This, then, became my task, my agonizing punishment: to keep the possum alive.  Being intimate with death is one thing; being intimate with suffering quite another.  When I scooped the trembling creature into my palm, it emitted a sad, pitiable squeak.  “Everything’s okay,” I whispered, “I’m not here to hurt you”—a funny assurance coming from the kid who’d just ordered its murder.

If truth be told, I wished I’d just destroyed the thing.  Better dead than in this wretched condition.  Still, the way it looked at me—its beady, searching eyes perusing my face—reminded me of how Ansley, my little sister, then only a year old, looked up at Mom when she wanted to be fed.

I placed the creature in a shoe box, which I tucked beneath a shelf in my parents’ closet, the darkest place in the house.  More than anything, the possum needed darkness and silence.  I dug a hole in the backyard, tied two twigs together in the shape of a cross, and arranged a constellation of stones around what would’ve been a grave.  But the thing didn’t die.  It healed so well that, the next morning, it was squirming and scurrying and Dad needed a net to contain it.  Even after the possum was free in the backyard, I left the grave untouched, a reminder that all things, even possums, eventually come to an end.

My Southern upbringing was all about learning how to die.  Like the Greek Stoics, Southerners believe in cultivating virtue, improving life, and, above all, accepting mortality.  Liberated from urban distractions, tied to land and home, they regard humans as custodians of the past; they keep gardens, preserve antiques, record lineage, mark battlefields, and salvage the efforts of planters, carpenters, raconteurs, and architects; they ensure, in short, the availability of history.  This can lead to nostalgia for times they never knew, bad times, ugly times, which is to say that this can cause Southerners to overlook—or, worse yet, revise—the inconveniences of history: slavery, for instance, or civil rights.  All the same, the Southern tradition, burdened as it is by various conflicts, retains virtues worth sustaining: community, family, religion, husbandry, stewardship.  These customs, however vulnerable, hardly need guardians.  They will, I suspect, persist, in some form or another, as long as humanity itself; for they are practical, permanent ideals—tested by generations—which people fall back on during disorienting times.  In a region haunted by racial brutality, these principles are, and have been, a unifying reference point, a contact zone where cultures—black, white, and Hispanic—share something spiritual despite their differences.

Living history, not just studying it, but consciously living it, is neither wicked nor wrong; the chronic, urgent awareness that everything you know and love will come undone, is not, I think, misguided, but utterly essential.  There’s something beautiful about facing the insurmountable.  When the world’s fleeting, death becomes a liberating, albeit terrifying, reality.  It throbs and pulsates and beats beneath the skin, inside of which we’re all raw skeleton.

For all this, however, I wasn’t ready.  Didn’t want to die.  Couldn’t even conceive of it.  The twenty-something years my family had been teaching me about death amounted to, not nothing, but not much, either.  Death, I suppose, is a hard thing to accept, and an even harder thing to fight, since fighting seems so pointless: deep down, you know you can’t win.  You might prevail once.  Maybe even twice.  But ultimately it’ll beat you.  It almost did me.

Friends ask how it feels to “beat” cancer.  I never can answer—not satisfactorily—for the experience is more like submission than competition: it’s a manifold process of coming to terms with the body, a thing doomed to decay.  When the doctor—Dr. Danaker—called to say the lymph nodes were benign, that the cancer hadn’t spread, I shocked him with a tired reply:  “Oh, good.”

“This is great news,” he assured me, as if I needed reminding, as if I hadn’t appreciated—indeed, hadn’t understood—how lucky I was.

“I know,” I said.

At this, the good doctor seemed annoyed.  “Ungrateful kid,” his tone implied.  But I wasn’t ungrateful.  Nor ecstatic.  I was, simply put, unbound—by life, by people, by things.  His take was that I had another chance, a fresh start, that I could put this nonsense behind me and move on.  My take was that, having embraced impermanence, I was done protecting myself from suffering, done seeking security through delusion, done dislocating from fate, destiny, providence, what have you.

Done: this, it is true, is weary resignation.  Yet it’s more than that: it’s a sweet but unhappy release, a deliverance, an unmasking.  Almost paradoxically, it’s freedom within—and despite—limitation.

What’s more exhilarating than that one should die?  What’s more mysterious, more horribly electrifying?  As one writer, Paul Theroux, has put it, “Death is an endless night so awful to contemplate that it can make us love life and value it with such passion that it may be the ultimate cause of all joy and all art.”  That is how you cope with this chilling, daunting, stupefying phenomenon: you do it every day until it’s serviceable and aesthetic, until at last you won’t know, can’t know, when it happens, until it’s pleasurable, a masterpiece, sublime in its regularity.  You keep it close, so close it becomes part of you, so close it’s at your disposal, so close that without it, you’re nothing, nothing if not boringly, thoughtlessly, mechanically alive, which is just another way of being dead.  You train and train and then it comes.

Law and Locality

In Arts & Letters, Humanities, Jurisprudence, Justice, Law, Libertarianism, Philosophy, Politics on August 7, 2013 at 8:45 am

Allen Mendenhall

On one common definition, law is a practice or set of rules based in custom and habit.  Law is not diktat.  It arises spontaneously through the interaction of human agents operating within and among social groups and precedes State promulgation.

Legislative enactment can reflect the law as it is constituted in the mores and traditions of groups, but it can also be the result of governmental usurpation.  The legislator does not embody the people he represents, and as society grows ever more complex and populations ever denser, as technologies link us more and more to one another in cyberspace and other virtual fora with disembodied communicants, the notion that the legislator speaks on behalf of his constituents becomes increasingly dubious if not downright absurd.

Local groups such as schools, clubs, community organizations, and churches have complex rules of exchange derived from shared mores and traditions.  They are more likely than distant legislators to speak accurately about the wants and needs of their communities.  Their rules are not necessarily articulated, but tacitly understood.

These local groups recognize regulations not as monolithic, governmental impositions but as integrated schemes of social principles.  Group-members who fail or refuse to follow rules and regulations are punished.  On this local level, punishment can be simple: ostracism or public disapproval. A businessperson who violates another businessperson’s trust will lose business, just as he will lose clients by losing consumers’ trust; a church-member living in sin will likewise suffer from the judgment of his peers or, more appropriately, from the canon law pertaining to his sin.

In these examples, it is clear that the State should not intervene in punishing the wrongdoer; local custom and habit suffice to regulate conduct without resort to State violence or compulsion; therefore, private associations suffice to generate rules and their corresponding punishments.  Distant government bodies are not likely to conform to the intricate constitutions of local peoples and therefore are likely to exercise their disciplinary powers using punitive, exploitative, or arbitrary means.

William Lane Craig: Four Debates

In Arts & Letters, Christianity, Epistemology, Ethics, God, Humanities, Philosophy, Religion, Teaching on July 31, 2013 at 8:45 am


William Lane Craig, a philosopher and Christian apologist, is a member of Johnson Ferry Baptist Church, which my wife and I visited regularly when we lived in Atlanta and where my parents, siblings, grandmother, uncle, aunt, and cousins remain members.  Earlier this month, The Chronicle of Higher Education ran a profile piece on Dr. Craig.  Below are four high-profile debates in which Dr. Craig participated.  Enjoy.

1.  Dr. Craig debates Christopher Hitchens on the Existence of God.  The video has not been made available for embedding on external websites, so the best I can offer is a link.

2.  Dr. Craig debates Stephen Law on the Existence of God.


3.  Dr. Craig debates Peter Atkins on the existence of God.


4.  Dr. Craig debates Alex Rosenberg on the reasonableness of faith in God.

Pantry, 1982

In Arts & Letters, Creative Writing, Humanities, Poetry, Writing on July 24, 2013 at 8:45 am

Allen Mendenhall


A box of cereal, stale, ants running

Up the side, two brown bananas that


He says cleanse the pores

(If rubbed thoroughly),


An unwrapped chocolate bar

And a plethora of cans, unopened:


In a locked pantry, Little Maddy sits

Plucking the stems


Off Granny-Smiths.  Just ten more

Minutes.  Maddy, weary, wondering


Just when daddy would come home.

Time: the pantry is unlocked


And out comes light

And apples and, lastly, Maddy.


Daddy reaches

For the two rotting bananas,


Notes can upon unopened can,

Unwraps the chocolate bar,


Smears chocolate on his fingers,

Stops, thinks how unlikely it is


For apples to lose their stems.

Donna Meredith Reviews “Keep No Secrets,” by Julie Compton

In Arts & Letters, Books, Fiction, Humanities, Law, Law-and-Literature, Novels, Writing on July 17, 2013 at 8:45 am

Donna Meredith is a freelance writer living in Tallahassee, Florida. She taught English, journalism, and TV production in public high schools in West Virginia and Georgia for 29 years. Donna earned a BA in Education with a double major in English and Journalism from Fairmont State College, an MS in Journalism from West Virginia University, and an EdS in English from Nova Southeastern University. She has also participated in fiction writing workshops at Florida State University and served as a newsletter editor for the Florida State Attorney General’s Office. The Glass Madonna was her first novel. It won first place for unpublished women’s fiction in the Royal Palm Literary Awards, sponsored by the Florida Writers Association, and was runner-up in the Gulf Coast novel-writing contest. Her second novel, The Color of Lies, won the gold medal for adult fiction in 2012 from the Florida Publishers Association and also first place in unpublished women’s fiction from the Florida Writers Association. Her latest book, Magic in the Mountains, is nonfiction: the amazing story of how a determined and talented woman revived the ancient art of cameo glass in twentieth-century West Virginia.  She is currently working on a series of environmental thrillers featuring a female hydrogeologist as the lead character.


Above: Julie Compton

The following review is appearing simultaneously in Southern Literary Review.

Keep No Secrets, Julie Compton’s powerful sequel to Tell No Lies, is guaranteed to keep readers turning pages into the wee hours of the morning. Both of Compton’s courtroom thrillers are set in St. Louis, Missouri, where she grew up.

Like Jodi Picoult’s best works, Compton’s novels sizzle with all the trust, betrayal, love, and forgiveness family relationships entail—especially when you expose their private conflicts in a public courtroom. Her books seem to pose this question: how well can you know even those people closest to you?

Read Tell No Lies first. Though the sequel provides enough backstory to be a great read on its own, without understanding the first book you’d miss the riveting psychological development of the primary characters, all of whom star in the sequel as well.

In Tell No Lies, idealistic lawyer Jack Hilliard leaves behind a lucrative private practice to run for district attorney. The plot centers on a high-profile murder case. Jack is easy to like because he tries so hard to do the right thing. But there wouldn’t be a story if he were perfect. He yields to one temptation, which sends his life into a downward spiral that nearly ends his marriage and his career.

The final plot twist leaves you wondering if Jack has been manipulated. Compton is that rare author who trusts her readers’ intelligence. She allows us to figure things out for ourselves, to experience the same doubts as Jack Hilliard. It makes the novel more like our own lives, where we can’t always tell what people’s motives are or know when they are lying.

Keep No Secrets begins four and a half years after the events of Tell No Lies. During that time, Jack Hilliard has worked arduously to repair the damage caused by his mistakes—and has largely succeeded. Until the night he finds his teenage son Michael having sex with his girlfriend. They are drunk. Being a white knight kind of guy, Jack gives the girl a ride home. In an effort to win back his son’s love and respect, Jack doesn’t tell his wife about Michael’s transgressions. That car ride sets off an unforeseeable chain of events that threatens to wreck Jack’s career and marriage once again.

Think that’s enough dirt to dump on a nice guy like Jack? Not a chance. The already untenable situation deteriorates further when Jenny Dodson, the woman involved in his earlier downfall, reappears after all these years, asking for his help. He can’t say no, but he vows to keep his wife truthfully informed of everything that happens. He does. Sort of. “The lies aren’t what he says; they’re what he doesn’t say”—this is a refrain Compton artfully employs several times.

This novel deals with social issues like the impact of adultery and sexual assault on families. Most readers are going to put themselves in the various characters’ situations and ask themselves if they would have behaved differently. Would we lie to protect a loved one? What if you knew something that would put the one you love in jail or in danger? Would you tell the truth? What if not telling keeps an innocent person imprisoned? How far should we trust the legal system? If a spouse gave us reason to doubt, could we forgive and trust again? When is it time to give a marriage another chance—and when is it time to walk away?

Compton’s novels are as fine as any courtroom thrillers out there. Though her use of present tense can be a bit distracting, the well-plotted series sparkles with psychologically complex characters.

For both undergraduate work and law school, Compton attended Washington University in Missouri. She began her legal career there, but last practiced in Wilmington, Delaware, as a trial attorney for the U.S. Department of Justice. She now lives near Orlando with her husband and two daughters and writes full-time. She is also the author of Rescuing Olivia, a novel of suspense, romance, and family drama.

Below: Donna Meredith
