Archive for the ‘Conservatism’ Category
Miles Smith IV is a visiting assistant professor at Hillsdale College and a historian of the Old South and Atlantic World. He took his B.A. from the College of Charleston and holds a Ph.D. in History from Texas Christian University. He is a native of Salisbury, North Carolina.
Last week saw the alignment of a peculiar set of anniversaries: the fiftieth anniversary of Churchill’s death, the seventieth of the liberation of Auschwitz by the Soviet Army, and the 208th anniversary of Robert E. Lee’s birth. Sir Winston Leonard Spencer-Churchill died in 1965. One century earlier General Robert E. Lee surrendered the Army of Northern Virginia to his Federal counterpart U.S. Grant. Churchill and Lee enjoyed widespread acclaim for their conduct—Lee in the late nineteenth century and both men in the latter half of the twentieth. In recent years deconstructing both men has been the vogue among academic and popular commentators alike. Both Churchill and Lee lived their lives as traditionalists. Neither embraced the social or moral innovations of his own era. Modern commentators degrade both for their seemingly reactionary ideals. Unsurprisingly, Churchill adored Lee (and Abraham Lincoln as well). A recent historian opined that Lee’s “tragic flaw” was that he upheld the genteel values of eighteenth-century Virginia “in a society that left older ideals of nobility and privilege behind.” One might grant that Lee’s aristocratic and heavy-handed slaveholding would understandably guarantee him a fair share of detractors in the early twenty-first century, but this commentator offered as his reason for deconstructing Lee a calamitous rationale:
In the long run, Lee’s decision to follow Virginia out of the Union and resign his commission from the US Army further reveals his eighteenth century sensibilities which emphasize state over country and a parochial interest in defending home and family rather than one’s nation. In choosing loyalty to his state over loyalty to his country, Lee ensured that his destiny would be tainted by defeat and the specter of treason.
The disturbing notion that one’s parochial interest in defending home and family constitutes a “fatal flaw” ultimately saw its hellish culmination in the totalitarian nationalist regimes of the twentieth century. It was Lee’s very cultured localism, tragically tinged as it was with slaveholding, that endeared him to Winston Churchill.
Before Winston Churchill assumed the premiership of the United Kingdom and before he battled the nationalist brutes ruling Germany and Italy, he wrote history. In A History of the English-Speaking Peoples: The Great Democracies, the fourth volume of his history of the Anglosphere, his view of American history reflected a patrician education and disposition. Never comfortable in the twentieth century, Churchill kept the values of a bygone Victorian era well into the middle of the twentieth century. In Lee he found a similarly anachronistic gentleman of the eighteenth century living in the nineteenth. Churchill wrote that Lee’s “noble presence and gentle, kindly manner were sustained by religious faith and an exalted character.” He “weighed carefully, while commanding a regiment of cavalry on the Texan border, the course which duty and honour would require from him.” Churchill overstated Lee’s antipathy towards slavery but nonetheless seized on the Virginian’s conservative Whiggish politics. Lee knew secession to be dangerous and ill-advised “but he had been taught from childhood that his first allegiance was to the state of Virginia.” Churchill found Lee’s Old South an admirable but flawed reflection of the British gentry. “There was,” said Churchill, “a grace and ease about the life of the white men in the South that was lacking in the bustling North. It was certainly not their fault that these unnatural conditions had arisen.” Churchill’s denotation of white men underscores his innate humanity. White men, he knew, built their civilization on the backs of people held in bondage. “The institution of negro slavery,” Churchill knew, “had long reigned almost unquestioned.” Upon the basis of slavery “the whole life of the Southern states had been erected.” Churchill saw a “strange, fierce, old-fashioned life.
An aristocracy of planters, living in rural magnificence and almost feudal state, and a multitude of smallholders, grew cotton for the world by slave-labour.” Churchill’s empathy for the planter class stemmed from his willingness to conceive them as a class that “ruled the politics of the South as effectively as the medieval baronage had ruled England.” Southerners who by varying degrees colluded with the capitalist system became feudal agrarians and misplaced Englishmen in Churchill’s romantic imagination. 
Southerners engaged in the capitalist system in the antebellum era. Not all southerners were equally capitalist, however, and the Whig planters of Mississippi and Louisiana embraced the economic, expansionistic, and modernizing nationalism of the United States in a way that horrified old planters in Virginia and Carolina. Nonetheless, the old aristocratic Anglo-American planter communities provided Churchill with set pieces as he wrote his histories. Of Lee, Churchill somberly wrote that he “wrestled earnestly with his duty” during the secession crisis. “By Lincoln’s authority he was offered the chief command of the great Union army now being raised. He declined at once…” The immediacy of Lee’s refusal supplied Churchill with a heroically long-suffering but duty-bound Anglophone hero. Churchill made much of how Lee resigned, “and in the deepest sorrow rode across the Potomac bridge for Richmond. Here he was immediately offered the chief command of all the military and naval forces of Virginia.” Lee’s decision, thought Churchill, seemed beautiful and tragic. “Some of those who saw him in these tragic weeks, when sometimes his eyes filled with tears, emotion which he never showed after the gain or loss of great battles, have written about his inward struggle. But there was no struggle; he never hesitated.” Lee’s choice, declared Churchill, “was for the state of Virginia. He deplored that choice [and] foresaw its consequences with bitter grief; but for himself he had no doubts at the time, nor ever after regret or remorse.” Writing in 1958, Churchill cast Lee as a forerunner of himself: warning of the disaster befalling England, but fighting determinedly when the conflict came.
Sensitive to the political differences between Imperial Britain and the United States, Churchill nonetheless tried to make sense of the American Civil War and its aftermath. Churchill saw that “Radical vindictiveness” in Republican ranks “sprang from various causes. The most creditable was a humanitarian concern for the welfare of the negro.” Belief in the God-given humanity of African Americans was “shared only by a minority.” Churchill believed that “more ignoble motives were present in the breasts of such Radical leaders as Zachariah Chandler and Thaddeus Stevens.” Because they loved “the negro less than they hated his master, these ill-principled men wanted to humiliate the proud Southern aristocracy, whom they had always disliked, and at whose door they laid the sole blame for the Civil War.” But Churchill argued that “there was another and nearer point.”
The Radicals saw that if the negro was given the vote they could break the power of the Southern planter and preserve the ascendancy over the Federal Government that Northern business interests had won since 1861. To allow the Southern states, in alliance with Northern Democrats, to recover their former voice in national affairs would, the Radicals believed, be incongruous and absurd. It would also jeopardise the mass of legislation on tariffs, banking, and public land which Northern capitalists had secured for themselves during the war. To safeguard these laws the Radicals took up the cry of the negro vote, meaning to use it to keep their own party in power.
Churchill conceived of the Civil War from the perspective of a Briton deeply suspicious of the effects of modernizing industrial nationalism. His best-known Liberal biographer, Lord Jenkins, painted him as a champion of free-trade economic libertarianism and of workers as well. William Manchester, a far more conservative biographical voice, likewise understood Churchill as essentially a free-trader whose conservatism remained confined to foreign policy. Free-trade economic views never allowed Churchill to entirely embrace the relationship between corporation and nation that characterized post-Civil War American politics.
Capitalism accompanied free trade in Churchill’s mind, and he affirmed capitalism in his ideals about society. But he likewise displayed antipathy for the wedding of corporation and nation that followed the American Civil War. Of the captains of industry he wrote that “Carnegie and Rockefeller, indeed, together with Morgan in finance and Vanderbilt and Harriman in railroads, became the representative figures of the age,” when compared to the “colourless actors upon the political scene.” “Though the morality” of the captains of industry “has often been questioned, these men made industrial order out of chaos. They brought the benefits of large-scale production to the humblest home.” Still, Churchill saw the Gilded Age American Union as racked “by severe growing pains” and unrest. “There was much poverty in the big cities, especially among recent immigrants. There were sharp, sudden financial panics, causing loss and ruin, and there were many strikes, which sometimes broke into violence.” Most disturbing to Churchill the free-trader, “Labour began to organize itself in Trade Unions and to confront the industrialists with a stiff bargaining power. These developments were to lead to a period of protest and reform in the early twentieth century.” Churchill’s deep ambivalence about the wedding of capitalism and nationalism led him to recognize “gains conferred by large-scale industry” but also to lament that “the wrongs that had accompanied their making were only gradually righted.”
Churchill’s British vantage offered a nuanced perspective that stood outside the intemperate screeds of Lost Cause southerners and the more numerous and far more influential hyper-nationalist hagiography devoted to the white northern liberators. Churchill understood that slavery constituted the great systemic evil of the nineteenth-century United States and caused the Civil War. His libertarian proclivities left him unconvinced of the necessity of 800,000 dead. In this he prefigured the agrarian Wendell Berry, who noted in his essay “American Imagination and the Civil War” that a botched emancipation was far better than no emancipation. But Berry also noted that history demands that a botched emancipation be criticized for what was botched. David Goldfield, former president of the Southern Historical Association, declared in his America Aflame that his work was “neither pro-southern nor pro-northern. It is anti-war, particularly the Civil War.”
To his credit, Abraham Lincoln regretted the Civil War’s violence in 1865 and subsequently proposed an expeditious readmission criterion for the seceded states, only to have it scuttled by Radical Republicans after his assassination. Unbeknownst to Lincoln, who genuinely seemed interested in restoring the status quo ante bellum, the war unleashed the ideological monstrosity of modern industrial nationalism on the American polity. Harry Stout recognized that industrial nationalism tarnished the war’s consequence of liberating African Americans from chattel slavery. Elliott West’s history of the Nez Perce War of 1877 posited the idea of a greater Reconstruction, whereby the Republican Party remade the entirety of the continental American polity in the image of white capitalistic, militaristic, evangelical Protestant nationalism. Native Americans stood in the way of the American nation, and the U.S. Army ruthlessly destroyed the last free Indian societies in the Far West. Societal transmutation on that scale necessitated violence in the name of the nation. Jackson Lears pointed out in his Rebirth of a Nation that racism on a societal scale (southern and northern) fed this nationalism driven by a political organization formally committed to black liberty. By 1900, four decades of almost uninterrupted Republican government turned the United States into an imperialistic nation-state. Though mitigated to a small degree institutionally in the United States by a lingering federalism, nationalism with its muscular industrial core eventually threw Europe into the nightmare of two world wars.
Few American historians have offered an anti-nationalist vision of the Civil War. The camps seemed too rigidly defined for works such as Churchill’s to remain valid. Churchill’s vision of the American Civil War Era is at once not southern enough for Lost Cause partisans, nor is it sufficiently pro-northern for Neo-Abolitionists. Churchill saw the conflict as a tragedy. Nationalist historians and political philosophers generally counted the war a blessing; to think it a tragedy negated the benefits of union and emancipation. British Marxist Robin Blackburn exasperatedly asked why “a willingness on the part of the United States to admit the possibility that the war was not the best response” to secession or slavery was seen as condoning either.
Conservative historians understandably co-opted Churchill into the pantheon of Anglo-American heroes committed to the maintenance of the Western World and to its transcendent expression of human liberty. Much of Churchill’s enduring appeal rests on the image of a nationalist military chieftain committed to Britain’s place in the world. That image is true—Churchill biographer Carlo d’Este argued that his subject was a man truly born for war—but it is not complete. John Keegan once described Churchill as a true libertarian, and this seems an appropriate corrective given the multitude of remembrances published on this fiftieth anniversary of his passing.
 Glenn W. LaFantasie, “Broken Promise,” Civil War Monitor 13 (Fall 2014): 37.
 Winston S. Churchill, A History of the English-Speaking Peoples Vol. 4: The Great Democracies.
 Churchill, Great Democracies.
 Roy Jenkins, Churchill: A Biography (New York: Farrar, Straus and Giroux, 2001), 398-401; William Manchester, The Last Lion: Winston Spencer Churchill, Visions of Glory (New York: Little, Brown, 1989), 361.
 Churchill, The Great Democracies.
 Wendell Berry, “American Imagination and the Civil War,” in Imagination in Place (Berkeley, CA: Counterpoint, 2010), 27; David Goldfield, America Aflame: How the Civil War Created a Nation (New York: Bloomsbury, 2011).
 Elliott West, The Last Indian War: The Nez Perce Story (Oxford and New York: Oxford University Press, 2009); Jackson Lears, Rebirth of a Nation: The Making of Modern America (New York: HarperCollins, 2009).
 Robin Blackburn, “Why the Muted Anniversary? An Eerie Silence,” CounterPunch (18 April 2011).
 Carlo d’Este, Warlord: A Life of Winston Churchill at War, 1874-1945 (New York: HarperCollins, 2008); John Keegan, Winston Churchill: A Life (New York: Penguin, 2002), 27.
An earlier version of this essay appeared here at Front Porch Republic.
Remember the printed prose is always
half a lie: that fleas plagued patriots,
that greatness is an afterthought
affixed by gracious victors to their kin.
—John William Corrington
It was the spring of 2009. I was in a class called Lawyers & Literature. My professor, Jim Elkins, a short, thin man with long white hair, gained the podium. Wearing what might be called a suit—with Elkins one never could tell—he recited lines from a novella, Decoration Day. I had heard of the author, John William Corrington, but only in passing.
“Paneled walnut and thick carpets,” Elkins beamed, gesturing toward the blank white wall behind him, “row after row of uniform tan volumes containing between their buckram covers a serial dumb show of human folly and greed and cruelty.” The students, uncomfortable, began to look at each other, registering doubt. In law school, professors didn’t wax poetic. But this Elkins—he was different. With swelling confidence, he pressed on: “The Federal Reporter, Federal Supplement, Supreme Court Reports. Two hundred years of our collective disagreements and wranglings from Jay and Marshall through Taney and Holmes and Black and Frankfurter—the pathetic often ill-conceived attempts to resolve what we have done to one another.”
Elkins paused. The room went still. Awkwardly profound, or else profoundly awkward, the silence was like an uninvited guest at a dinner party—intrusive, unexpected, and there, all too there. No one knew how to respond. Law students, most of them, can rattle off fact-patterns or black-letter-law whenever they’re called on. But this? What were we to do with this?
What I did was find out more about John William Corrington. Having studied literature for two years in graduate school, I was surprised to hear this name—Corrington—in law school. I booted up my laptop, right where I was sitting, and, thanks to Google, found a few biographical sketches of this man, who, it turned out, was perplexing, riddled with contradictions: a Southerner from the North, a philosopher in cowboy boots, a conservative literature professor, a lawyer poet. This introduction to Corrington led to more books, more articles, more research. Before long, I’d spent over $300 on Amazon.com. And I’m not done yet.
Born in Cleveland, Ohio, on October 28, 1932, Corrington—or Bill, as his friends and family called him—passed as a born-and-bred Southerner all of his life. As well he might, for he lived most of his life below the Mason-Dixon line, and his parents were from Memphis and had moved north for work during the Depression. He moved to the South (to Shreveport, Louisiana) at the age of 10, although his academic CV claimed that he was, like his parents, born in Memphis, Tennessee. Raised Catholic, he attended a Jesuit high school in Louisiana but was expelled for “having the wrong attitude.” The Jesuit influence, however, would remain with him always. At the beginning of his books, he wrote “AMDG,” which stands for Ad Majorem Dei Gloriam—“for the greater glory of God.” “It’s just something that I was taught when I was just learning to write,” he explained in an interview in 1985, “taught by the Jesuits to put at the head of all my papers.”
Bill was, like the late Mark Royden Winchell, a Copperhead at heart, and during his career he authored or edited, or in some cases co-edited, twenty books of varying genres. He earned a B.A. from Centenary College and an M.A. in Renaissance literature from Rice University, where he met his wife, Joyce, whom he married on February 6, 1960. In September of that year, he and Joyce moved to Baton Rouge, where Bill became an instructor in the Department of English at Louisiana State University (LSU). At that time, LSU’s English department was known above all for The Southern Review (TSR), the brainchild of Cleanth Brooks and Robert Penn Warren, but also for such literary luminaries as Robert Heilman, who would become Bill’s friend.
In the early 1960s, Bill pushed for TSR to feature fiction and poetry and not just literary criticism. He butted heads with then-editors Donald E. Stanford and Lewis P. Simpson, who thought of the journal as scholarly, not creative, as if journals couldn’t be both. A year after joining the LSU faculty, Bill published his first book of poetry, Where We Are. With only 18 poems and a first-edition run of 225 copies, the book hardly established Bill’s reputation as a Southern man of letters. But it invested his name with recognition and gave him the confidence to complete his first novel, And Wait for the Night (1964).
Bill and Joyce spent the 1963-64 academic year in Sussex, England, where Bill took the D.Phil. from the University of Sussex in 1965. In the summer of 1966, at a conference at Northwestern State College, Mel Bradford, that Dean of Southern Letters, pulled Bill aside and told him, enthusiastically, that And Wait for the Night (1964) shared some of the themes and approaches of William Faulkner’s The Unvanquished. Bill agreed. And happily.
Of Bill and Miller Williams, Bill’s colleague at LSU, Jo LeCoeur, poet and literature professor, once submitted, “Both men had run into a Northern bias against what was perceived as the culturally backward South. While at LSU they fought back against this snub, editing two anthologies of Southern writing and lecturing on ‘The Dominance of Southern Writers.’ Controversial as a refutation of the anti-intellectual Southern stereotype, their joint lecture was so popular [that] the two took it on the road to area colleges.”
In this respect, Bill was something of a latter-day Southern Fugitive—a thinker in the tradition of Donald Davidson, Allen Tate, Andrew Nelson Lytle, and John Crowe Ransom. Bill, too, took his stand. And his feelings about the South were strong and passionate, as evidenced by his essay in The Southern Partisan, “Are Southerners Different?” (1984). Bill’s feelings about the South, however, often seemed mixed. “[T]he South was an enigma,” Bill wrote to poet Charles Bukowski, “a race of giants, individualists, deists, brainy and gutsy: Washington, Jefferson, Madison, Jackson (Andy), Davis, Calhoun, Lee, and on and on. And yet the stain of human slavery on them.” As the epigraph (above) suggests, Bill was not interested in hagiographic renderings of Southern figures. He was interested in the complexities of Southern people and experience. In the end, though, there was no doubt where his allegiances lay. “You strike me as the most unreconstructed of all the Southern novelists I know anything about,” said one interviewer to Bill. “I consider that just about the greatest compliment anyone could give,” Bill responded.
While on tour with Williams, Bill declared, “We are told that the Southerner lives in the past. He does not. The past lives in him, and there is a difference.” The Southerner, for Bill, “knows where he came from, and who his fathers were.” The Southerner “knows still that he came from the soil, and that the soil and its people once had a name.” The Southerner “knows that is true, and he knows it is a myth.” And the Southerner “knows the soil belonged to the black hands that turned it as well as it ever could belong to any hand.” In short, the Southerner knows that his history is tainted but that it retains virtues worth sustaining—that a fraught past is not reducible to sound bites or political abstractions but is vast and contains multitudes.
In 1966, Bill and Joyce moved to New Orleans, where the English Department at Loyola University, housed in a grand Victorian mansion on St. Charles Avenue, offered him a chairmanship. Joyce earned the M.S. in chemistry from LSU that same year. By this time, Bill had written four additional books of poetry, the last of which, Lines to the South and Other Poems (1965), benefited from Bukowski’s influence. Bill’s poetry earned a few favorable reviews but not as much attention as his novels—And Wait for the Night (1964), The Upper Hand (1967), and The Bombardier (1970). Writing in The Massachusetts Review, poet and critic Josephine Miles approvingly noted two of Bill’s poems from Lines, “Lucifer Means Light” and “Algerien Reveur,” alongside poetry by James Dickey, but her comments were more in passing than in depth. Dickey himself, it should be noted, admired Bill’s writing, saying, “A more forthright, bold, adventurous writer than John William Corrington would be very hard to find.”
Joyce earned her Ph.D. in chemistry from Tulane in 1968. Her thesis, written under the direction of L. C. Cusachs, was titled “Effects of Neighboring Atoms in Molecular Orbital Theory.” She began teaching chemistry at Xavier University, and her knowledge of the hard sciences brought about engaging conversations between her and Bill about the New Physics. “Even though Bill only passed high school algebra,” Joyce would later say, “his grounding in Platonic idealism made him more capable of understanding the implications of quantum theory than many with more adequate educations.”
By the mid-1970s, Bill had become fascinated by Eric Voegelin, a German historian, philosopher, and émigré who had fled the Third Reich. Voegelin taught at LSU and lectured for the Hoover Institution at Stanford University, where he was a Salvatori Fellow. Voegelin’s philosophy, which drew from Friedrich von Hayek and other conservative thinkers, inspired Bill. In fact, Voegelin made such a lasting impression that, at the time of Bill’s death, Bill was working on an edition of Voegelin’s The Nature of the Law and Related Legal Writings. (After Bill’s death, two men—Robert Anthony Pascal and James Lee Babin—finished what Bill had begun. The completed edition appeared in 1991.)
By 1975, the year he earned his law degree from Tulane, Bill had penned three novels, a short story collection, two editions (anthologies), and four books of poetry. But his writings earned little money. He also had become increasingly disenchanted with the political correctness on campus:
By 1972, though I’d become chair of an English department and offered a full professorship, I’d had enough of academia. You may remember that in the late sixties and early seventies, the academic world was hysterically attempting to respond to student thugs who, in their wisdom, claimed that serious subjects seriously taught were “irrelevant.” The Ivy League gutted its curriculum, deans and faculty engaged in “teach-ins,” spouting Marxist-Leninist slogans, and sat quietly watching while half-witted draft-dodgers and degenerates of various sorts held them captive in their offices. Oddly enough, even as this was going on, there was a concerted effort to crush the academic freedom of almost anyone whose opinions differed from that of the mob or their college-administrator accessories. It seemed a good time to get out and leave the classroom to idiots who couldn’t learn and didn’t know better, and imbeciles who couldn’t teach and should have known better.
Bill joined the law firm of Plotkin & Bradley, a small personal injury practice in New Orleans, and continued to publish in such journals as The Sewanee Review and The Southern Review, and in such conservative periodicals as The Intercollegiate Review and Modern Age. His stories took on a legal bent, peopled as they were with judges and attorneys. But neither law nor legal fiction brought him fame or fortune.
So he turned to screenplays—and, at last, earned the profits he desired. Viewers of the recent film I Am Legend (2007), starring Will Smith, might be surprised to learn that Bill and Joyce wrote the screenplay for an earlier version, The Omega Man (1971), starring Charlton Heston. And viewers of Battle for the Planet of the Apes (1973) might be surprised to learn that Bill wrote the film’s screenplay while still a law student. All told, Bill and Joyce wrote five screenplays and one television movie. Free from the constraints of university bureaucracy, Bill collaborated with Joyce on various television daytime dramas, including Search for Tomorrow, Another World, Texas, Capitol, One Life to Live, Superior Court, and, most notably, General Hospital. These ventures gained the favor of Hollywood stars, and Bill and Joyce eventually moved to Malibu.
Bill constantly molded and remolded his image, embracing Southern signifiers while altering their various expressions. His early photos suggest a pensive, put-together gentleman wearing ties and sport coats and smoking pipes. Later photos depict a rugged man clad in western wear. Still later photos conjure up the likes of Roy Orbison, what with Bill’s greased hair, cigarettes, and dark sunglasses.
Whatever his looks, Bill was a stark, provocative, and profoundly sensitive writer. His impressive oeuvre has yet to receive the critical attention it deserves. That scholars of conservatism, to say nothing of scholars of Southern literature, have ignored this man is almost inconceivable. There are no doubt many aspects of Bill’s life and literature left to be discovered. As Bill’s friend William Mills put it, “I believe there is a critique of modernity throughout [Bill’s] writing that will continue to deserve serious attentiveness and response.”
On Thanksgiving Day, November 24, 1988, Bill suffered a heart attack and died. He was 56. His last words, echoing Stonewall Jackson, were, “It’s all right.”
The following is part of a series of chapter-by-chapter analyses of Friedrich Hayek’s The Road to Serfdom, conducted as part of The Mendenhall’s expanding Capitalist Reader’s Guide project. Previous entries can be found here: Introduction, Chapter 1, 2, 3, 4, 5, and 6.
In “Economic Control and Totalitarianism,” the seventh chapter of The Road to Serfdom, we find Hayek at his best, writing with a clarity and reason that we have not seen since chapter two, “The Great Utopia.” In chapter seven, Hayek expounds upon numerous themes within the titular subject: the inextricability of dictatorial control and economic planning, the fallacy of believing that economic controls can be separated from broader political controls, the inevitability in a planned economy of controls extending to individuals’ choice of profession, and the interrelation of economic and political freedom. What criticisms we might make of the chapter arise either from a desire for him to take his line of thinking a step further than he does or from mistakes already established in previous chapters and carried over here. Despite a few minor missteps, however, Hayek’s chapter is, overall, an exceedingly positive contribution.
He begins by stating what is, to many self-deceiving advocates of socialism, a jarring observation: that planned economies, following their natural course, ultimately always require dictatorial rule. “Most planners who have seriously considered the practical aspects of their task,” Hayek writes, “have little doubt that a directed economy must be run on more or less dictatorial lines” (66). Without fully restating the argument here, Hayek implicitly rests upon the description of this tendency that he spelled out in chapter five, “Planning and Democracy”: power in a planned system gradually consolidates into a central committee or single dictator as a matter of organizational efficiency, with a decisive central leadership winning out over the gridlock and inefficiencies of a democratic body. The point is as valid and well made here as it was then.
Hayek expands upon this by refuting one of the false promises often made by planners as they reach for the reins of a country’s economy: “the consolation… that this authoritarian direction will apply ‘only’ to economic matters” (66). Contrary to the suggestion that controls will be limited to economic affairs, Hayek asserts that economic controls in the absence of broader political controls are not simply unlikely, but impossible. Rather than simply detailing in a typical way the interrelationship of economic and other activities, Hayek acknowledges the inseparability of the two, writing, “It is largely a consequence of the erroneous belief that there are purely economic ends separate from the other ends of life” (66). He later elaborates:
“The authority directing all economic activity would control not merely the part of our lives which is concerned with inferior things; it would control the allocation of the limited means for all our ends. And whoever controls all economic activity controls the means for all our ends, and must therefore decide which are to be satisfied and which not. This is really the crux of the matter. Economic control is not merely control of a sector of human life which can be separated from the rest; it is the control of the means for all our ends” (68).
Hayek’s point is, in the context of modern economic education, a largely underappreciated and mishandled one. Economics instructors have, with time, lost the important skill of contextualizing economic interests within the broader scope of other human pursuits, instead treating them either as abstract ideas toyed with in a vacuum without real-world ramifications or preaching the ‘economics is everything’ doctrine to the exclusion of other analytical tools and frameworks.
Hayek, whether by virtue of writing at a time less bound by such false dichotomization of the field or simply due to his exceptional qualities as an economic thinker, successfully avoids both traps. “Strictly speaking,” he writes,
“there is no ‘economic motive’ but only economic factors conditioning our striving for other ends. What in ordinary language is misleadingly called the ‘economic motive’ means merely the desire for general opportunity, the desire for power to achieve unspecified ends. If we strive for money it is because it offers us the widest choice in enjoying the fruits of our efforts” (67).
Hayek rightly acknowledges money as a profoundly empowering economic good, calling it “one of the greatest instruments of freedom ever invented by man” that “opens an astounding range of choice to the poor man, a range greater than that which not many generations ago was open to the wealthy” (67).
Chapter seven goes on to briefly characterize the pervasiveness of central planning, and its propensity to spread to all areas of a society. Hayek recognizes that the oft-evaded question of socialism versus capitalism is not simply one of which decisions individuals are to make for their lives, but whether the decision is to be theirs at all:
“The question raised by economic planning is, therefore, not merely whether we shall be able to satisfy what we regard as our more or less important needs in the way we prefer. It is whether it shall be we who decide what is more, and what is less, important for us, or whether this is to be decided by the planner” (68).
Those on both sides of the aisle in the United States today, who fail in so many matters to appreciate the distinction between individuals choosing the right thing for their lives and a government official imposing their choice (be it right or wrong) upon them, would do well to heed Hayek’s warning. Modern American political thinking, caught between an increasingly authoritarian left (taken directly from Marx and Rousseau, or updated via modern incarnations like Krugman, Sunstein, and Stiglitz) and a right that has yet to extend its limited-government spirit to all areas of economics—much less censorship and social issues—has a great deal to learn from an Austrian economist’s words written some seventy years ago.
One element of central planning that utopian-minded young socialist idealists evade is that labor, being an input, must, in a controlled economy, be as controlled as any other good—if not more so. This does not mean simply the control of wages or the maintenance of unions. Ultimately, it means government control over the quantity of individuals in a given profession, conducted in the interest of keeping wages in a given field high and ensuring that there is an adequate supply of expertise to meet all of the economy’s needs. This means at some point dictating who can and cannot enter a given field of work.
“Most planners, it is true, promise that in the new planned world free choice of occupation will be scrupulously preserved or even increased. But there they promise more than they can possibly fulfill. If they want to plan they must control the entry into the different trades and occupations, or the terms of remuneration, or both” (71).
How many young socialists on college campuses across the country would not object to being torn from their chosen course of study and compelled to study for degrees in which they had no interest, to spend their lives in careers they did not love? That is the fate that they ask for, whether they recognize it as such or not. Would they accept it willingly? Would they “become a mere means, to be used by the authority in the service of such abstractions as the ‘social welfare’ or the ‘good of the community’” (72), bowing their heads subserviently to spend a life on a path that was chosen for them, for the good of society? Perhaps some. And perhaps others would recognize the nature of what they profess to believe in and renounce it. Either way, it is a reality that should be presented to them in those terms by those who see socialism for what it is.
Towards the end of the chapter, Hayek makes several key observations that would prove all the more true in the decades after his writing. He notes the decline of socialist advocates’ appeals to the functional superiority of their system. Gradually witnessing their system being discredited, but doubling down on their dogma, the socialists of the mid-20th century came to look less and less like those of the early 20th century, who believed in the system as a technically superior model for society. Instead, their arguments turned egalitarian in nature, “advocat[ing] planning no longer because of its superior productivity but because it will enable us to secure a more just and equitable distribution of wealth” (74). Little did Hayek know how far that trend would go with the rise of the New Left and its legacies, stretching up to the present and the current American administration.
Finally, in another point that has proven all the more true since the time of his writing, Hayek recognizes that the extent of planning proposed by socialism, empowered by modern modes of control, is that much greater than the control and subjugation that occurred in the days of monarchy and feudalism. In reading it, one is brought to wonder how much greater that mechanism of control is today, with NSA surveillance, a growing regulatory state, and ever more executive agencies maintaining armed units to impose their rules, than at Hayek’s writing in 1943.
Hayek’s seventh chapter is a valuable and, for the same reasons, saddening one for the way that it makes us reflect upon the applicability of his words and ideas to our current political environment. Though our current condition is far from totalitarian in nature, the same principles apply, to a lesser extent, in all areas where government intrudes to control markets, alter incentives, or provide special advantages to some at the expense of others.
Human beings are rational animals. We respond to the incentives around us. In the presence of a government that seems increasingly and explicitly willing to toy with those incentives to alter our behavior to suit models and ideals for our lives that are not our own, how much do we lose that we never knew we had? In what ways are our options limited? Need it be by a government edict that tells a young man who would study to be a doctor that doctors are no longer needed, and he should apply to be an engineer instead? No. It may be as subtle as inflating the price of his education through government loan programs, regulating the field he seeks to enter, and subjecting him to entitlement programs that tell him that his life’s work is not his own; that he works and exists in the service of society as a whole. And at that point, the difference between our condition and the ill fate that Hayek describes becomes one not of kind, but of degree.
This first appeared here at The American Spectator.
One of the Supreme Court opinions everyone is buzzing about — last Monday’s decision in Fisher v. University of Texas at Austin, a case involving that school’s affirmative action program — will not be monumental in our canons of jurisprudence.
The petitioner, Abigail Noel Fisher, a young white woman, applied to the university in 2008 and was denied admission. She challenged the decision, arguing that she would have been admitted under a colorblind system. The high court has now remanded the case to the Fifth Circuit, holding that the lower court failed to properly ascertain whether the affirmative action program was the most narrowly tailored means to achieve the university’s diversity goal. In legal terms, the Fifth Circuit had failed to subject the program to “strict scrutiny.” Thus, additional litigation lies ahead; the case is far from over.
What will be remembered from Monday’s proceedings, though, is Justice Thomas’ concurrence, which treats affirmative action as paternalism — a word he implies but doesn’t use explicitly, at least not here.
The dichotomies “liberal” versus “conservative,” “left” versus “right,” complicate rather than clarify issues such as affirmative action. A better choice of words, if a dichotomy must be maintained, is “paternalism” versus “non-paternalism.” Viewing diversity in this light, as Justice Thomas does, enables us to understand and appreciate the forms that racism and discrimination take.
Those forms often are paternalistic: Person A assumes to understand the plight of person X and undertakes to care for and control him as a father would his children. Even if X were one day to achieve relative equality with A in real terms — opportunity, education, earning capacity — this dominance would persist so long as A views X as a needy inferior, and so long as X allows that presumption to persist.
Thomas’s concurrence places such toxic ideas under a microscope, and exposes the ironic double standards of those who resort to paternalism. For instance, the bulk of his concurrence describes how the university’s arguments in favor of affirmative action are the same or substantially similar to those once used to justify racial segregation and even slavery. “There is no principled distinction,” Thomas writes, “between the University’s assertion that diversity yields educational benefits and the segregationists’ assertion that segregation yielded those same benefits.”
Likewise, he adds, “Slaveholders argued that slavery was a ‘positive good’ that civilized Blacks and elevated them in every dimension of life.” Advocates of slavery and segregationists both argued, in other words, that their policies bettered the conditions of Blacks and minimized racial hostility on the whole. The form of these racist arguments is now being used to justify state discrimination through affirmative action programs.
The segregationists argued that integrated public schools would suffer from white flight; proponents of affirmative action argue that universities will suffer from a lack of diversity if discrimination is not allowed.
The segregationists argued that blacks would become the victims of desegregation once white children withdrew from public schools en masse and that separate but equal schools improved interracial relations; proponents of affirmative action likewise argue that minorities will be the victims if affirmative action programs are deemed unconstitutional and that diversity on campus improves interracial relations.
The segregationists argued that separate but equal schools allowed blacks to enjoy more leadership opportunities; proponents of affirmative action likewise argue that affirmative action programs empower minorities to become leaders in a diverse society.
The segregationists argued that although separate but equal schools were not a perfect remedy for racial animosity, such schools were nevertheless a practical step in the right direction; proponents of affirmative action likewise argue that it, although not ideal, nevertheless generates race consciousness among students.
In the face of these surprising parallels, Justice Thomas maintains that “just as the alleged educational benefits of segregation were insufficient to justify racial discrimination” during the Civil Rights Era, so “the alleged educational benefits of diversity cannot justify racial discrimination today.”
He should not be misunderstood as equating affirmative action with the discrimination unleashed upon blacks and other minorities throughout American history. Although he acknowledges that affirmative action does harm whites and Asians, he is chiefly concerned with how such discrimination harms its intended beneficiaries: above all, blacks and Hispanics. “Although cloaked in good intentions,” Thomas submits, “the University’s racial tinkering harms the very people it claims to be helping.” He adds that “the University would have us believe that its discrimination is…benign. I think the lesson of history is clear enough: Racial discrimination is never benign.”
Why aren’t affirmative action programs — which Justice Thomas at one point refers to as “racial engineering” — benign? He gives several reasons: They admit blacks and Hispanics who aren’t as prepared for college as white and Asian students; they do not ensure that blacks and Hispanics close the learning gap during their time in college; they do not increase the overall number of blacks and Hispanics who attend college; and they encourage unqualified applicants to graduate from great schools as mediocre students instead of good schools as exceptional students. Moreover, Justice Thomas cites studies showing that minorities interested in science and engineering are more likely to choose different paths when they are forced to compete with other students in those disciplines at elite universities. What Justice Thomas considers most damning of all, however, is the “badge of inferiority” stamped on racial minorities as a result of affirmative action.
Just one small personal example: When I was in law school, a few of the guys in my study group began comparing professors, as students do regularly, and they were quite open in their opinion that our black professor could not have been as intelligent, because she had benefited from affirmative action programs.
The rationalist lawyer does not disparage an ideal on the grounds that it does not work or cannot be tried. “He has no sense of the cumulation of experience,” Michael Oakeshott bemoaned of the rationalist, “only of the readiness of experience when it has been converted into a formula: the past is significant to him only as an encumbrance.” The lawyer is a rationalist insofar as he is interested in a past that supplies him with the precedents and procedures that steer his practice and win his battles; such a past is an encumbrance because it never exists in the pure form that the lawyer seeks and needs. Therefore, the lawyer must push against the past, reinvent it, stretch it, mold it into a usable form; the past, for him, is a religion of malleability: to be faithful to it is to rewrite or reinterpret it.
The lawyer, being a rationalist, minces words and retards conventions to achieve the goals that benefit him and his client, paying little regard to whether his chosen grammar and syntax will impair the harmony of the community. He is trained, not educated; progressive, not conservative. His aim is to innovate in the service of short-lived victories. To be a good lawyer is not necessarily or even usually to be a moral or thoughtful person; it is to zealously represent the client by aligning the law with the facts of the case as they have been filtered through the minds and mouths of the parties. It is to prevail by fusing abstract rules with secondhand information. The lawyer, accordingly, is intelligent—highly so—but not honorable or ethical. He is, in short, a repository into which filtered discourse flows, and through which discourse is enunciated into the machine of the system for further processing.
“[H]aving cut himself off from the traditional knowledge of his society, and denied the value of education more extensive than a training in a technique of analysis,” Oakeshott continues of the rationalist—or, for my purposes, the lawyer—“is apt to attribute to mankind a necessary inexperience in all the critical moments of life.” Hence the trouble with the lawyer: his ambition is rarely tempered by his inadequacies, his analytic mind seeks out models for the mastery of human behavior, his poise in the face of adversity betrays his naiveté, his reliance on his own intents and purposes for action (rather than on those of his ancestors or immediate community) reveals a grave shortsightedness that can lead only to subtle and progressive harm.
Do not misunderstand me: what I call “the lawyer” is an archetype, not a group of named individuals. The common legal practitioner is not an Iago bent on weaving webs of wickedness with motives only sinister. But the lawyer archetype, like all archetypes, contains truth. It is because Atticus Finch is so unlike the typical lawyer that he stands out in our memory and is said to have redeemed the law. Lawyer jokes did not arise in a vacuum; and the rules of ethics and professional responsibility did not come about because the public considered lawyers to be noble and upright. So, when I refer to “the lawyer,” I do not mean any one man or woman, nor each and every lawyer, but I do mean to signal (1) the symbol of the lawyer that is based on real patterns of behavior, which are passed from one generation of lawyers to the next; (2) a personality type that can and has been observed in lawyers in different times and places; and (3) a model that lawyers have emulated and perpetuated to their own detriment.
The following review first appeared here at The University Bookman.
John Randolph of Roanoke
by David Johnson.
Baton Rouge: Louisiana State University Press, 2012
“I am an aristocrat. I love liberty, I hate equality.” Thus spoke John Randolph of Roanoke (1773–1833), one of the most curious, animated figures ever to grace American soil. That David Johnson’s biography of Randolph is the first of its kind since Russell Kirk published John Randolph of Roanoke in 1951 suggests how deteriorated American memory and education have become. Randolph ought to be studied by all American schoolchildren, if not for his politics then for the vital role he played in shaping the nation’s polity. Dr. Kirk declared that in writing about Randolph, he was summoning him from the shades. If so, Johnson has gone a step farther and brought Randolph into the sunshine to reveal just how spectacular a man he really was.
Kirk’s biography of Randolph was in fact his first book. Kirk dubbed the colorful Virginian a “genius,” “the prophet of Southern nationalism,” and the “architect of Southern conservatism.” In The Conservative Mind, Kirk treats Randolph as a necessary link between George Mason and John C. Calhoun and proclaims that Randolph should be remembered for “the quality of his imagination.” Randolph enabled the proliferation and preservation of the conservative tradition in America. He became an icon for decentralization and localism.
Why would a scandalous, sickly, go-it-alone, riotous rabble-rouser appeal to the mild-mannered Dr. Kirk? The answer, in short, is that Randolph was as conservative a politician as America has ever produced, and he was, despite himself, a gentleman and a scholar. Eccentric though he appeared and often acted, Randolph celebrated and defended tradition, championed small government and agrarianism, sacrificed careerism and opportunism for unwavering standards, professed self-reliance and individualism, took pains to preserve the rights of the states against the federal government, delighted in aristocratic tastes and manners, read voraciously the great works of Western civilization, cultivated the image of a statesman even as he attended to the wants and needs of his yeomen constituents, discoursed on weighty topics with wit and vigor, and adhered to firm principles rather than to partisan pandering. Admired by many, friend to few, he made a prominent display of his wild personality and unconditional love for liberty, and he devoted himself, sometimes at great cost, to the ideals of the American Revolution, which had, he claimed, marked him since childhood.
Remembered chiefly (and, in the minds of some progressives, unfortunately) for his contributions to states’ rights doctrines and to the judicial hermeneutics of strict constructionism, Randolph was responsible for so much more. The son of a wealthy planter who died too young, Randolph became the stepson of St. George Tucker, a prominent lawyer who taught at the College of William and Mary and served as a judge on the Virginia General District Court and, eventually, on the Virginia Court of Appeals, the United States District Court for the District of Virginia, and the United States District Court for the Eastern District of Virginia. A cousin to Thomas Jefferson, Randolph studied under George Wythe and his cousin Edmund Randolph. A boy who was forced to flee his home from the army of Benedict Arnold, Randolph later played hooky from college to watch the orations of Fisher Ames, the stout Federalist from Massachusetts, and Madison. He served in the U.S. House of Representatives as well as the U.S. Senate, and was, for a brief time, Minister to Russia. A supporter of Jefferson before he became Jefferson’s tireless adversary, he criticized such individuals as Patrick Henry, Washington, Madison, Monroe, John Adams, Henry Clay, and Daniel Webster. He was sickened by the Yazoo Land Scandal, opposed the War of 1812 in addition to the Missouri Compromise, and promoted nullification.
Many conservatives, Kirk among them, have tended to overlook the more unpalatable aspects of Randolph’s life, whether personal or political. For instance, Randolph was, more than Jefferson, enthralled by the French Revolution and supportive of its cause. He manufactured a French accent, used a French calendar, and called his friends “Citizens.” In his twenties, he referred to himself as a deist “and by consequence an atheist,” and he acquired, in his own words, “a prejudice in favor of Mahomedanism,” going so far as to proclaim that he “rejoiced in all its [Islam’s] triumphs over the cross.” One might excuse these infelicities as symptoms of youthful indiscretion and impetuosity, but they do give one pause.
Not for lack of trying, Randolph could not grow a beard, and although he spoke well, his voice was, by most accounts, awkward, piping, off-putting, and high-pitched. His critics have painted him as a villain of the likes of Shakespeare’s Richard III: resentful, obstinate, loudmouthed, and as deformed in the mind as he was in the body. Yet Randolph cannot be made into a monster. More than others of his station in that time and place, Randolph was sensitive to the problems of slavery, which had only intensified rather than diminished since the Founding. He freed his slaves in his will, granted them landholdings in Ohio, and provided for their heirs. Slavery was incompatible with liberty, and Randolph, despite being a product of his time, appears to have worried much about the paradox of a nation conceived in liberty but protective of institutional bondage. Randolph asserted, in some way or another, over and over again, that his politics were based on a presumption of liberty, which was (and is) the opposite of slavery and governmental tyranny.
“[C]ontemporary theory […] has, among other things, been committed to the mission of criticizing and discrediting this very hermeneutic model of the inside and the outside and of stigmatizing such models as ideological and metaphysical. But what is today called contemporary theory—or better still, theoretical discourse—is also, I want to argue, itself very precisely a postmodernist phenomenon. It would therefore be inconsistent to defend the truth of its theoretical insights in a situation in which the very concept of ‘truth’ itself is part of the metaphysical baggage which poststructuralism seeks to abandon. What we can at least suggest is that the poststructuralist critique of the hermeneutic, of what I will shortly call the depth model, is useful for us as a very significant symptom of the very postmodernist culture which is our subject here.”
—Fredric Jameson, from Postmodernism, or, the Cultural Logic of Late Capitalism
Fredric Jameson’s Postmodernism, or, the Cultural Logic of Late Capitalism is a defining work about definition—specifically, about what “postmodernism” is. Rather than reducing postmodernism to one quality or characteristic, Jameson lays out several qualities or characteristics as manifest in works of literature, architecture, painting, and so forth. To say that postmodernism is a single thing is to ignore various flows, assemblages, networks, contradictions, tensions, and trajectories summoned forth by this slippery signifier. It is, in short, to be non-postmodern.
Jameson dislikes postmodernism and does not set out to be postmodern, even if he is, or cannot help but be, postmodern; he seeks to describe postmodernism in order to generate a working, demarcating explanation. Criticizing the “camp-following celebration” of the postmodern aesthetic, the “current fantasies about the salvational nature of high technology,” and the “vulgar apologias for postmodernism,” Jameson views the postmodern as penetrated and constituted by late capitalism. For Jameson, the postmodern is less a political program than a moment in time with certain defining characteristics; the postmodern is not an ideological agenda, but something we are in, whether we like it or not.
The trouble with describing the postmodern, as Jameson suggests in the passage above, is that it can refer to various phenomena, from artistic and cultural developments to social and political organization. One thing seems clear from the prefix “post”: postmodernism replaces (or displaces) the modern. It comes after. Therefore, postmodernism must be marked by a break from its predecessor. What this break is, and how it materializes among social forces, determines what postmodernism means, or at least what it looks like.
“Contemporary theory” or “theoretical discourse,” which has arisen alongside mass mechanical reproductions in the arts as well as commodity culture in every realm of human experience, and which, moreover, is neither unified nor uniform, is a product of postmodernism. This “Theory” (with a capital T) is splintered into numerous methodologies and logics, but generally holds that meaning is fluid, fragmented, and indeterminate. That statement does not do justice to the nuance and complexity of the subject.
At any rate, one has, as Jameson points out, trouble defending the “truth” of postmodernism’s “theoretical insights” because “the very concept of ‘truth’ itself is part of the metaphysical baggage which poststructuralism seeks to abandon.” Jameson rejects a wholesale and unquestioning commitment to poststructuralism, which he tends to conflate with postmodernism. Perhaps it is more accurate to say that Jameson sees in the postmodern the propensity toward domination, a decidedly essentialized (and essentializing) category of discourse. Too much reliance on the postmodern, according to Jameson, leads to relativism or nihilism. Jameson does not use those terms, but he does say that if “we do not achieve some general sense of a cultural dominant, then we fall back into a view of present history as sheer heterogeneity, random difference, a coexistence of a host of distinct forces whose effectivity is undecidable.”
Representing an arguably conservative shift away from other theorists—conservative with regard to ontology or metaphysics, not social politics—Jameson seeks to recover concepts like “dominance,” which are central to Marxist criticism, by arguing that critical theory such as poststructuralism is symptomatic of capitalism itself. Accordingly, we ought to study postmodernism as a result of capitalism’s rise to maturity. In this respect, Jameson revives Marxist criticism, which in many ways stands in contradistinction to postmodernism. Marxist criticism, after all, seems to subsume and encompass other theories, especially those that purport to explode all meanings; it is overarching and paradigmatic. It cultivates ideas about fixed categories—like dominance—that signify definite and resolved concepts.
Although what or who dominates is always changing, the idea of domination remains relatively stable. Marxism is therefore incompatible with postmodern theories that would do away with any and all “historicity”—to say nothing of essentializing concepts such as the bourgeois or even the self. For Jameson, Marxist theory remains useful and instructive. It is not just the constant play of simulacra or the mere trace of signification. Rather, it offers a viable and effective method for critiquing globalized consumer culture and ideology, both of which are evident in the frantic insistences on the superiority of the postmodern condition.
Hyperspecialization in the academy is an enemy of the permanent things. It has caused scholars to become bogged down in particular eras and woefully constrained in their knowledge of people, events, and ideas from periods outside of their specialization. The result is that scholars tend to see the world through the lens of their narrow academic focus. A historian of 19th century American slavery will try to find the residue of slavery in all features of the present era. He may not realize how distorted his interpretation of the present is in light of his immersion in his scholarly field. Moreover, his data are atomized; therefore, he cannot have a comprehensive sense of the trajectory of history.
A related problem is over-infatuation with present ideas. There was a time when philosophies prevailed for centuries, but lately new philosophies seem to spring up every decade. For thinkers to commit unreservedly to a present philosophical fad is to guarantee their intellectual obsolescence. Close association with fleeting fancies will blind thinkers to the different manifestations of traditional theories, and it is an awareness of the varying manifestations of similar theories that characterizes the great thinker.
There are benefits to specialization to the extent that it generates efficiency in the way that, in economic terms, division of labor generates efficiency: one scholar works on details that supplement the details provided by another scholar and so on until all of the details in the aggregate enable us to draw general conclusions. But this process occurs to the detriment of the individual scholar, who becomes alienated from the general conclusions because his profession diverts his activities to the details and minutiae. We need more scholars who are aware of the general conclusions and can identify and illuminate the permanent things.
A rigorous study of the permanent things provides the lodestar for evaluating particular ideas against that which has been tested and tried already. Ideas that seem new have traceable historical antecedents, and individuals equipped with a fundamental knowledge of the permanent things are able to put seemingly novel ideas into their proper context. Such individuals recognize that change is not always evolution; sometimes it is deterioration. They also acknowledge the value of intellectual flexibility: to spot and utilize ideas with which one disagrees enables the integration of information that, in turn, enables a more thorough understanding.