
Posts Tagged ‘Walt Whitman’

Review of Stephen Budiansky’s “Oliver Wendell Holmes Jr.”

In Academia, America, American History, American Literature, Arts & Letters, Book Reviews, Books, Historicism, History, Humanities, Jurisprudence, Law, liberal arts, Oliver Wendell Holmes Jr., Philosophy, Pragmatism, Scholarship, Western Philosophy on September 25, 2019 at 6:45 am

This review originally appeared here in the Los Angeles Review of Books.

Do we need another biography of Oliver Wendell Holmes Jr., who served nearly 30 years as an Associate Justice of the United States Supreme Court and nearly 20 years before that on the Massachusetts Supreme Judicial Court? He has been the subject of numerous biographies since his death in 1935. We have not discovered new details about him since Harvard made his papers available to researchers in 1985, so why has Stephen Budiansky chosen to tell his story?

The answer may have to do with something Holmes said in The Common Law, his only book: “If truth were not often suggested by error, if old implements could not be adjusted to new uses, human progress would be slow. But scrutiny and revision are justified.”

Indeed, they are — both in the law and in the transmission of history. Holmes has been so singularly misunderstood by jurists and scholars that his life and thought require scrutiny and revision. Because his story is bound up with judicial methods and tenets — his opinions still cited regularly, by no less than the US Supreme Court as recently as this past term — we need to get him right, or at least “righter,” lest we fall into error, sending the path of the law in the wrong direction.

A veritable cottage industry of anti-Holmes invective has arisen on both the left and the right of the political spectrum. No one of any political persuasion, it seems, wants to adopt Holmes. He’s a giant of the law with no champions or defenders.

For some critics, Holmes is the paragon of states’ rights and judicial restraint who upheld local laws authorizing the disenfranchisement of blacks (Giles v. Harris, 1903) and the compulsory sterilization of individuals whom the state deemed unfit (Buck v. Bell, 1927). This latter decision he announced with horrifying enthusiasm: “Three generations of imbeciles are enough.” For other critics, he’s the prototypical progressive, decrying natural law, deferring to legislation that regulated economic activity, embracing an evolutionary view of law akin to living constitutionalism, and bequeathing most of his estate to the federal government.

The truth, as always, is more complicated than tendentious caricatures. Budiansky follows Frederic R. Kellogg — whose Oliver Wendell Holmes Jr. and Legal Logic appeared last year — in reconsidering this irreducible man who came to be known as the Yankee from Olympus.

Not since Mark DeWolfe Howe’s two-volume (but unfinished) biography, The Shaping Years and The Proving Years, has any author so ably rendered Holmes’s wartime service. Budiansky devotes considerable attention to this period, perhaps because it fundamentally changed Holmes. Before the war, Holmes, an admirer of Ralph Waldo Emerson, gravitated toward abolitionism and volunteered to serve as a bodyguard for Wendell Phillips. He was appalled by a minstrel show he witnessed as a student. During the war, however, he “grew disdainful of the high-minded talk of people at home who did not grasp that any good the war might still accomplish was being threatened by the evil it had itself become.”

Holmes had “daddy issues” — who wouldn’t with a father like Oliver Wendell Holmes Sr., the diminutive, gregarious, vainglorious, and sometimes obnoxious celebrity, physician, and author of the popular “Breakfast Table” series in The Atlantic Monthly? — that were exacerbated by the elder Holmes’s sanctimonious grandstanding about his noble, valiant son. For the aloof father, the son’s military service was a status marker. For the son, war was gruesome, fearsome, and real. The son despised the father’s flighty ignorance of the on-the-ground realities of bloody conflict.

Holmes fought alongside Copperheads as well, a fact that might have contributed to his skepticism about the motives of the war and the patriotic fervor in Boston. His friend and courageous comrade Henry Abbott — no fan of Lincoln — died at the Battle of the Wilderness in a manner that Budiansky calls “suicidal” rather than bold. The war and its carnage raised Holmes’s doubts regarding “the morally superior certainty that often went hand in hand with belief: he grew to distrust, and to detest, zealotry and causes of all kinds.”

This distrust — this cynicism about the human ability to know anything with absolute certainty — led Holmes as a judge to favor decentralization. He did not presume to understand from afar which rules and practices optimally regulated distant communities. Whatever legislation they enacted was for him presumptively valid, and he would not impose his preferences on their government. His disdain for his father’s moralizing, moreover, may have contributed to his formulation of the “bad man” theory of the law. “If you want to know the law and nothing else,” he wrote, “you must look at it as a bad man, who cares only for the material consequences which such knowledge enables him to predict, not as a good one, who finds his reasons for conduct, whether inside the law or outside of it, in the vaguer sanctions of conscience.”

Budiansky’s treatment of Holmes’s experience as a trial judge — the Justices on the Massachusetts Supreme Judicial Court in those days presided over trials of first instance — is distinctive among the biographies. Budiansky avers,

[I]n his role as a trial justice, Holmes was on the sharp edge of the law, seeing and hearing firsthand all of the tangled dramas of the courtroom, sizing up the honesty of often conflicting witnesses, rendering decisions that had immediate and dramatic consequences — the breakup of families, financial ruin, even death — to the people standing right before him.

Holmes’s opinions as a US Supreme Court Justice have received much attention, but more interesting — perhaps because less known — are the salacious divorce cases and shocking murder trials he handled with acute sensitivity to evidence and testimony.

Budiansky skillfully summarizes Holmes’s almost 30-year tenure on the US Supreme Court, the era for which he is best known. He highlights Holmes’s dissenting opinions and his friendship with Justice Louis Brandeis, who was also willing to dissent from majority opinions — and with flair. For those looking for more detailed narratives about opinions Holmes authored as a Supreme Court Justice, other resources are available. Thomas Healy’s The Great Dissent, for example, dives more deeply into Holmes’s shifting positions on freedom of speech. Healy spends a whole book describing a jurisprudential development that Budiansky covers in one chapter.

Contemptuous of academics, Budiansky irrelevantly claims that “humorless moralizing is the predominant mode of thought in much of academia today.” He adds, “A more enduring fact about academic life is that taking on the great is the most reliable way for those who will never attain greatness themselves to gain attention for themselves.” Harsh words! Budiansky accuses the French historian Jules Michelet of rambling “on for pages, as only a French intellectual can.” Is this playful wit or spiteful animus? Is it even necessary?

Budiansky might have avoided occasional lapses had he consulted the academics he seems to despise. For instance, he asserts that the “common law in America traces its origins to the Middle Ages in England […] following the Norman invasion in 1066,” and that the “Normans brought with them a body of customary law that, under Henry II, was extended across England by judges of the King’s Bench who traveled on circuit to hold court.” This isn’t so. Writing in The Genius of the Common Law, Sir Frederick Pollock — “an English jurist,” in Budiansky’s words, “whose friendship with Holmes spanned sixty years” — mapped the roots of the common law “as far back as the customs of the Germanic tribes who confronted the Roman legions when Britain was still a Roman province and Celtic.” In other words, Budiansky is approximately one thousand years off. Rather than supplanting British customs, the Normans instituted new practices that complemented, absorbed, and blended with British customs.

The fact that Budiansky never mentions some of the most interesting researchers working on Holmes — Susan Haack, Seth Vannatta, and Catharine Wells come to mind — suggests willful ignorance, the deliberate avoidance of the latest scholarship. But to what end? For what reason?

It takes years of study to truly understand Holmes. The epigraph to Vannatta’s new edition, The Pragmatism and Prejudice of Oliver Wendell Holmes Jr., aptly encapsulates the complexity of Holmes’s thought with lines from Whitman’s Song of Myself: “Do I contradict myself? / Very well then I contradict myself, / (I am large, I contain multitudes.)” Budiansky recognizes, as others haven’t, that Holmes was large and contained multitudes. Holmes’s contradictions, if they are contradictions, might be explained by the famous dictum of his childhood hero, Emerson: “A foolish consistency is the hobgoblin of little minds.”

Holmes was consistently inconsistent. His mind was expansive, his reading habits extraordinary. How to categorize such a wide-ranging man? What were the defining features of his belief? Or did he, as Louis Menand has alleged, “lose his belief in beliefs”? Budiansky condenses Holmes’s philosophy into this helpful principle: “[T]hat none of us has all the answers; that perfection will never be found in the law as it is not to be found in life; but that its pursuit is still worth the effort, if only for the sake of giving our lives meaning.”

Holmes was intellectually humble, warning us against the complacency that attends certainty. Driving his methods was the sober awareness that he, or anyone for that matter, might be incorrect about some deep-seated conviction. During this time of polarized politics, self-righteous indignation, widespread incivility, and rancorous public discourse, we could learn from Holmes. How civil and respectful we could be if we all recognized that our cherished ideas and working paradigms might, at some level, be erroneous; if we were constantly mindful of our inevitable limitations; if we were searchers and seekers who refused to accept, with utter finality, that we’ve figured it all out.

What, Then, is Creativity?

In Arts & Letters, Creativity, Humanities, liberal arts, Philosophy, Teaching on January 9, 2019 at 6:45 am

This piece originally appeared here in The Imaginative Conservative.

Last week a student asked me, “What is creativity?” I was unsure how to respond. I felt like the speaker from Leaves of Grass musing about a child, who, fetching a handful of grass, asks him what the grass is. “How could I answer the child?” the speaker wonders. “I do not know what it is any more than he.”

What is creativity? How could I answer the student? I did not know what it was any more than he. My ignorance on this subject nevertheless inspired me to seek understanding, perhaps even a definition, and then to proffer brief, explanatory remarks. Here they are, principally for his benefit but also for mine—and for that of anyone, I suppose, who cares to consider them.

Every human, I think, is the handiwork of God. If humans are created in God’s image, and God is our creator, then humanity’s creativity is, or might be, a limited, earthly, imperfect glimpse into the ways and workings of God. “We too,” said Paul Elmer More, “as possessors of the word may be called after a fashion children of the Most High and sons of the Father, but as creatures of His will we are not of His substance and nature, however we may be like Him.”[1]

Inherently flawed and sinful, humans cannot create what or as God creates and cannot be divine. Our imagination can be powerfully dark, dangerous, and wicked. The Lord proclaimed in the Noahic covenant that “the imagination of man’s heart is evil from his youth.”[2] Construction of the Tower of Babel demonstrated that the unified power of ambitious men laboring together may engender impious unrestraint.[3]

Humans, however, being more rational and intelligent than animals, are supreme among God’s creation and bear the divine image of God. “What is man, that thou art mindful of him?” asks the psalmist, adding, “and the son of man, that thou visitest him? For thou hast made him a little lower than the angels, and hast crowned him with glory and honour.”[4]

Saint Peter—or the author of Second Peter if that book is pseudepigraphical—called humans potential partakers in the divine nature who have escaped the corruption of the world.[5] Saint Paul implied that followers of Christ enjoy something of Christ’s mind, some special understanding of Christ’s instructions.[6] He also suggested that followers of Christ, the saints, will judge not only the world but the angels,[7] beneath whom, in substance, we consist.[8]

What these passages mean, exactly, is subject to robust academic and theological debate, but surely humanity’s crowning artistic achievements—our paintings, sculptures, philosophies, architecture, poetry, theater, novels, and music—are starting points for exploration. What evidence have we besides these tangible products of our working minds that we who are not divine somehow partake in divinity?

Humans are moral, spiritual, social, creative, and loving, unlike the rest of God’s animate creation, only some of which, the animals, are also sentient. Aristotle and Aquinas, to say nothing of the author of Genesis, rank animals lower than humans in the hierarchy of living beings because, although sentient, they lack the discernible will, conscientiousness, consciousness, and capacity for reason that humans definitively possess. Moreover, animals provide humans with the necessary sustenance to survive, and our survival is indispensable to the advancement of knowledge and intelligence, themselves essential to the enjoyment and preservation of God’s creation.

All human life is sacred because of humanity’s godly nature,[9] which is a privilege with coordinate duties and responsibilities: to be fruitful and multiply and to subdue, or care for, the inferior creatures of the earth.[10] However awesome humanity’s creative faculties are, they are not themselves divine, and cannot be. “As the heavens are higher than the earth,” intones the prophet, “so are my ways higher than your ways and my thoughts than your thoughts.”[11]

Russell Kirk titled his autobiography The Sword of Imagination. A sword is a bladed weapon with a sharp, lethal point and sharp, lethal edges. It’s the symbol of medieval warriors and Romantic knights. The imagination, powerful like a sword, can be wielded for the forces of good or evil. It’s unsafe. But it can be channeled for moral and virtuous purposes.

Only God can create something from nothing. That the cosmos exists at all is proof of an originating, ultimate cause, of some supreme power that is antecedent to all material life and form. Human creativity, by contrast, is iterative and mimetic, not the generation of perceptible substance out of an absolute void.

Human creativity builds on itself, repurposing and reinvigorating old concepts and fields of knowledge for new environments and changed conditions. We learn to be creative even if we are born with creative gifts and faculties. Imitative practice transforms our merely derivative designs and expressions into awesome originality and innovation.

Creativity, then, is the ability of human faculties to connect disparate ideas, designs, and concepts to solve actual problems, inspire awe, heighten the emotions and passions, or illuminate the complex realities of everyday experience through artistic and aesthetic expression. The most creative among us achieve their brilliance through rigorous training and a cultivated association with some master or teacher who imparts exceptional techniques and intuitions to the pupil or apprentice; every great teacher was a student once.

Or so I believe, having thought the matter through. It may be that I know no more about creativity than I do about grass. But I know, deeply and profoundly, that we are fearfully and wonderfully made, and for that I am infinitely and earnestly grateful.


Notes:

[1] Paul Elmer More, The Essential Paul Elmer More (New Rochelle, N.Y.: Arlington House, 1972), p. 55.

[2] Genesis 8:21.

[3] Genesis 11:5-7.

[4] Psalm 8:4-5.

[5] 2 Peter 1:4.

[6] 1 Corinthians 2:16.

[7] 1 Corinthians 6: 2-3.

[8] Psalm 8:4-5.

[9] Genesis 9:6.

[10] Genesis 1:28.

[11] Isaiah 55:9.

Emersonian Individualism

In America, American History, Art, Arts & Letters, Creativity, Emerson, Epistemology, Essays, Humanities, Liberalism, Libertarianism, Literary Theory & Criticism, Literature, Nineteenth-Century America, Philosophy, Poetry, Politics, Pragmatism, Rhetoric, Santayana, Western Civilization, Western Philosophy, Writing on April 4, 2012 at 6:48 am

Allen Mendenhall

The following essay originally appeared here at Mises Daily.

Ralph Waldo Emerson is politically elusive. He’s so elusive that thinkers from various schools and with various agendas have appropriated his ideas to validate some activity or another. Harold Bloom once wrote, “In the United States, we continue to have Emersonians of the Left (the post-Pragmatist Richard Rorty) and of the Right (a swarm of libertarian Republicans, who exalt President Bush the Second).”[1] We’ll have to excuse Bloom’s ignorance of political movements and signifiers — libertarians who exalt President Bush, really? — and focus instead on Bloom’s point that Emerson’s influence is evident in a wide array of contemporary thinkers and causes.

Bloom is right that what “matters most about Emerson is that he is the theologian of the American religion of Self-Reliance.”[2] Indeed, the essay “Self-Reliance” remains the most cited of Emerson’s works, and American politicians and intellectuals selectively recycle ideas of self-reliance in the service of often disparate goals.

Emerson doesn’t use the term “individualism” in “Self-Reliance,” which was published in 1841, when the term “individualism” was just beginning to gain traction. Tocqueville unintentionally popularized the signifier “individualism” with the publication of Democracy in America. He used a French term that had no counterpart in English. Translators of Tocqueville labored over this French term because its signification wasn’t part of the English lexicon. Emerson’s first mention of “individualism” was not until 1843.

It is clear, though, that Emerson’s notion of self-reliance was tied to what later would be called “individualism.” Emerson’s individualism was so radical that it bordered on self-deification. Only through personal will could one realize the majesty of God. Nature for Emerson was like the handwriting of God, and individuals with a poetical sense — those who had the desire and capability to “read” nature — could understand nature’s universal, divine teachings.

Lakes, streams, meadows, forests — these and other phenomena were, according to Emerson, sources of mental and spiritual pleasure or unity. They were what allowed one to become “part or particle of God,” if only one had or could become a “transparent eyeball.” “Nothing at last is sacred,” Emerson said, “but the integrity of your own mind.” That’s because a person’s intellect translates shapes and forms into spiritual insights.

We cannot judge Emerson exclusively on the basis of his actions. Emerson didn’t always seem self-reliant or individualistic. His politics, to the extent that they are knowable, could not be called libertarian. We’re better off judging Emerson on the basis of his words, which could be called libertarian, even if they endow individualism with a religiosity that would make some people uncomfortable.

Emerson suggests in “Self-Reliance” that the spontaneous expression of thought or feeling is more in keeping with personal will, and hence with the natural world as constituted by human faculties, than that which is passively assumed or accepted as right or good, or that which conforms to social norms. Emerson’s individualism or self-reliance exalted human intuition, which precedes reflection, and it privileged the will over the intellect. Feeling and sensation are antecedent to reason, and Emerson believed that they registered moral truths more important than anything cognition could summon forth.

Emerson’s transcendentalism was, as George Santayana pointed out in 1911, a method conducive to the 19th-century American mindset.[3] As a relatively new nation seeking to define itself, America was split between two mentalities, or two sources of what Santayana called the “genteel tradition”: Calvinism and transcendentalism.

The American philosophical tradition somehow managed to reconcile these seeming dualities. On the one hand, Calvinism taught that the self was bad, that man was depraved by nature and saved only by the grace of God. On the other hand, transcendentalism taught that the self was good, that man was equipped with creative faculties that could divine the presence of God in the world. The Calvinist distrusted impulses and urges as sprung from an inner evil. The transcendentalist trusted impulses and urges as moral intuition preceding society’s baseless judgments and prevailing conventions.

What these two philosophies had in common was an abiding awareness of sensation and perception: a belief that the human mind registers external data in meaningful and potentially spiritual ways. The Calvinist notion of limited disclosure — that God reveals his glory through the natural world — played into the transcendentalists’ conviction that the natural world supplied instruments for piecing together divinity.

The problem for Santayana is that transcendentalism was just a method, a way of tapping into one’s poetical sense. What one did after that was unclear. Santayana thought that transcendentalism was the right method, but he felt that Emerson didn’t use that method to instruct us in practical living. Transcendentalism was a means to an end, but not an end in itself.

According to Santayana, Emerson “had no system” because he merely “opened his eyes on the world every morning with a fresh sincerity, marking how things seemed to him then, or what they suggested to his spontaneous fancy.”[4] Emerson did not seek to group all senses and impressions into a synthetic whole. Nor did he suggest a politics toward which senses and impressions ought to lead. Santayana stops short of accusing Emerson of advancing an “anything-goes” metaphysics. But Santayana does suggest that Emerson failed to advance a set of principles; instead, Emerson gave us a technique for arriving at a set of principles. Emerson provided transportation, but gave no direction. This shortcoming — if it is a shortcoming — might explain why Bloom speaks of the “paradox of Emerson’s influence,” namely, that “Peace Marchers and Bushians alike are Emerson’s heirs in his dialectics of power.”[5]

For Emerson, human will is paramount. It moves the intellect to create. It is immediate, not mediate. In other words, it is the sense or subjectivity that is not yet processed by the human mind. We ought to trust the integrity of will and intuition and avoid the dictates and decorum of society.

“Society,” Emerson says, “everywhere is in conspiracy against the manhood of every one of its members.” Society corrupts the purity of the will by forcing individuals to second-guess their impulses and to look to others for moral guidance. Against this socialization, Emerson declares, “Whoso would be a man, must be a nonconformist.”

Emerson’s nonconformist ethic opposed habits of thinking, which society influenced but did not determine. Emerson famously stated that a foolish consistency is the hobgoblin of little minds. What he meant, I think, is that humans ought to improve themselves by tapping into intuitive truths. Nature, with her figures, forms, and outlines, provides images that the individual can harness to create beauty and energize the self. Beauty therefore does not exist in the world; rather, the human mind makes beauty out of the externalities it has internalized. Beauty, accordingly, resides within us, but only after we create it.

Here we see something similar to Ayn Rand’s Objectivism stripped of its appeals to divinity. Rand believed that reality existed apart from the thinking subject, that the thinking subject employs reason and logic to make sense of experience and perception, and that the self or will is instrumental in generating meaning from the phenomenal world.

Book Review: Paul Cantor and Stephen Cox’s Literature and the Economics of Liberty

In Arts & Letters, Austrian Economics, Book Reviews, Communism, Conservatism, Economics, Essays, Fiction, History, Humane Economy, Humanities, Law-and-Literature, Liberalism, Libertarianism, Literary Theory & Criticism, Literature, Novels, Philosophy, Politics, Western Civilization, Western Philosophy on January 23, 2012 at 4:53 am

Allen Mendenhall

The following book review originally appeared here in the Fall 2010 issue of The Independent Review.

Humans are not automated and predictable, but beautifully complex and spontaneous. History is not linear. Progress is not inevitable. Our world is strangely intertextual and multivocal. It is irreducible to trite summaries and easy answers, despite what our semiliterate politicians would have us believe. Thinking in terms of free-market economics allows us to appreciate the complicated dynamics of human behavior while making sense of the ambiguities leading to and following from that behavior. With these realities in mind, I applaud Paul Cantor and Stephen Cox for compiling the timely collection Literature and the Economics of Liberty, which places imaginative literature in conversation with Austrian economic theory.

Cantor and Cox celebrate the manifold intricacies of the market, which, contrary to popular opinion, is neither perfect nor evil, but a proven catalyst for social happiness and well-being. They do not recycle tired attacks on Marxist approaches to literature: they reject the “return to aesthetics” slogans of critics such as Allan Bloom, Harold Bloom, and John M. Ellis, and they adopt the principles, insights, and paradigms of the Austrian school of economics. Nor do Cantor and Cox merely invert the privilege of the terms Marxist and capitalist (please excuse my resort to Derridean vocabulary), although they do suggest that one might easily turn “the tables on Marxism” by applying “its technique of ideology critique to socialist authors, questioning whether they have dubious motives for attacking capitalism.” Cantor and Cox are surprisingly the first critics to look to Austrian economics for literary purposes, and their groundbreaking efforts are sure to ruffle a few feathers—but also to reach audiences who otherwise might not have heard of Austrian economics.

Cantor and Cox submit that the Austrian school offers “the most humane form of economics we know, and the most philosophically informed.” They acknowledge that this school is heterodox and wide ranging, which, they say, are good things. By turning to economics in general, the various contributors to this book—five in all—suggest that literature is not created in a vacuum but rather informs and is informed by the so-called real world. By turning to Austrian economics in particular, the contributors seek to secure a place for freedom and liberty in the understanding of culture. The trouble with contemporary literary theory, for them, lies not with economic approaches, but with bad economic approaches. An economic methodology of literary theory is useful and incisive so long as it pivots on sound philosophies and not on obsolete or destructive ideologies. Austrian economics appreciates the complexity and nuance of human behavior. It avoids classifying individuals as cookie-cutter caricatures. It champions a humane economy counter to mechanistic mass production, central planning, and collectivism. Marxism, in contrast, is collectivist, predictable, monolithic, impersonal, linear, reductive; in short, wholly inadequate as an instrument for good in an age in which, quite frankly, we know better than to reduce the variety of human experience to simplistic formulae. A person’s creative and intellectual energies are never completely products of culture or otherwise culturally underwritten. People are rational agents who choose between different courses of action based on their reason, knowledge, and experience. A person’s choices, for better or worse, affect lives, circumstances, and communities. (“Ideas have consequences,” as Richard Weaver famously remarked.) And communities themselves consist of multiplicities that defy simple labels.

It is not insignificant, in light of these principles, that Michel Foucault late in his career instructed his students to read the collected works of Ludwig von Mises and F. A. Hayek.

How I Taught Sustainability

In Arts & Letters, Communication, Emerson, Fiction, Humanities, Literature, Nineteenth-Century America, Pedagogy, Rhetoric, Rhetoric & Communication, Teaching, Writing on January 9, 2012 at 1:12 am

Allen Mendenhall

Last spring I learned that I had been assigned to teach a freshman writing course on sustainability.  I don’t know much about sustainability, at least not in the currently popular sense of that term, and for many other reasons I was not thrilled about having to teach this course.  So I decided to put a spin on the subject.  What follows is an abridged version of my syllabus.  I owe more than a little gratitude to John Hasnas for the sections called “The Classroom Experience,” “Present and Prepared Policy,” and “Ground Rules for Discussion.”  He created these policies, and, with a few exceptions, the language from these policies is taken from a syllabus he provided during a workshop at a July 2011 Institute for Humane Studies conference on teaching and pedagogy.

Sustainability and American Communities

What is sustainability?  You have registered for this course about sustainability, so presumably you have some notion of what sustainability means.  The Oxford English Dictionary treats “sustainability” as a derivative of “sustainable,” which is defined as

  1. Capable of being borne or endured; supportable, bearable.
  2. Capable of being upheld or defended; maintainable.
  3. Capable of being maintained at a certain rate or level.

Recently, though, sustainability has become associated with ecology and the environment.  The OED dates this development as beginning in 1980 and trending during the 1990s.  The OED also defines “sustainability” in the ecological context as follows: “Of, relating to, or designating forms of human economic activity and culture that do not lead to environmental degradation, esp. avoiding the long-term depletion of natural resources.”  With this definition in mind, we will examine landmark American authors and texts and discuss their relationship to sustainability.  You will read William Bartram, Thomas Jefferson, Emerson, Thoreau, Hawthorne, Whitman, Mark Twain, and others.  Our readings will address nature, community, place, stewardship, husbandry, and other concepts related to sustainability.  By the end of the course, you will have refined your understanding of sustainability through the study of literary texts. 

Course Objectives

I have designed this course to help you improve your reading, writing, and thinking skills.  In this course, you will learn to write prose for general, academic, and professional audiences.  ENGL 1120 is a writing course, not a lecture course.  Plan to work on your writing every night.  You will have writing assignments every week.
