
Archive for the ‘Writing’ Category

James Elkins and the Lawyer Poets

In Arts & Letters, Creative Writing, Creativity, Humanities, Law, Legal Education & Pedagogy, Literary Theory & Criticism, News Release, Poetry, Writing on November 14, 2013 at 8:45 am

Lawyer Poets and That World We Call Law

James Elkins of West Virginia University College of Law has edited Lawyer Poets and That World We Call Law (Pleasure Boat Studio, 2013), an anthology of poems about the practice of law.  Professor Elkins has been the longtime editor of Legal Studies Forum.  Contributors to the anthology include Lee Wm. Atkinson, Richard Bank, Michael Blumenthal, Ace Boggess, David Bristol, Lee Warner Brooks, MC Bruce, Laura Chalar, James Clarke, Martin Espada, Rachel Contreni Flynn, Katya Giritsky, Howard Gofreed, Nancy A. Henry, Susan Holahan, Paul Homer, Lawrence Joseph, Kenneth King, John Charles Kleefeld, Richard Krech, Bruce Laxalt, David Leightty, John Levy, Greg McBride, James McKenna, Betsy McKenzie, Joyce Meyers, Jesse Mountjoy, Tim Nolan, Simon Perchik, Carl Reisman, Charles Reynard, Steven M. Richman, Lee Robinson, Kristen Roedell, Barbara B. Rollins, Lawrence Russ, Michael Sowder, Ann Tweedy, Charles Williams, Kathleen Winter, and Warren Wolfson.

James Elkins
Professor of Law and Benedum Distinguished Scholar, West Virginia University College of Law

Service in St. Paul’s

In Arts & Letters, Creative Writing, Creativity, Humanities, Literature, Poetry, Writing on November 6, 2013 at 8:45 am


 

This poem originally appeared in The Echo.

Service in St. Paul’s

 

            —London, 2003

 

Acrophobia turned

upside down:

fear floating away,

gravity deciding

to suddenly

give up.

 

There’s a dome

overhead, a glowing

Jesus over the altar,

and too much space

to pray

comfortably.

 

Imagination

among the scaffolding,

eye to eye with Joseph,

now falling facing up:

heaven does

not seem so high.

John William Corrington, A Literary Conservative

In American History, Arts & Letters, Conservatism, Creative Writing, Essays, Fiction, History, Humanities, John William Corrington, Joyce Corrington, Law, Literary Theory & Criticism, Literature, Modernism, Southern History, Southern Literature, Television, Television Writing, The Novel, The South, Western Philosophy, Writing on October 23, 2013 at 8:45 am

 


 

An earlier version of this essay appeared here at Front Porch Republic.

Remember the printed prose is always

half a lie: that fleas plagued patriots,

that greatness is an afterthought

affixed by gracious victors to their kin.

 

—John William Corrington

 

It was the spring of 2009.  I was in a class called Lawyers & Literature.  My professor, Jim Elkins, a short, thin man with long white hair, gained the podium.  Wearing what might be called a suit—with Elkins one never could tell—he recited lines from a novella, Decoration Day.  I had heard of the author, John William Corrington, but only in passing.

“Paneled walnut and thick carpets,” Elkins beamed, gesturing toward the blank white wall behind him, “row after row of uniform tan volumes containing between their buckram covers a serial dumb show of human folly and greed and cruelty.”  The students, uncomfortable, began to look at each other, registering doubt.  In law school, professors didn’t wax poetic.  But this Elkins—he was different.  With swelling confidence, he pressed on: “The Federal Reporter, Federal Supplement, Supreme Court Reports.  Two hundred years of our collective disagreements and wranglings from Jay and Marshall through Taney and Holmes and Black and Frankfurter—the pathetic often ill-conceived attempts to resolve what we have done to one another.”

Elkins paused.  The room went still.  Awkwardly profound, or else profoundly awkward, the silence was like an uninvited guest at a dinner party—intrusive, unexpected, and there, all too there.  No one knew how to respond.  Law students, most of them, can rattle off fact patterns or black-letter law whenever they’re called on.  But this?  What were we to do with this?

What I did was find out more about John William Corrington.  Having studied literature for two years in graduate school, I was surprised to hear this name—Corrington—in law school.  I booted up my laptop, right where I was sitting, and, thanks to Google, found a few biographical sketches of this man, who, it turned out, was perplexing, riddled with contradictions: a Southerner from the North, a philosopher in cowboy boots, a conservative literature professor, a lawyer poet.  This introduction to Corrington led to more books, more articles, more research.  Before long, I’d spent over $300 on Amazon.com.  And I’m not done yet.

***

Born in Cleveland, Ohio, on October 28, 1932, Corrington—or Bill, as his friends and family called him—passed as a born-and-bred Southerner all of his life.  As well he might, for he lived most of his life below the Mason-Dixon line, and his parents were from Memphis and had moved north for work during the Depression.  He moved to the South (to Shreveport, Louisiana) at the age of 10, although his academic CV claimed that he was, like his parents, born in Memphis, Tennessee.  Raised Catholic, he attended a Jesuit high school in Louisiana but was expelled for “having the wrong attitude.”  The Jesuit influence, however, would remain with him always.  At the beginning of his books, he wrote, “AMDG,” which stands for Ad Majorem Dei Gloriam—“for the greater glory of God.”  “It’s just something that I was taught when I was just learning to write,” he explained in an interview in 1985, “taught by the Jesuits to put at the head of all my papers.”

Bill was, like the late Mark Royden Winchell, a Copperhead at heart, and during his career he authored or edited, or in some cases co-edited, twenty books of varying genres.  He earned a B.A. from Centenary College and an M.A. in Renaissance literature from Rice University, where he met his wife, Joyce, whom he married on February 6, 1960.  In September of that year, he and Joyce moved to Baton Rouge, where Bill became an instructor in the Department of English at Louisiana State University (LSU).  At that time, LSU’s English department was known above all for The Southern Review (TSR), the brainchild of Cleanth Brooks and Robert Penn Warren, but also for such literary luminaries as Robert Heilman, who would become Bill’s friend.

In the early 1960s, Bill pushed for TSR to feature fiction and poetry and not just literary criticism.  He butted heads with then-editors Donald E. Stanford and Lewis P. Simpson, who thought of the journal as scholarly, not creative, as if journals couldn’t be both scholarly and creative.  A year after joining the LSU faculty, Bill published his first book of poetry, Where We Are.  With only 18 poems and a first-edition run of 225 copies, the book hardly established Bill’s reputation as a Southern man of letters.  But it invested his name with recognition and gave him confidence to complete his first novel, And Wait for the Night (1964).

Bill and Joyce spent the 1963-64 academic year in Sussex, England, where Bill took the D.Phil. from the University of Sussex in 1965.  In the summer of 1966, at a conference at Northwestern State College, Mel Bradford, that Dean of Southern Letters, pulled Bill aside and told him, enthusiastically, that And Wait for the Night (1964) shared some of the themes and approaches of William Faulkner’s The Unvanquished.  Bill agreed.  And happily.

***

Of Bill and Miller Williams, Bill’s colleague at LSU, Jo LeCoeur, poet and literature professor, once submitted, “Both men had run into a Northern bias against what was perceived as the culturally backward South.  While at LSU they fought back against this snub, editing two anthologies of Southern writing and lecturing on ‘The Dominance of Southern Writers.’  Controversial as a refutation of the anti-intellectual Southern stereotype, their joint lecture was so popular [that] the two took it on the road to area colleges.”

In this respect, Bill was something of a latter-day Southern Fugitive—a thinker in the tradition of Donald Davidson, Allen Tate, Andrew Nelson Lytle, and John Crowe Ransom.  Bill, too, took his stand.  And his feelings about the South were strong and passionate, as evidenced by his essay in The Southern Partisan, “Are Southerners Different?” (1984).  Bill’s feelings about the South, however, often seemed mixed.  “[T]he South was an enigma,” Bill wrote to poet Charles Bukowski, “a race of giants, individualists, deists, brainy and gutsy:  Washington, Jefferson, Madison, Jackson (Andy), Davis, Calhoun, Lee, and on and on.  And yet the stain of human slavery on them.”  As the epigraph (above) suggests, Bill was not interested in hagiographic renderings of Southern figures.  He was interested in the complexities of Southern people and experience.  In the end, though, there was no doubt where his allegiances lay.  “You strike me as the most unreconstructed of all the Southern novelists I know anything about,” said one interviewer to Bill.  “I consider that just about the greatest compliment anyone could give,” Bill responded.

While on tour with Williams, Bill declared, “We are told that the Southerner lives in the past.  He does not.  The past lives in him, and there is a difference.”  The Southerner, for Bill, “knows where he came from, and who his fathers were.”  The Southerner “knows still that he came from the soil, and that the soil and its people once had a name.”  The Southerner “knows that is true, and he knows it is a myth.”  And the Southerner “knows the soil belonged to the black hands that turned it as well as it ever could belong to any hand.”  In short, the Southerner knows that his history is tainted but that it retains virtues worth sustaining—that a fraught past is not reducible to sound bites or political abstractions but is vast and contains multitudes.

***

In 1966, Bill and Joyce moved to New Orleans, where the English Department at Loyola University, housed in a grand Victorian mansion on St. Charles Avenue, offered him a chairmanship.  Joyce earned the M.S. in chemistry from LSU that same year.  By this time, Bill had written four additional books of poetry, the last of which, Lines to the South and Other Poems (1965), benefited from Bukowski’s influence.  Bill’s poetry earned a few favorable reviews but not as much attention as his novels—And Wait for the Night (1964), The Upper Hand (1967), and The Bombardier (1970).  Writing in The Massachusetts Review, poet and critic Josephine Miles approvingly noted two of Bill’s poems from Lines, “Lucifer Means Light” and “Algerien Reveur,” alongside poetry by James Dickey, but her comments were more in passing than in depth.  Dickey himself, it should be noted, admired Bill’s writing, saying, “A more forthright, bold, adventurous writer than John William Corrington would be very hard to find.”

Joyce earned her PhD in chemistry from Tulane in 1968.  Her thesis, which she wrote under the direction of L. C. Cusachs, was titled, “Effects of Neighboring Atoms in Molecular Orbital Theory.”  She began teaching chemistry at Xavier University, and her knowledge of the hard sciences brought about engaging conversations, between her and Bill, about the New Physics.  “Even though Bill only passed high school algebra,” Joyce would later say, “his grounding in Platonic idealism made him more capable of understanding the implications of quantum theory than many with more adequate educations.”

By the mid-70s, Bill had become fascinated by Eric Voegelin.  A German historian, philosopher, and émigré who had fled the Third Reich, Voegelin taught in LSU’s history department and lectured for the Hoover Institution at Stanford University, where he was a Salvatori Fellow.  Voegelin’s philosophy, which drew from Friedrich von Hayek and other conservative thinkers, inspired Bill.  In fact, Voegelin made such a lasting impression that, at the time of Bill’s death, Bill was working on an edition of Voegelin’s The Nature of the Law and Related Legal Writings.  (After Bill’s death, two men—Robert Anthony Pascal and James Lee Babin—finished what Bill had begun.  The completed edition appeared in 1991.)

By 1975, the year he earned his law degree from Tulane, Bill had penned three novels, a short story collection, two editions (anthologies), and four books of poetry.  But his writings earned little money.  He also had become increasingly disenchanted with the political correctness on campus:

By 1972, though I’d become chair of an English department and offered a full professorship, I’d had enough of academia. You may remember that in the late sixties and early seventies, the academic world was hysterically attempting to respond to student thugs who, in their wisdom, claimed that serious subjects seriously taught were “irrelevant.” The Ivy League gutted its curriculum, deans and faculty engaged in “teach-ins,” spouting Marxist-Leninist slogans, and sat quietly watching while half-witted draft-dodgers and degenerates of various sorts held them captive in their offices. Oddly enough, even as this was going on, there was a concerted effort to crush the academic freedom of almost anyone whose opinions differed from that of the mob or their college-administrator accessories. It seemed a good time to get out and leave the classroom to idiots who couldn’t learn and didn’t know better, and imbeciles who couldn’t teach and should have known better.

Bill joined the law firm of Plotkin & Bradley, a small personal injury practice in New Orleans, and continued to publish in such journals as The Sewanee Review and The Southern Review, and in such conservative periodicals as The Intercollegiate Review and Modern Age.  His stories took on a legal bent, peopled as they were with judges and attorneys.  But neither law nor legal fiction brought him fame or fortune.

So he turned to screenplays—and, at last, earned the profits he desired.  Viewers of the recent film I Am Legend (2007), starring Will Smith, might be surprised to learn that Bill and Joyce wrote the screenplay for the earlier version, The Omega Man (1971), starring Charlton Heston.  And viewers of Battle for the Planet of the Apes (1973) might be surprised to learn that Bill wrote the film’s screenplay while still a law student.  All told, Bill and Joyce wrote five screenplays and one television movie.  Free from the constraints of university bureaucracy, Bill collaborated with Joyce on various television daytime dramas, including Search for Tomorrow, Another World, Texas, Capitol, One Life to Live, Superior Court, and, most notably, General Hospital.  These ventures gained the favor of Hollywood stars, and Bill and Joyce eventually moved to Malibu.

Bill constantly molded and remolded his image, embracing Southern signifiers while altering their various expressions.  His early photos suggest a pensive, put-together gentleman wearing ties and sport coats and smoking pipes.  Later photos depict a rugged man clad in western wear.  Still later photos conjure up the likes of Roy Orbison, what with Bill’s greased hair, cigarettes, and dark sunglasses.

Whatever his looks, Bill was a stark, provocative, and profoundly sensitive writer.  His impressive oeuvre has yet to receive the critical attention it deserves.  That scholars of conservatism, to say nothing of scholars of Southern literature, have ignored this man is almost inconceivable.  There are no doubt many aspects of Bill’s life and literature left to be discovered.  As Bill’s friend William Mills put it, “I believe there is a critique of modernity throughout [Bill’s] writing that will continue to deserve serious attentiveness and response.”

On Thanksgiving Day, November 24, 1988, Bill suffered a heart attack and died.  He was 56.  His last words, echoing Stonewall Jackson, were, “it’s all right.”

 

Is Hacking the Future of Scholarship?

In Arts & Letters, Communication, Humanities, Information Design, Law, Legal Research & Writing, Scholarship, Writing on October 16, 2013 at 7:45 am


This article appeared here in Pacific Standard.

Most attorneys are familiar with e-discovery, a method for obtaining computer and electronic information during litigation. E-discovery has been around a long time. It has grown more complex and controversial, however, with the rise of new technologies and the growing awareness that just about anything you do online or with your devices can be made available to the public. Emails, search histories, voicemails, instant messages, text messages, call history, music playlists, private Facebook conversations (not just wall posts)—if relevant to a lawsuit, these and other forms of latent evidence, for better or worse, can be exposed, even if you think they’ve been hidden or discarded.

Anyone who has conducted or been involved with e-discovery realizes how much personal, privileged, and confidential information is stored on our devices. When you “delete” files and documents from your computer, they do not go away. They remain embedded in the hard drive; they may become difficult to find, but they’re there. Odds are, someone can access them. Even encrypted files can be traced back to the very encryption keys that created them.

E-discovery has been used to uncover registries and cache data showing that murderers had been planning their crimes, spouses had been cheating, perverts had been downloading illegal images, and employees had been stealing or compromising sensitive company data or destroying intellectual property. Computer forensics was even used to reveal medical documents from Dr. Conrad Murray’s computer during the so-called “Michael Jackson death trial.”

Computer forensics can teach you a lot about a person: the websites he visits, the people he chats with, the rough drafts he abandons, the videos he watches, the advertisements he clicks, the magazines he reads, the news networks he prefers, the places he shops, the profiles he views, the songs he listens to, and so on. It is fair to say that given a laptop hard drive, a forensic expert could nearly piece together an individual’s personality and perhaps come to know more about that person—secret fetishes, guilty pleasures, and criminal activities—than his friends and family do.

In light of this potential access to people’s most private activities, one wonders how long it will be until academics turn to computer forensics for research purposes. This is already being done in scientific and technology fields, which is not surprising because the subject matter is the machine and not the human, but imagine what it would mean for the humanities. If Jefferson had used a computer, perhaps we would know the details of his relationship with Sally Hemings. If we could get ahold of Shakespeare’s iPad, we could learn whether he wrote all those plays by himself. By analyzing da Vinci’s browsing history, we might know which images he studied and which people he interacted with before and during his work on the Mona Lisa—and thus might discover her identity.

There are, of course, government safeguards in place to prevent the abuse of, and unauthorized access to, computer and electronic data: the Wiretap Act, the Pen Registers and Trap and Trace Devices Statute, and the Stored Wire and Electronic Communications Act come to mind. Not just anyone can access everything on another person’s computer, at least not without some form of authorization. But what if researchers could obtain authorization to mine computer and electronic data for the personal and sensitive information of historical figures? What if computer forensics could be used in controlled settings and with the consent of the individual whose electronic data are being analyzed?

Consent, to me, is crucial: It is not controversial to turn up information on a person if he voluntarily authorized you to go snooping, never mind that you might turn up something he did not expect you to find. But under what circumstances could computer forensics be employed on a non-consensual basis? And what sort of integrity does computer or electronic information require and deserve? Is extracting data from a person’s laptop akin to drilling through a precious fresco to search for lost paintings, to excavating tombs for evidence that might challenge the foundations of organized religion and modern civilization, or to exhuming the bodies of dead presidents? Surely not. But why not?

We have been combing through letters by our dead predecessors for some time. Even these, however, were meant for transmission and had, to that end, intended audiences. E-discovery, by contrast, provides access to things never meant to be received, let alone preserved or recorded. It is the tool that comes closest to revealing what an individual actually thinks, not just what he says he thinks, or for that matter, how and why he says he thinks it. Imagine retracing the Internet browsing history of President Obama, Billy Graham, Kenneth Branagh, Martha Nussbaum, Salman Rushdie, Nancy Pelosi, Richard Dawkins, Toni Morrison, Ai Weiwei, or Harold Bloom. Imagine reading the private emails of Bruno Latour, Ron Paul, Pope Francis, Noam Chomsky, Lady Gaga, Roger Scruton, Paul Krugman, Justice Scalia, or Queen Elizabeth II. What would you find out about your favorite novelists, poets, musicians, politicians, theologians, academics, actors, pastors, judges, and playwrights if you could expose what they did when no one else was around, when no audience was anticipated, or when they believed that the details of their activity were limited to their person?

This is another reason why computer and electronic data mining is not like sifting through the notes and letters of a deceased person: having written the notes and letters, a person is aware of their content and can, before death, destroy or revise what might appear unseemly or counter to the legacy he wants to promote. Computer and electronic data, however, contain information that the person probably doesn’t know exists.

More information is good; it helps us to understand our universe and the people in it. The tracking and amassing of computer and electronic data are inevitable; the extent and details of their operation, however, cannot yet be known. We should embrace—although we don’t have to celebrate—the technologies that enable us to produce this wealth of knowledge previously unattainable to scholars, even if they mean, in the end, that our heroes, idols, and mentors are demystified, their flaws and prejudices and conceits brought to light.

The question is, when will we have crossed the line? How much snooping goes too far and breaches standards of decency and respect? It is one thing for a person to leave behind a will that says, in essence, “Here’s my computer. Do what you want with it. Find anything you can and tell your narrative however you wish.” It is quite another thing for a person never to consent to such a search and then to pass away and have his computer scanned for revealing or incriminating data.

It’s hard to say what crosses the line because it’s hard to know where the line should be drawn. As Justice Potter Stewart said of hard-core pornography, “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.” Once scholars begin—and the day is coming—hacking devices to find out more about influential people, the courts and the academic community will be faced with privacy decisions to make. We will have to ask if computer and electronic data are substantially similar to private correspondence such as letters, to balance the need for information with the desire for privacy, to define what information is “private” or “public,” and to honor the requests of those savvy enough to anticipate the consequences of this coming age of research.

Amid this ambiguity, one thing will be certain: Soon we can all join with Princess Margaret in proclaiming, “I have as much privacy as a goldfish in a bowl.” That is good and bad news.

The Law Review Model as a Check against Bias?

In Academia, Arts & Letters, Essays, Humanities, Law, Scholarship, Writing on October 9, 2013 at 7:45 am


A version of this essay appeared in Academic Questions.

Could peer-reviewed humanities journals benefit by having student editors, as is the practice for law reviews? Are student editors valuable because they are less likely than peer reviewers to be biased against certain contributors and viewpoints?  I begin with a qualifier: What I am about to say is based on research, anecdotes, and experience rather than empirical data that I have compiled on my own. I do not know for sure whether student editors are more or less biased than professional academics, and I hesitate to displace concerns for expertise and experience with anxiety about editorial bias. There may be situations in which students can make meaningful contributions to reviewing and editing scholarship—and to scholarship itself—but to establish them as scholarly peers is, I think, a distortion and probably a disservice to them and their fields.

Student editors of and contributors to law reviews may seem to be the notable exception, but legal scholarship is different from humanities scholarship in ways I address below, and law reviews suffer from biases similar to those endemic to peer-reviewed journals. Nevertheless, law review submission and editing probably have less systemic bias than peer-reviewed journals, but not because students edit them. Rather, law review submission and editing make it more difficult for bias to occur. The system, not the students, facilitates editorial neutrality.

There are several features of this system that discourage bias. Because editors are students in their second and third year of law school, editorial turnover is rapid. Every year a law review has a new editorial team composed of students with varied interests and priorities. What interested a journal last year will be different this year. Therefore, law reviews are not likely to have uniform, long-lasting standards for what and whom to publish—at least not with regard to ideology, political persuasion, or worldview.

Law review editors are chosen based on grades and a write-on competition, not because they are likeminded or pursuing similar interests. Therefore, law reviews are bound to have more ideological and topical diversity than peer-reviewed journals, which are premised upon mutual interest, and many of which betray the academic side of cronyism: friends and friends of friends become editors of peer-reviewed journals irrespective of their records of scholarship. The composition of law review editorial boards is, by contrast, based upon merit determined through heated competition.

Once on board, law review student editors continue to compete with one another, seeking higher ranks within editorial hierarchies.[1] Being the editor-in-chief or senior articles editor improves one’s résumé and looks better to potential employers than being, say, the notes editor. Voting or evaluations of academic performance establish the hierarchies. Moreover, each year only a few student articles are published, so editors are competing with one another to secure that special place for their writing.[2] Finally, student editors usually receive grades for their performance on law review. The result of all of this competition is that law review editors are less able than peer reviewers to facilitate ideological uniformity or to become complacent in their duties—and law reviews will exhibit greater ideological diversity and publish more quickly and efficiently than peer-reviewed journals.

Because of the ample funding available to law schools, scores of specialized journals have proliferated to rival the more traditional law reviews. Many specialized law reviews were designed to compensate for alleged bias. There are journals devoted to women’s issues, racial issues, law and literature, law and society, critical legal studies, and so on. There are also journals aimed principally at conservatives: Harvard Journal of Law and Public Policy, Texas Review of Law & Politics, and Georgetown Journal of Law & Public Policy, to name three. Specialized journals give students and scholars a forum for the likeminded. On the other hand, such journals call for specialization, which students are unlikely to possess.[3]

For these reasons, I believe that bias is less prevalent among law reviews than among peer-reviewed journals. Part of the difficulty in determining bias, however, is that data collection depends upon the compliance of law review editors, who receive and weed through thousands of submissions per submission period and have neither the time nor the energy to compile and report data about each submission. Moreover, these editors, perhaps in preparation for likely careers as attorneys, are often required to maintain strict confidentiality regarding authors and submissions, thereby making “outside” studies of law reviews extremely difficult to conduct.

And then there is the problem of writing about bias at all: everyone can find bias in the system. I suspect that institutionalized bias against conservative legal scholars exists, but nonconservatives also complain about bias. Minna J. Kotkin has suggested that law reviews are biased against female submitters.[4] Rachel J. Anderson has suggested that law reviews are biased against “dissent scholarship,” which, she says, includes “civil rights scholarship, critical legal studies, critical race theory, feminist theory, public choice theory, queer theory, various ‘law ands’ scholarship that employs quantitative or humanistic methodologies, and other scholarship that, at one point in time or another, is not aligned with ideologies or methodologies that the reader values or considers legitimate.”[5] Finally, Jordan Leibman and James White discovered bias favoring authors with credentials, publication records, or experience.[6]

Law student bias seems, from my perspective, weighted more toward credentials and reputation than toward political persuasion.[7] An established professor with an endowed chair is therefore more likely to receive a publication offer from a law review than an unknown, young, or adjunct professor; and the name recognition of an author—regardless of personal politics—is more likely to guarantee that author a publication slot in a law review. One downside to this is that student editors will accept half-written or ill-formed articles simply because the author is, for want of a better word, renowned. It is common in these situations for students to then ghostwrite vast portions of the article for the author. Another more obvious downside is that professors from select institutions and with certain reputations will be published over authors who have submitted better scholarship. This is the primary reason why I advocate for a hybrid law review/peer review approach to editing.[8]

I’ve mentioned that legal scholarship differs from humanities scholarship. What makes it different is its attention to doctrinal matters, i.e., to the application of law to facts or the clarifying of legal principles and canons. After their first year of law school, students are equipped to study these sorts of matters. They are not unlike lawyers who approach a legal issue for the first time and must learn to analyze the applicable law in light of the given facts. Although the breadth and scope of legal scholarship have changed to reduce the amount of doctrinal scholarship produced and to incorporate interdisciplinary studies, doctrinal scholarship remains the traditional standard and the conventional norm.

Law students have the facility to edit doctrinal scholarship, but not to edit interdisciplinary articles.[9] This point is not necessarily to advance my argument about bias being less inherent in law review editing; rather, it is to circle back to my initial position that inexperienced and inexpert students should not be empowered to make major editorial decisions or to control the editing. As I have suggested, student editors are biased, just as professional peer reviewers are biased—the problem is that students are less prepared and qualified to make sound editorial judgments. If what is needed is an editorial system that diminishes bias, then student editors are not the solution. Law review editing, however, provides a clarifying model for offsetting widespread bias.

It would be difficult if not impossible to implement law review editing among humanities peer-reviewed journals for the disappointing reason that law reviews enjoy ample funding from institutions, alumni, and the legal profession whereas humanities journals struggle to budget and fight for funding. Therefore, I will not venture to say that peer-reviewed journals ought to do something about their bias problems by mimicking law review editing. Such a solution would not be practical. But by pointing out the benefits of law review editing—i.e., the result of less bias due to such factors as competition and turnover in editorial positions—I hope that more creative minds than mine will discover ways to reform peer-reviewed journals to minimize bias.

[1]I consider editor selection flawed for some of the reasons Christian C. Day describes in “The Case for Professionally-Edited Law Reviews,” Ohio Northern University Law Review 33 (2007): 570–74.

[2]How this competition works differs from journal to journal. In some cases, the students select which student articles to publish based on an elaborate voting process supposedly tied to blind review and authorial anonymity.  In other cases, faculty decide.

[3]“Many scholars feel that student editors of law review articles, while they were perhaps once competent to evaluate the merit of scholarly articles owing to the much narrower range of topics, have for the last few decades had great difficulty grappling with nondoctrinal scholarship (that is, scholarship dealing with the intersection of law and other disciplines). The authors of law journal articles now increasingly draw from areas such as economics, gender studies, literary theory, sociology, mathematics, philosophy, political theory, and so on, making the enterprise much too difficult for a group of generally young people, who are not only not specialists, but have barely entered the field of law.” Nancy McCormack, “Peer Review and Legal Publishing: What Law Librarians Need to Know about Open, Single-Blind, and Double-Blind Reviewing,” Law Library Journal 101, no. 1 (Winter 2009): 61–62.

[4]Minna J. Kotkin, “Of Authorship and Audacity: An Empirical Study of Gender Disparity and Privilege in the ‘Top Ten’ Law Reviews,” Women’s Rights Law Reporter 35 (Spring 2009).

[5]Rachel J. Anderson, “From Imperial Scholar to Imperial Student: Minimizing Bias in Article Evaluation by Law Reviews,” Hastings Women’s Law Journal 20, no. 2 (2009): 206.

[6]Jordan H. Leibman and James P. White, “How the Student-Edited Law Journals Make Their Publication Decisions,” Journal of Legal Education 39, no. 3 (September 1989): 396, 404.

[7]Many others share this view: “It appears to be generally assumed that, to a significant degree, Articles Editors use an author’s credentials as a proxy for the quality of her scholarship.” Jason P. Nance and Dylan J. Steinberg, “The Law Review Article Selection Process: Results from a National Study,” Albany Law Review 71, no. 2 (2008): 571.

[8]See my Spring 2013 Academic Questions article, “The Law Review Approach: What the Humanities Can Learn.” I am not alone on this score. Day suggests that “this bias can be defeated by blind submissions or having faculty members read the abstracts and articles of blind-submitted articles where the quality is unknown. The names and other identifying information should be obscured, which is common in other disciplines. This is easy to do with electronic submissions. It should be the rule in law reviews, at least at the initial stage of article selection.” “Case for Law Reviews,” 577.

[9]Hence Richard Posner’s suggestion that law reviews “should give serious consideration to having every plausible submission of a nondoctrinal piece refereed anonymously by one or preferably two scholars who specialize in the field to which the submission purports to contribute.” “The Future of the Student-Edited Law Review,” Stanford Law Review 47 (Summer 1995): 1136.

Unmasking

In Arts & Letters, Creative Writing, Essays, Humanities, The South, Writing on August 14, 2013 at 8:45 am

This essay first appeared here in Kestrel: A Journal of Literature and Art.

Allen Mendenhall

There is no remembrance of former things; neither shall there be any remembrance of things that are to come with those that shall come after.

                                                                                           Qoheleth 1:11

Southerners are particular about the way they preserve their loved ones; they encourage embalming, for instance, although at one time they shunned it as unconsented-to tampering with the body.  Eventually someone decided, rather wisely, that the deceased, had they a choice, would like a genteel display of their “shell.”  This meant more than sanitization: it meant dressing the dead like ladies or gentlemen on their way to church.  Which is precisely where they were going—just before they were buried in the ground.  For the most part, Southerners don’t cremate.  (A preacher once told me that the Bible discourages cremation.)

In the South—more than in other regions—funerals are hierarchical affairs: one’s nearness to the deceased signifies one’s importance to the family.  This holds for the church and burial service and is especially true if the departed was popular in life.  Being closest to the deceased, pallbearers shoulder the weightiest burden.

Nowhere is decorum more important than at a funeral procession.  It’s unseemly for one who’s not party to the procession to fail to bow his head and arrange a grave face as the procession passes.  If you’re in a vehicle, you pull over to the curb and, so long as it isn’t dangerous to do so, take up the sidewalk as if on foot.  Quitting the vehicle is, in general, inadvisable if by the time you encounter the procession the hearse is no longer in sight.  Or if, alternatively, the weather doesn’t permit.  If you’re in line, the modus operandi is ecclesiastic—ordered from clergy, to immediate kin, to next-of-kin, to distant family, to friends, to the rest.  Losing your place in line is, accordingly, like losing your intimacy with the family, for whom these rituals are carried out.

I was eight when Great-Granddaddy died.  Mom piloted me before his open-casket and whispered, “That’s not Great-Granddaddy.  That’s just a shell.  Great-Granddaddy’s gone to heaven.”

I looked down at the thing, the shell, the facsimile that seemed uncannily human, and said to myself—perhaps out loud—“That’s not Great-Granddaddy.  That’s something else.”  But the thing appeared real, strange, so nearly alive that it repulsed me.  Its eyes, thank God, were closed, but its mannequin face, vacant and plastic, nauseated me.

Mom prodded me away, hollering at my cousin to take me outside.  My first brush with death, while necessary, had not imparted a healthy understanding of mortality.

My grandmother, Nina, tried to familiarize me with the inescapable while I was still a boy.  Instead of taking me to playgrounds, she took me to cemeteries for what she called “Southern preparations.”  These outings usually occurred on warm spring afternoons, when azaleas bloomed bright white and pink, when yellow Jessamine vines crawled up walls and fences, when dogwoods yawned inflorescent, and when tulips, still un-beheaded, stretched with impeccable posture.  When, in short, nature was doing anything but dying.

Nina shared facts about various grave plots, giving the lowdown on so-and-so’s passing—“he died in Korea,” “he of AIDS,” “she during pregnancy,” and so forth.  When she finished, we fed the swans.

Which attacked me once.  I was standing on the riverbank, feeding the once-ugly ducklings by hand just as Nina had taught me, when, like Leda, I was enveloped by a feathered glory of beating white wings.  Traumatized, I no longer stood on shore but sat on the roof of the car.  To make me feel less sissy, Nina sat on the hood and pretended that she, too, was afraid.  It wasn’t their size exactly.  Nor the way they tussled with graceful wrath.  Maybe it was the mask about their swan eyes.  I’m sure it was that: the concealment, secret identity, veiled feelings.

Just before I got married, my fiancée, Giuliana, flew in from São Paulo to meet my family.  After supper, Nina insisted that I drive her through the cemetery.  I hadn’t been in years but instantly recognized the wrought-iron gates that once seemed so colossal.  There was the river.  The ducks.  The swans.  In the distance, a family, their heads bowed, stood under a high green tent.

Giuliana was not disturbed by this detour.  Quite the contrary:  she felt in some way moved.  It was as if Nina had invited her into a private, intimate space: one that contradicted this modern world of medical science in which everyone tries to postpone or avert death.  In a cemetery one couldn’t help but think of decomposition, permanence, the soul.  One couldn’t help but track the beat of one’s heart, measure the inhales and exhales of one’s breathing.  One couldn’t help, that is, but cherish the fact that one’s alive.

My cell phone buzzed.  An unknown number flashed across the screen.  I answered, “Hello?”

“Mr. Mendenhall?”

“Yes.”

“Are you in the car?”

“No.”

“This is the cancer center at St. Joseph’s Hospital.  We need you to come in.”

I was twenty-four, and about to hear, “You have cancer.”

Nothing—not even a Southern upbringing—can prepare you for those three words.

The odd thing about preachers is that, depending on time and place, their company is either most welcome or most unwelcome.  When I got the call, the cancer call, my uncle, a preacher, was beside me, and I was, to that end, glad.  He made me feel the power of presence, to say nothing of companionship:  I was not alone.

My uncle—Uncle Steve—preaches in the only Southern Baptist church in Chicago.  Unlike most Southern Baptist preachers down South, he eschews the noisy and spectacular, preferring, instead, politesse and restraint.  Bookish and professorial, his voice nasal, his nose suitably sloped to hold up his saucer-sized spectacles, he loves theology and will tell you as much at the drop of a hat.  What with his general softness, he might, with a bit more age, have been mistaken for Truman Capote, with whom, incidentally, his father—my grandfather—had grown up in Monroeville, Alabama.

A man of custom, a student of Latin and Greek, fluent in Russian and French, a former lawyer and journalist, Uncle Steve is uncommonly qualified to carry on the sanctifying traditions of Western Civilization.  He is, in short, a gentleman and a scholar.  And he was in Atlanta that day, standing in the Varsity parking lot, his belly stuffed full of chili dogs, his ketchup-smudged face like an advertisement for this, the world’s largest drive-in restaurant.

I could feel his gaze moving over me and spared him the discomfort of asking what was the matter.

“I have cancer,” I said.

As the words issued from my mouth, my chest felt as though someone were driving a stake into it.  Cancer.  That thing other people got.  Old people.  Not young and healthy people.  Not me.

I tried to act normal, but in doing so betrayed what I really felt—terror.

Uncle Steve put his arm around me.  “Come on.  Let’s get to the hospital.”

Every hour on the hour, the employees of St. Joseph’s Hospital pray together.  These moments, though heavily orchestrated, bring peace to the ill and dying, the sick and suffering.  The nurses and doctors who wander the hallways pause while a disembodied, female voice recites the Lord’s Prayer, first in English, then in Spanish.  “Our Father, who art in heaven…”—the words echo off the cold, linoleum tiles—“hallowed be thy name.”

This was happening when I walked into the waiting room.  A nurse, a heavyset black woman with the softest eyes I’d ever seen, was behind the counter, her necklace, weighed down by a tiny crucified Jesus, dangling at her pillow-like breasts.  She whispered, again and again, amen, amen, and then, looking up, took me in with those deep knowing eyes, spoke without speaking.  Sunlight streamed through the cool, trapezoid panes of glass in the ceiling, falling across her face and hair at a low angle.

At last the prayer ended.  She unfolded her hands and smiled formally.  “Good afternoon, how may I help you?”

Responding with “I have cancer” didn’t feel right, so I said, “I’m here to see Dr. Danaker.”

That was all she needed to know.

“Bless your heart, child,” she said.  And, for the first time, I got emotional.  She hugged me, calling me child again; then, right then, I wanted to be a child, wanted her to scoop me into her arms and cradle me, wanted her thick, strong body wrapped around me; but there, too, was Uncle Steve, dignified and collected.  I couldn’t break down in front of him.

The nurse ushered me into a white, windowless room with expansive tile walls and sat me on a tissue-papered chair, which swished and crackled whenever I readjusted my derrière.

There I was.  Conscious.  Being, yet trying to fathom not being.  I imagined myself in a coffin, like that horrid shell, Great-Granddaddy.  Which only made things worse, for I knew that, once in the coffin, I would have no notion of being there.  The problem was thinking itself.  I couldn’t imagine being dead because I couldn’t imagine not imagining.

On Sunday mornings, before church, dad had always made my siblings and me read from the obituaries.  This, he said, would acquaint us with the fragility of life.  He also thought the best way to learn was from experience.  But he’d known only one person who’d experienced death and, almost impossibly, lived to tell about it—Martin, a friend of the family, who’d apparently died three times and, on the operating table, been revived.  Martin loved cigarettes, which he called the backbone of Southern economy and which, he readily admitted, had brought about his three near-fatalities.

Except Martin didn’t put it in those terms.  To him, cigarettes had allowed him to float outside his body for a while, to see what death was like.  For better or worse, Martin didn’t tease a tunnel of light, greet a golden angel, or feel a fluffy cloud:  he simply “left” himself and, in a state of utter weightlessness, peered down on his body as would an outside observer.  Maybe that’s why dad didn’t like us talking to Martin about death: Dad wanted us to hear about St. Peter and heaven and departed relatives.

The trouble with Martin was that one never knew when to believe him.  Heck, we barely knew who he was.  Ephemerally at least, he’d been my aunt’s boyfriend; then, when she dumped him, he’d never gone away: he moved in with my other aunt, a single mother, and helped care for my young cousin.  Martin was present every Thanksgiving and Christmas, but neither got nor gave gifts.  A transplant from North Carolina, he had daughters somewhere—either the Carolinas or Virginia—and had graduated from the University of North Carolina at Chapel Hill, an achievement he was quite proud of.  He didn’t work.  Didn’t own a car.  And didn’t seem to have money.  His singular ability to access death could’ve been, for all we knew, lifted from a sci-fi novel.  Nevertheless, I believed him.

Ten.  That’s how old I was when I saw a dead body I wasn’t supposed to see.  A right turn on I-85, heading north, highway stretching to where sky and land sandwiched together.  I was in my school outfit, backpack in my lap.  Mom was in her tennis getup, checking the rearview mirror.  Traffic was slowing and stopping.  To my left was a vast gray sheet held up by blank-faced men.  Behind it, a woman.  Or what was left of a woman.  Arms and legs bent at impossible angles; head sagging, possibly unattached; a bloodied skirt lifted by the breeze.  Someone’s mom.  Or sister.  Or wife.  Or girlfriend.  Or daughter.  Here one minute, gone the next.  This wasn’t dignity.  This was mean and messy.

Death, they say, is not only universal but also the great leveler: it befalls kings and paupers, rich and poor, wise and foolish.  Solomon, Caesar, Constantine, Charlemagne, Napoleon: all died despite their glory in life.  What I never understood, and, frankly, still don’t, is why folks pretend death doesn’t happen.  The person who ignores death is delusional at best, narcissistic at worst.  Death is our sole commonality, the thing in this world we all await, about which we may commiserate.  It’s what makes us human.  I daresay one can’t fully love a person without knowing that person is temporary.

Francis Bacon once declared, “The contemplation of death, as the wages of sin, and passage to another world, is holy and religious; but the fear of it, as a tribute due unto nature, is weak.”  Weak it may be to the healthy and fit, but to the ill and ailing it seems only natural.  The person who claims he doesn’t fear death is either a liar or an incorrigible maniac—or else a coward, too faint-of-heart to face the facts.  Bacon himself had the good fortune of dying in two to three days, having contracted pneumonia while conducting an experiment in the snow.  Willfully blind to his fate, lying on his deathbed, he penned a letter to his friend, Thomas Howard, expressing relief that he hadn’t suffered the fortune of Caius Plinius, “who lost his life by trying an experiment about the burning of Mount Vesuvius.”

After surgery, I, like Bacon, was bedridden.  Soon a phone call would tell me one of two things: that I was okay, my cancer hadn’t metastasized, or else I wasn’t okay, I needed chemotherapy and my chances of living another two years were below fifteen percent.  A glued-together wound, resembling fat, blue, puckered-up lips, took up the length of my chest.  Visitors asked to see it and then regretted their request when I rolled up my shirt, revealing a moon-shaped, smurfy smile.  When the visitors left, and I was alone again, alone and quiet, I imagined what the malignancy would look like as it spread through my body, which I conceived of as a mini minefield: tunneled with small explosive cancer clusters about to be detonated.  How could this shell—which once ran a mile in under four-and-a-half minutes—expire?

I’m not in my brain but somewhere lower: near the chest, maybe, or the gut.  I couldn’t, for instance, stop a dream even if I wanted to.  Which is odd, because it’s my brain that’s dreaming—not someone else’s.  The brain works independently of me, or, to be precise, of what I perceive to be me: it’s like an unmanned motor boat zipping on the water.  Occasionally one of my siblings, or an old friend, will recall some long-ago event, which I’d otherwise forgotten, and then, suddenly, I’ll remember.  The brain has stored this memory somewhere—somewhere not readily accessible—but I, wherever I am in this shell, never felt compelled to find it.  The thought just exists up there, waiting.

It’s the soul, I suppose, that’s me.  When I lie awake at night and contemplate this interim body, which I inhabit the way a renter inhabits an apartment, I locate my self—that subjective knowing ego—whole and center, as though the brain, convenient as it is, has a mind of its own.  To be sure, I can borrow this organ when I study or otherwise require deep reflection; but when I tire of thinking, when I want a break, when I lean back from my desk, I’m very aware that I, my self, am moving from the head to just above the torso, where I belong.  And when I experience joy, compassion, anguish, despair—when, that is, I feel—it’s never with my head but with something deep within my bosom.  How does one explain this?  Perhaps we’re all antecedent to the body: little floating things confined to this definite, corporate form we didn’t choose, waiting, like thoughts, to be accessed—or released.

Opossums, more commonly known, in the South, as “possums,” are, I’m told, a delicacy.  Nina’s got a cookbook that says so, though she claims she’s never cooked or eaten one.  I have my doubts, since my dad grew up eating squirrel, which, I think, is more revolting because squirrels are cute and handsome, whereas possums have that eerie look I associate with demons and devils—and masks.

At seven, I persuaded my brother to take a life.  A possum’s life.  It was a horrible affair, really.  One that, even today, is difficult to own up to.  Brett, being the gullible little brother he was—I convinced him once that the shadow-puppet giant who lived on the ceiling would kill him in his sleep—stomped on a squeaking pile of pine-straw while I looked on, presumably to punish him if he disobeyed.  Of course, the squeaking didn’t belong to the pine-straw, but to a tiny nest of baby possums underneath.

For some reason, I was initially proud of what I’d done, and, hours later, said as much to my mom.  Horrified, she made me show her the nest, since I’d “cried-wolf” before.  Sure enough, there, in the pine-straw, lay a bloody baby possum, whimpering, dying.

My first defense was I hadn’t done anything.  Brett had.  I’d simply stood by and watched.  Mom was smarter than that.  I don’t remember what she said—only that, once she said it, I began to cry.  And couldn’t stop.

It was this event, this murder of an innocent, that brought about my general appreciation for original sin, or at least for the idea of innate human depravity.  Humans, you might say, are born rotten—so much so that most of us, in our youth, could stomp infant possums to death without understanding the wrongness of our action.  No doubt I regretted this behavior—this actus reus—but not because I felt guilty: it was, in effect, because I feared punishment—some combination of mom’s wrath and her spank-happiness.  A parent’s role is, among other things, to tame a child’s destructive impulses.  That’s what mom did—without succumbing to her own elemental aggressions.

She called the Chattahoochee Nature Center, a local environmental organization, and a worker there explained how to save the baby possum.  This, then, became my task, my agonizing punishment: to keep the possum alive.  Being intimate with death is one thing; being intimate with suffering quite another.  When I scooped the trembling creature up to my palm, it emitted a sad, pitiable squeak.  “Everything’s okay,” I whispered, “I’m not here to hurt you”—a funny assurance coming from the kid who’d just ordered its murder.

If truth be told, I wished I’d just destroyed the thing.  Better dead than in this wretched condition.  Still, the way it looked at me—its beady, searching eyes perusing my face—reminded me of how Ansley, my little sister, then only a year old, looked up at mom when she wanted to be fed.

I placed the creature in a shoe box, which I tucked beneath a shelf in my parents’ closet, the darkest place in the house.  More than anything, the possum needed darkness and silence.  I dug a hole in the backyard, tied two twigs together in the shape of a cross, and arranged a constellation of stones around what would’ve been a grave.  But the thing didn’t die.  It healed so well that, the next morning, it was squirming and scurrying and dad needed a net to contain it.  Even after the possum was free in the backyard, I left the grave untouched, a reminder that all things, even possums, eventually come to an end.

My Southern upbringing was all about learning how to die.  Like the Greek Stoics, Southerners believe in cultivating virtue, improving life, and, above all, accepting mortality.  Liberated from urban distractions, tied to land and home, they regard humans as custodians of the past; they keep gardens, preserve antiques, record lineage, mark battlefields, and salvage the efforts of planters, carpenters, raconteurs, and architects; they ensure, in short, the availability of history.  This can lead to nostalgia for times they never knew, bad times, ugly times, which is to say that this can cause Southerners to overlook—or, worse yet, revise—the inconveniences of history: slavery, for instance, or civil rights.  All the same, the Southern tradition, burdened as it is by various conflicts, retains virtues worth sustaining: community, family, religion, husbandry, stewardship.  These customs, however vulnerable, hardly need guardians.  They will, I suspect, persist, in some form or another, as long as humanity itself; for they are practical, permanent ideals—tested by generations—which people fall back on during disorienting times.  In a region haunted by racial brutality, these principles are, and have been, a unifying reference point, a contact zone where cultures—black, white, and Hispanic—share something spiritual despite their differences.

Living history, not just studying it, but consciously living it, is neither wicked nor wrong; the chronic, urgent awareness that everything you know and love will come undone, is not, I think, misguided, but utterly essential.  There’s something beautiful about facing the insurmountable.  When the world’s fleeting, death becomes a liberating, albeit terrifying, reality.  It throbs and pulsates and beats beneath the skin, inside of which we’re all raw skeleton.

For all this, however, I wasn’t ready.  Didn’t want to die.  Couldn’t even conceive of it.  The twenty-something years my family had been teaching me about death amounted to, not nothing, but not much, either.  Death, I suppose, is a hard thing to accept, and an even harder thing to fight, since fighting seems so pointless: deep down, you know you can’t win.  You might prevail once.  Maybe even twice.  But ultimately it’ll beat you.  It almost did me.

Friends ask how it feels to “beat” cancer.  I never can answer—not satisfactorily—for the experience is more like submission than competition: it’s a manifold process of coming to terms with the body, a thing doomed to decay.  When the doctor—Dr. Danaker—called to say the lymph nodes were benign, that the cancer hadn’t spread, I shocked him with a tired reply:  “Oh, good.”

“This is great news,” he assured me, as if I needed reminding, as if I hadn’t appreciated—indeed, hadn’t understood—how lucky I was.

“I know,” I said.

At this, the good doctor seemed annoyed.  “Ungrateful kid,” his tone implied.  But I wasn’t ungrateful.  Nor ecstatic.  I was, simply put, unbound—by life, by people, by things.  His take was that I had another chance, a fresh start, that I could put this nonsense behind me and move on.  My take was that, having embraced impermanence, I was done protecting myself from suffering, done seeking security through delusion, done dislocating from fate, destiny, providence, what have you.

Done: this, it is true, is weary resignation.  Yet it’s more than that: it’s a sweet but unhappy release, a deliverance, an unmasking.  Almost paradoxically, it’s freedom within—and despite—limitation.

What’s more exhilarating than that one should die?  What’s more mysterious, more horribly electrifying?  As one writer, Paul Theroux, has put it, “Death is an endless night so awful to contemplate that it can make us love life and value it with such passion that it may be the ultimate cause of all joy and all art.”  That is how you cope with this chilling, daunting, stupefying phenomenon: you do it every day until it’s serviceable and aesthetic, until at last you won’t know, can’t know, when it happens, until it’s pleasurable, a masterpiece, sublime in its regularity.  You keep it close, so close it becomes part of you, so close it’s at your disposal, so close that without it, you’re nothing, nothing if not boringly, thoughtlessly, mechanically alive, which is just another way of being dead.  You train and train and then it comes.

Pantry, 1982

In Arts & Letters, Creative Writing, Humanities, Poetry, Writing on July 24, 2013 at 8:45 am

Allen Mendenhall

 

A box of cereal, stale, ants running

Up the side, two brown bananas that

 

He says cleanse the pores

(If rubbed thoroughly),

 

An unwrapped chocolate bar

And a plethora of cans, unopened:

 

In a locked pantry, Little Maddy sits

Plucking the stems

 

Off Granny-Smiths.  Just ten more

Minutes.  Maddy, weary, wondering

 

Just when daddy would come home.

Time: the pantry is unlocked

 

And out comes light

And apples and, lastly, Maddy.

 

Daddy reaches

For the two rotting bananas,

 

Notes can upon unopened can,

Unwraps the chocolate bar,

 

Smears chocolate on his fingers,

Stops, thinks how unlikely it is

 

For apples to lose their stems.

Donna Meredith Reviews “Keep No Secrets,” by Julie Compton

In Arts & Letters, Books, Fiction, Humanities, Law, Law-and-Literature, Novels, Writing on July 17, 2013 at 8:45 am

Donna Meredith is a freelance writer living in Tallahassee, Florida. She taught English, journalism, and TV production in public high schools in West Virginia and Georgia for 29 years. Donna earned a BA in Education with a double major in English and Journalism from Fairmont State College, an MS in Journalism from West Virginia University, and an EdS in English from Nova Southeastern University. She has also participated in fiction writing workshops at Florida State University and served as a newsletter editor for the Florida State Attorney General’s Office. The Glass Madonna was her first novel. It won first place for unpublished women’s fiction in the Royal Palm Literary Awards, sponsored by the Florida Writers Association, and was runner-up in the Gulf Coast novel writing contest. Her second novel, The Color of Lies, won the gold medal for adult fiction in 2012 from the Florida Publishers Association and also first place in unpublished women’s fiction from the Florida Writers Association. Her latest book is nonfiction: Magic in the Mountains, the story of how a determined and talented woman revived the ancient art of cameo glass in twentieth-century West Virginia. She is currently working on a series of environmental thrillers featuring a female hydrogeologist as the lead character.

Julie Compton

Above: Julie Compton

The following review is appearing simultaneously in Southern Literary Review.

Keep No Secrets, Julie Compton’s powerful sequel to Tell No Lies, is guaranteed to keep readers turning pages into the wee hours of the morning. Both of Compton’s courtroom thrillers are set in St. Louis, Missouri, where she grew up.

Like Jodi Picoult’s best works, Compton’s novels sizzle with all the trust, betrayal, love, and forgiveness family relationships entail—especially when you expose their private conflicts in a public courtroom. Her books seem to pose this question: how well can you know even those people closest to you?

Read Tell No Lies first. Though the sequel provides enough backstory to be a great read on its own, without understanding the first book you’d miss the riveting psychological development of the primary characters, all of whom star in the sequel as well.

In Tell No Lies, idealistic lawyer Jack Hilliard leaves behind a lucrative private practice to run for district attorney. The plot centers on a high-profile murder case. Jack is easy to like because he tries so hard to do the right thing. But there wouldn’t be a story if he were perfect. He yields to one temptation, which hurls his life into a downward spiral that nearly ends his marriage and his career.

The final plot twist leaves you wondering if Jack has been manipulated. Compton is that rare author who trusts her readers’ intelligence. She allows us to figure things out for ourselves, to experience the same doubts as Jack Hilliard. It makes the novel more like our own lives, where we can’t always tell what people’s motives are or know when they are lying.

Keep No Secrets begins four and a half years after the events of Tell No Lies. During that time, Jack Hilliard has worked arduously to repair the damage caused by his mistakes—and has largely succeeded. Until the night he finds his teenage son Michael having sex with his girlfriend. They are drunk. Being a white knight kind of guy, Jack gives the girl a ride home. In an effort to win back his son’s love and respect, Jack doesn’t tell his wife about Michael’s transgressions. That car ride sets off an unforeseeable chain of events that threaten to wreck Jack’s career and marriage once again.

Think that’s enough dirt to dump on a nice guy like Jack? Not a chance. The already untenable situation deteriorates further when Jenny Dodson, the woman involved in his earlier downfall, reappears after all these years, asking for his help. He can’t say no, but he vows to keep his wife truthfully informed of everything that happens. He does. Sort of. “The lies aren’t what he says; they’re what he doesn’t say”—this is a refrain Compton artfully employs several times.

This novel deals with social issues like the impact of adultery and sexual assault on families. Most readers will put themselves in the characters’ situations and ask whether they would have behaved differently. Would we lie to protect a loved one? What if we knew something that would put someone we love in jail or in danger? Would we tell the truth? What if not telling keeps an innocent person imprisoned? How far should we trust the legal system? If a spouse gave us reason to doubt, could we forgive and trust again? When is it time to give a marriage another chance—and when is it time to walk away?

Compton’s novels are as fine as any courtroom thrillers out there. Though her use of present tense can be a bit distracting, the well-plotted series sparkles with psychologically complex characters.

For both undergraduate work and law school, Compton attended Washington University in Missouri. She began her legal career there, but last practiced in Wilmington, Delaware, as a trial attorney for the U.S. Department of Justice. She now lives near Orlando with her husband and two daughters and writes full-time. She is also the author of Rescuing Olivia, a novel of suspense, romance, and family drama.


Donna Meredith

Pragmatists Versus Agrarians?

In America, American History, Arts & Letters, Book Reviews, Books, Conservatism, Emerson, History, Humanities, Liberalism, Literary Theory & Criticism, Literature, Nineteenth-Century America, Philosophy, Politics, Pragmatism, Southern History, Southern Literature, Western Civilization, Western Philosophy, Writing on June 19, 2013 at 8:45 am

Allen Mendenhall

This review originally appeared here at The University Bookman.

John J. Langdale’s Superfluous Southerners paints a magnificent portrait of Southern conservatism and the Southern Agrarians, and it will become recognized as an outstanding contribution to the field of Southern Studies. It charts an accurate and compelling narrative regarding Southern, Agrarian conservatism during the twentieth century, but it erroneously conflates Northern liberalism with pragmatism, muddying an otherwise immaculate study.

Langdale sets up a false dichotomy as his foundational premise: progressive, Northern pragmatists versus traditionalist, Southern conservatives. From this premise, he draws several conclusions: that Southern conservatism offers a revealing context for examining the gradual demise of traditional humanism in America; that Northern pragmatism, which ushered in modernity in America, was an impediment to traditional humanism; that “pragmatic liberalism” (his term) was Gnostic insofar as it viewed humanity as perfectible; that the man of letters archetype finds support in Southern conservatism; that Southern conservatives eschewed ideology while Northern liberals used it to present society as constantly ameliorating; that Southern conservatives celebrated “superfluity” in order to preserve canons and traditions; that allegedly superfluous ways of living were, in the minds of Southern conservatives, essential to cultural stability; that Agrarianism arose as a response to the New Humanism; and that superfluous Southerners, so deemed, refined and revised established values for new generations.

In short, his argument is that Southern conservatives believed their errand was to defend and reanimate a disintegrating past. This belief is expressed in discussion of the work of six prominent Southern men of letters spanning two generations: John Crowe Ransom, Donald Davidson, Allen Tate, Cleanth Brooks, Richard Weaver, and M. E. Bradford.

Langdale ably demonstrates how the Southern Agrarians mounted an effective and tireless rhetorical battle against organized counterforces, worried that scientific and industrial progress would replace traditional faith in the unknown and mysterious, and fused poetry and politics to summon forth an ethos of Romanticism and chivalry. He sketches the lines of thought connecting the earliest Agrarians to such later Southerners as Weaver and Bradford. He is so meticulous in his treatment of Southern conservatives that it is surprising the degree to which he neglects the constructive and decent aspects of pragmatism.

Careful to show that “Agrarianism, far from a monolithic movement, had always been as varied as the men who devised it,” he does not exercise the same fastidiousness and impartiality towards the pragmatists, who are branded with derogatory labels throughout the book even though their ideas are never explained in detail. The result is a series of avoidable errors.

First, what Langdale treats as a monolithic antithesis to Southern conservatism is actually a multifaceted philosophy marked by only occasional agreement among its practitioners. C. S. Peirce was the founder of pragmatism, followed by William James, yet Peirce considered James’s pragmatism so distinct from his own that he renamed his philosophy “pragmaticism.” John Dewey reworked James’s pragmatism until his own version retained few similarities with James’s or Peirce’s. Oliver Wendell Holmes Jr. never identified himself as a pragmatist, and his jurisprudence is readily distinguishable from the philosophy of Peirce, James, and Dewey. Each of these men had nuanced interpretations of pragmatism that are difficult to harmonize with each other, let alone view as a bloc against Southern, traditionalist conservatism.

Second, the Southern Agrarians espoused ideas that were generally widespread among Southerners, embedded in Southern culture, and reflective of Southern attitudes. By contrast, pragmatism was an academic enterprise rejected by most Northern intellectuals and completely out of the purview of the average Northern citizen. Pragmatism was nowhere near representative of Northern thinking, especially not in the political or economic realm, and it is hyperbolic to suggest, as Langdale does, that pragmatism influenced the intellectual climate in the North to the extent that traditionalist conservatism influenced the intellectual climate in the South.

Third, the pragmatism of Peirce and James is not about sociopolitical or socioeconomic advancement. It is a methodology, a process of scientific inquiry. It does not address conservatism per se or liberalism per se. It can lead one to either conservative or liberal outcomes, although the earliest pragmatists rarely applied it to politics as such. It is, accordingly, a vehicle to an end, not an end itself. Peirce and James viewed it as a technique to ferret out the truth of an idea by subjecting concrete data to rigorous analysis based on statistical probability, sustained experimentation, and trial and error. Although James occasionally undertook to discuss political subjects, he did not treat pragmatism as the realization of political fantasy. Pragmatism, properly understood, can be used to validate a political idea, but does not comprise one.

The Southern Agrarians may have privileged poetic supernaturalism over scientific inquiry; it does not follow, however, that pragmatists like Peirce and James evinced theories with overt or intended political consequences aimed at Southerners or traditionalists or, for that matter, Northern liberals. Rather than regional conflict or identity, the pragmatists were concerned with fine-tuning what they believed to be loose methods of science and epistemology and metaphysics. They identified with epistemic traditions of Western philosophy but wanted to distill them to their core, knowing full well that humans could not perfect philosophy, only tweak it to become comprehensible and meaningful for a given moment. On the other hand, the Southern Agrarians were also concerned with epistemology and metaphysics, but their concern was invariably colored by regional associations, their rhetoric inflected with political overtones. Both Southern Agrarians and pragmatists attempted to conserve the most profitable and essential elements of Western philosophy; opinions about what those elements were differed from thinker to thinker.

Fourth, Langdale’s caricature (for that is what it is) of pragmatism at times resembles a mode of thought that is alien to pragmatism. For instance, he claims that “pragmatism is a distinctly American incarnation of the historical compulsion to the utopian and of what philosopher Eric Voegelin described as the ancient tradition of ‘gnosticism.’” Nothing, however, is more fundamental to pragmatism than the rejection of utopianism or Gnosticism. That rejection is so widely recognized that even Merriam-Webster lists “pragmatism” as an antonym for “utopian.”

Pragmatism is against teleology and dogma; it takes as its starting point observable realities rather than intangible, impractical abstractions and ideals. What Langdale describes is more like Marxism: a messianic ideology with a sprawling, utopian teleology regarding the supposedly inevitable progress of humankind.

Given that pragmatism is central to his thesis, it is telling that Langdale never takes the time to define it, explain the numerous differences between leading pragmatists, or analyze any landmark pragmatist texts. The effect is disappointing.

Langdale’s approach to “superfluity” makes Superfluous Southerners the inverse of Richard Poirier’s 1992 Poetry and Pragmatism: whereas Langdale relates “superfluity” to Southern men of letters who conserve what the modern era has ticketed as superfluous, Poirier relates “superfluity” to Emerson and his literary posterity in Robert Frost, Gertrude Stein, Wallace Stevens, T. S. Eliot, William Carlos Williams, and Ezra Pound. Both notions of superfluity contemplate the preservation of perennial virtues and literary forms; one, however, condemns pragmatism while the other applauds it.

For both Langdale and Poirier, “superfluity” is good. It is not a term of denunciation as it is usually taken to be. Langdale cites Hungarian sociologist Karl Mannheim to link “superfluity” to traditionalists who transform and adapt ideas to “the new stage of social and mental development,” thus keeping “alive a ‘strand’ of social development which would otherwise have become extinct.”

Poirier also links superfluity to an effort to maintain past ideas. His notion of “superfluity,” though, refers to the rhetorical excesses and exaggerated style that Emerson flaunted to draw attention to precedents that have proven wise and important. By reenergizing old ideas with creative and exhilarating language, Emerson secured their significance for a new era. In this respect, Emerson is, in Poirier’s words, “radically conservative.”

Who is right? Langdale or Poirier? Langdale seeks to reserve superfluity for the province of Southern, traditionalist conservatives. Does this mean that Poirier is wrong? And if Poirier is right, does not Langdale’s binary opposition collapse into itself?

These questions notwithstanding, it is strange that Langdale would accuse the Emersonian pragmatic tradition of opposing that which, according to Poirier, it represents. Although it would be wrong to call Emerson a political conservative, he cannot be said to lack a reverence for history. A better, more conservative criticism of Emerson—which Langdale mentions in his introduction—would involve Emerson’s transcendentalism that promoted a belief in innate human goodness. Such idealism flies in the face of Southern traditionalism, which generally abides by the Augustinian doctrine of innate human depravity and the political postures appertaining thereto.

What Langdale attributes to pragmatism is in fact a bane to most pragmatists. A basic tenet of pragmatism, for instance, is human fallibilism, which is in keeping with the doctrine of innate human depravity and which Peirce numbered among his reasons for supporting the scientific method. Peirce’s position is that one human mind is imperfect and cannot by itself reach trustworthy conclusions; therefore, all ideas must be filtered through the logic and experimentation of a community of thinkers; a lasting and uniform consensus is necessary to verify the validity of any given hypothesis. This is, of course, anathema to the transcendentalist’s conviction that society corrupts the inherent power and goodness of the individual genius.

Langdale’s restricted view of pragmatism might have to do with unreliable secondary sources. He cites, of all people, Herbert Croly for the proposition that, in Croly’s words, “democracy cannot be disentangled from an aspiration toward human perfectibility.” The connection between Croly and pragmatism seems to be that Croly was a student of James, but so was the politically and methodologically conservative C. I. Lewis. And let us not forget that the inimitable Jacques Barzun, who excoriated James’s disciples for exploiting and misreading pragmatism, wrote an entire book—A Stroll with William James—which he tagged as “the record of an intellectual debt.”

Pragmatism is a chronic target for conservatives who haven’t read much pragmatism. Frank Purcell has written in Taki’s Magazine about “conservatives who break into hives at the mere mention of pragmatism.” Classical pragmatists are denominated as forerunners of progressivism despite having little in common with progressives. The chief reason for this is the legacy of John Dewey and Richard Rorty, both proud progressives and, nominally at least, pragmatists.

Dewey is, after James, arguably the most recognizable pragmatist, and it is his reputation, as championed by Rorty, that has done the most to generate negative stereotypes and misplaced generalizations about pragmatism. Conservatives are right to disapprove of Dewey’s theories of educational reform and social democracy, yet he is just one pragmatist among many, and there are important differences between his ideas and the ideas of other pragmatists.

In fact, the classical pragmatists have much to offer conservatives, and conservatives—even the Southern Agrarians—have supported ideas that are compatible with pragmatism, if not outright pragmatic. Burkean instrumentalism, committed to gradualism and wary of ideological extremes, is itself a precursor to social forms of pragmatism, although it bears repeating that social theories do not necessarily entail political action.

Russell Kirk’s The Conservative Mind traces philosophical continuities and thus provides clarifying substance to the pragmatist notion that ideas evolve over time and in response to changing technologies and social circumstances, while always retaining what is focal or fundamental to their composition. The original subtitle of that book was “From Burke to Santayana,” and it is remarkable, is it not, that both Burke and Santayana are pragmatists in their own way? Santayana was plugged into the pragmatist network, having worked alongside James and Josiah Royce, and he authored one of the liveliest expressions of pragmatism ever written: The Life of Reason. Although Santayana snubbed the label, general consensus maintains that he was a pragmatist. It is also striking that Kirk places John Randolph of Roanoke and John C. Calhoun, both Southern conservatives, between these pragmatists on his map of conservative thought. There is, in that respect, an implication that pragmatism complements traditionalism.

Langdale relies on Louis Menand’s outline of pragmatism and appears to mimic Menand’s approach to intellectual history. It is as though Langdale had hoped to write the conservative, Southern companion to The Metaphysical Club. He does not succeed because his representation of pragmatism is indelibly stamped by the ideas of Rorty, who repackaged pragmatism in postmodern lexica. Moreover, Langdale’s failure or refusal to describe standing differences between the classical pragmatists and neo-pragmatists means that his book is subject to the same critique that Susan Haack brought against Menand.

Haack lambasted Menand for sullying the reputation of the classical pragmatists by associating pragmatism with nascent Rortyianism—“vulgar Rortyianism,” in her words. Langdale seems guilty of this same supposition. By pitting pragmatism against Southern conservatism, he implies that Southern conservatism rejects, among other features, the application of mathematics to the scientific method, the analysis of probabilities derived from data sampling and experimentation, and the prediction of outcomes in light of statistical inferences. The problem is that the Agrarians did not oppose these things, although their focus on preserving the literary and cultural traditions of the South led them to express their views through poetry and story rather than as philosophy. But there is nothing in these methods of pragmatism (as opposed to the uses to which some later pragmatists may have put them) that is antithetical to Southern Agrarianism.

Superfluous Southerners is at its best when it sticks to its Southern subjects and does not undertake comparative analyses of intellectual schools. It is at its worst when it resorts to incorrect and provocative phrases about “the gnostic hubris of pragmatists” or “the gnostic spirit of American pragmatic liberalism.” Most of its chapters do a remarkable job teasing out distinctions between its Southern conservative subjects and narrating history about the Southern Agrarians’ relationship to modernity, commitment to language and literature, and role as custodians of a fading heritage. Unfortunately, Langdale’s book confounds the already ramified philosophy known as pragmatism, at the expense of the Southern traditionalism that he and I admire.

Bartram’s Travels and the Erotica of Nature

In America, American History, Arts & Letters, History, Humanities, Literary Theory & Criticism, Literature, Philosophy, Southern History, The South, Writing on May 29, 2013 at 8:45 am

Allen Mendenhall

This post first appeared here at the Literary Table in 2010.

I’ll limit my discussion of Bartram’s cognitive originality to some finer points made by Michael Gaudio, whose article, “Swallowing the Evidence,” is a mostly on-the-mark interrogation of Bartram’s persistent use of metaphor.

Gaudio writes that Bartram’s Travels, with its imagery of swallowing, mouths, and voids, calls into question Enlightenment aesthetics while signaling glaring absences in the putatively public sphere. Although Gaudio argues convincingly that Bartram’s imagery signifies an “Enlightenment view of the cosmos in which the natural and the social operate according to the same rational principles,” he privileges a political over an erotic reading, thereby reducing the text to a series of subversive patterns of visual perception. In fact, Bartram’s text is less about movement politics than it is about scientific or social politics.

Travels describes a journey from 1773 to 1777, arguably the most intense moment in American political history, yet Bartram makes no mention of the Revolution, the Continental Congress, the Declaration of Independence, or any other political signifier. As the war between Britain and America raged, Bartram rummaged through woods recording data and collecting specimens. He might have been interested in undermining Enlightenment ideals, as Gaudio suggests, but he probably was not keen on likening sinkholes to doubts about the democratic project. A better reading would treat Bartram’s concave, hollow, and gaping imagery as vaginal and his nature aesthetics as sexual. Such a reading not only sheds light on Bartram’s aesthetic facility but also gives rise to a better reading of Bartram’s politics as understood through depictions of Natives, black men, or property-owning colonials. Gaudio is right to argue that, for Bartram, “the work of the naturalist is the recording of not only the visibility of nature’s surfaces but also the struggle that leads to that visibility,” but he is wrong to ignore the language of penetration and other pseudo-sexual insinuations. Attending to this sexual language might have allowed Gaudio to enlist Bartram in the “anti-Enlightenment” project in other, more interesting ways—for instance, by contrasting Bartram’s observations of Indian tribes with the unwarranted assumptions of Enlightenment thinkers who dismissed Natives as mere barbarians or worse.

Gaudio submits that because Bartram’s aim was to “exhibit the self-evidence of nature” and to “set the full presence of its surfaces before the viewer,” Bartram’s appeals were necessarily visual. That much, I think, we can grant. But Gaudio goes too far when he contrasts Bartram with Bacon by claiming that the latter employed “rhetoric of penetration” to peer beneath nature’s surfaces whereas Bartram looked precisely to nature’s surfaces because he preferred architectural forms to dissected taxonomies. Gaudio suggests, in other words, that Bartram seeks out rational forms, which share a visual logic, to show nature’s uniform and universal manifestations. Nevertheless, Bartram’s rhetoric (like Bacon’s) is rich in references to penetration. Gaudio’s formative analogy therefore does not stand up to close examination.

“Having some repairs to make in the tackle of my vessel, I paid my first attention to them,” Bartram says of a particularly cheerful morning, adding, “my curiosity prompted me to penetrate the grove and view the illuminated plains.” Similarly, Bartram speaks of “penetrating the groves,” “penetrating the Canes,” “penetrating the forests,” penetrating the “first line” of alligators, “penetrating a thick grove of oaks,” and penetrating “the projecting promontories.” All of this penetration flies in the face of Gaudio’s argument that Bartram’s “voids” signal the limits of Enlightenment thought. Rather than avoiding vocabulary of penetration, Bartram embraces it. Bartram may be interested in surfaces, but he is also interested in—one might say seduced by—what lies beneath. He even employs sexual innuendo and other erotic lexica to portray what lies beneath.

The sexual language in Travels serves to eroticize nature, which seduces with its enchanting if virginal charms. In a brilliant essay, Thomas Hallock speaks of botanic men (including William Bartram’s father, John) who turned “genteel ladies into fascinated subjects.” For these men, plants “served as a shorthand for intimate relationships that were transacted across vast space.” According to this logic, it follows that any “individual who interacts with the natural world takes on an ‘ecopersona,’ an identity or costume of manners that locates consumption of the natural within a given cultural code.” By ignoring the eros pouring forth from Bartram’s nature writings, Gaudio overlooks a very telling association between Native women, whom Bartram eroticizes, and nature, itself a sensual “organism.” More to the point, he misses Bartram’s odd constructions of eco-personae for Native women. Indeed, Bartram forges an association between nature and Native women in his “sylvan scene of primitive innocence,” which was “enchanting” and “perhaps too enticing for hearty young men long to continue idle spectators.”

In what Bartram calls a “joyous scene of action,” nature (read: passion) prevails over reason and European men are drawn helplessly—as if by Sirens—to the Native “nymphs” guarded by “vigilant” and “envious” matrons. The Native women are sensual and seductive because they seem in tune with Nature and the “Elysian fields.” In light of this analogy, Bartram speaks of Natives as “amorous topers,” “amorous and bacchanalian” dancers, amorous singers, and amorous and intriguing wives, just as he speaks of the “sweet enchanting melody of the feathered songsters” in their “varied wanton amorous chaces,” or of the “soothing love lays of the amorous cuckoo.” That is to say, Bartram effectively ties Native women to the carnal cravings of animal lust. For this reason, the desire to penetrate takes on a much stronger meaning than the one Gaudio describes vis-à-vis Bacon—it becomes not just about examinations of exterior surfaces but about the physical need and urge to thrust right through surfaces.

The land on and adjacent to a particular river “appears naturally fertile,” Bartram declares, “notwithstanding its arenaceous surface.” Surfaces can be deceiving, so Bartram digs deeper, so to speak, and identifies their sexual and reproductive possibilities. Similarly, he likens “many acres of surface” to a “delusive green wavy plain of the Nymphae Nelumbo,” a plant that represents sexual purity or virginity. In these and other instances, Bartram renders nature as a playground of erotic spaces for male pleasure. Simply put, Bartram’s nature is fertile and stimulates sexual arousal.

If, for Bartram, Native women were in harmony with nature and so were fertile and seductive—if they were hypersexualized—then Gaudio could have done far more with the vaginal motifs in Travels. Like countless others, he could have called into question the tropes, male gazing, and sexual power plays at work in the book and thereby achieved a “political” reading actually supported by the text. Gaudio is at his best when bringing to light metaphors that would seem easy to overlook, but his analysis fails for disregarding the obvious sexual and vaginal connotations evoked by these metaphors. At worst, his analysis fails for pivoting on a major assumption—that Bartram limited his analysis to surfaces and exteriors without regard to “the insides.” If anything, Bartram seems even more interested in “the insides” given his sexual renderings of a nature that invites penetration and carnal exploration.

See the following articles for more reading:

Abrams, Ann Uhry. The Pilgrims and Pocahontas: Rival Myths of American Origin. Boulder: Westview, 1999.

Bartram, William. The Travels of William Bartram. Ed. Mark Van Doren. New York: Dover Publications, 1928.

Fischer, Kirsten. “The Imperial Gaze: Native American, African American, and Colonial Women in European Eyes.” A Companion to American Women’s History. Blackwell Publishing, 2002.

Fleming, E. McClung. “The American Image as Indian Princess.” Winterthur Portfolio 2 (1965): 65-81.

Gaudio, Michael. “Swallowing the Evidence: William Bartram and the Limits of Enlightenment.” Winterthur Portfolio 36.1 (2001): 1-17.

Hallock, Thomas. “Male Pleasure and the Genders of Eighteenth-Century Botanic Exchange: A Garden Tour.” The William and Mary Quarterly 62.4 (2005): 32 pars. 13 Oct. 2009.

Schoelwer, Susan Prendergast. “The Absent Other.” Discovered Lands, Invented Pasts. Yale University Press, 1992.