
Archive for the ‘Communication’ Category

The News Makes You Dumb

In America, Arts & Letters, Books, Communication, Humanities, Literature, News and Current Events, Writing on August 19, 2020 at 6:45 am

This piece originally appeared here in Public Discourse.

A pernicious notion seems to have settled into the minds of my generation (I’m 37) when we were little boys and girls. It’s now an unquestioned “fact” that “staying informed,” “staying engaged,” and “following the news” are the obligatory duties of sensible, responsible people.

They’re not.

Reading and watching the news isn’t just unhelpful or uninstructive; it inhibits real learning, true education, and the rigorous cultivation of serious intellectual curiosity.

Simply Gathering Information Is Not Educational

When I was a child, my parents, quite rightly, restricted my television viewing. I could not, for instance, watch television after 5:00 p.m. or for more than an hour on weekdays. (Saturday morning cartoons ran for a permissible two hours, before my parents arose from bed.)

The glaring exception to these rules was “the news.” Watching the evening news was for my family a ritual in information gathering, the necessary means of understanding “current events.” Whatever else people said of it, the news was, by all accounts, educational.

Was it, though? U.S. Supreme Court Justice Oliver Wendell Holmes, Jr. famously refused to read newspapers. In The Theory of Education in the United States, Albert Jay Nock bemoaned “the colossal, the unconscionable, volume of garbage annually shot upon the public from the presses of the country, largely in the form of newspapers and periodicals.” His point was that a societal emphasis on literacy was by and large ineffectual if the material that most people read was stupid and unserious. Does one actually learn by reading the cant and carping insolence of the noisy commentariat?

“Surely everything depends on what he reads,” Nock said of the average person, “and upon the purpose that guides him in reading it.” What matters is not that one reads but what and how one reads. “You can read merely to pass the time,” the great Harold Bloom remarked, “or you can read with an overt urgency, but eventually you will read against the clock.”

The heart beats only so many beats; in one life, a person can read only so much. Why squander precious minutes reading mediocre scribbling or watching rude, crude talking heads debate transitory political matters of ultimately insignificant import, when instead, in perfect solitude, you could expand your imagination, nurture your judgment and discernment, refine your logic and reasoning, and purge yourself of ignorance by pursuing wisdom and objective knowledge through the canon of great literature, with a magnanimous spirit of openness and humility?

Why let obsequious, unlettered journalists on CNN, Fox News, or MSNBC shape your conscience, determine your beliefs, or develop your dependency on allegedly expert opinion, as if you were a docile creature lacking the courage to formulate your own ideas, when you could, instead, empower yourself through laborious study, exert your own understanding, and free yourself from the cramped cage of contemporary culture by analyzing past cultures, foreign places, difficult texts, and profound ideas?

The Demise of Journalism

When I was in college, not so long ago, you could still find semicolons in The New York Times. I’m told they surface there every now and then, but journalistic writing, as a whole, across the industry, is not what it once was. I’m being hyperbolic, of course, and am not so pedantic as to link semicolon usage with across-the-board journalistic standards. Besides, the Kurt Vonneguts of the world would have been pleased to be rid of semicolons. All I’m saying is that popular media should be more challenging if it’s to have far-reaching, salubrious effects. Newspaper writing, print or online, seems to have dumbed down to the point of harming rather than helping society writ large, and the opinions aired on television and radio seem to have attached themselves to one political party or another rather than liberating themselves from groupthink and stodgy consensus.

Reading as an activity should lift us up, not drag us down. It should inspire and require us to improve our cognitive habits and performance. The same goes for listening: how we listen and what we listen to affect our basic competency and awareness.

Not only have the grammar, vocabulary, and syntax displayed in “the news” diminished in sophistication, in print and on television and radio alike, but the principal subject matter has also moved from the complex and the challenging to the easy and simplistic. Media coverage focuses predominantly on contemporary partisan politics that demand minimal cognitive energy.

There’s a reason why so many people pay attention to politics: it just isn’t that difficult to think about or discuss. It doesn’t demand rational labor or arduous engagement. It can be passively absorbed. Ratings of television news would not be so high if its content weren’t so simplistic and easy to process. People watch the news to take a break or relax, or to get a rise out of eye-catching scandals and circumstances. The distinction between journalism and tabloid journalism has blurred beyond recognition. In short, journalism is a dying art.

Dangers of a Digital Age

Smart phones and social media are part of the problem. Every age has anxieties about technology. We shouldn’t blame smart phones and social media for human sins. The discourse, not the medium through which it circulates, ultimately is the problem. Yet it’s a problem that smart phones and social media have enabled in a way that past technologies could not. To air an opinion, anyone anywhere can simply tweet or post on Facebook without channeling the message through editors or other mediators.

Digital and smart devices have accelerated editorial processes. The never-ending race to publish “breaking” news results in slipshod work. Online reporting is full of typos and errors. A few clever reporters employ terms like Orwellian, Kafkaesque, Machiavellian, or Dickensian to give the impression of literacy, but the truly literate aren’t fooled.

Have journalistic practices and standards declined as literacy rates have risen? Does an increase in readership necessitate a reduction in quality? Do editors and publishers compete for the lowest common denominator, forgoing excellence and difficulty in order to achieve broad appeal?

Demanding stories and accounts that enrich reading habits and exercise mental faculties aren’t merely salacious or sensationalized clickbait, so they’re difficult to find these days unless you already know where to look.

In the 1980s, E. D. Hirsch, Jr. could write with confidence that newspapers assumed a common reader, i.e., “a person who knows the things known by other literate persons in the culture.” Neither journalists nor their readers today, however, seem literate in the traditional sense of that term. The culture of literacy—true literacy, again in the traditional sense of that term—has come under attack by the very scholars and professors who should be its eager champions.

Our popular pundits, mostly hired guns, supply unqualified, cookie-cutter answers to often manufactured problems; their job is not to inform but to entertain a daft and credulous public. “The liberally educated person,” by contrast, is, according to Allan Bloom, “one who is able to resist the easy and preferred answers, not because he is obstinate but because he knows others worthy of consideration.”

Seek Wisdom and Discernment over Politics and Personal Preference

If we wish to consume the news, we should treat it as junk food. The human body cannot healthily sustain itself on candy bars alone. It requires a balanced diet, nutrition, and exercise. So it is with the mind. Fed only junk, it’s malnourished.

Every now and then we may indulge the vice of chocolate or soda without impairing our overall, long-term health. Likewise we may watch without permanent or severe detriment the screeching cacophonies of semiliterate blatherskites like Sean Hannity, Wolf Blitzer, Chris Wallace, Anderson Cooper, Tucker Carlson, Jake Tapper, or, heaven help us, the worst of the worst, Chris Cuomo.

Just know that during the hour spent watching these prattling performers present tendentious interpretations of fresh facts, militantly employing tedious details to serve ideological narratives, you could have read an informative book that placed the applicable subject matter into illuminating historical and philosophical context. The facts may be simple and quick, but interpreting them requires knowledge of the past, including the complexities and contingencies of the relevant religious movements, geographies, anthropologies, governments, literatures, and cultures. Devouring ephemeral media segments and sound bites in rapid succession is not learning. It is gluttonous distraction.

Do not misunderstand me: I do not advocate a Luddite lifestyle or a withdrawal from society and the workaday world. I just mean that too many of us, too much of the time, are enthralled by fleeting media trifles and trivialities, and ensnared in the trap of mindless entertainment disguised as vigorous edification.

Let’s stop telling little children what my generation heard when we were kids. They should stay away from the news lest they fall prey to its mania, foolishness, and stupidity. They should read books—difficult books—and be challenged to improve themselves and refine their techniques. Rather than settling on easy, preferred answers, they should accept tensions and contingencies, suspending judgment until all angles have been pursued and all perspectives have been considered. Let’s teach them to become, not activists or engaged citizens necessarily, but intelligent human beings who love knowledge and learning, and who pursue wisdom and discernment before mundane politics.

Review of Adam J. MacLeod’s “The Age of Selfies”

In Academia, Arts & Letters, Books, Civics, Communication, Humanities, Rhetoric, Rhetoric & Communication on August 5, 2020 at 6:45 am

This review originally appeared here in The University Bookman.

Salma Hayek makes headlines each time she posts a selfie on Instagram. I know this because years ago I set a “Google alert” for the name “Hayek” so that I wouldn’t miss new articles about the great economist and legal theorist Friedrich Hayek. Now, for better or worse, Salma Hayek updates from around the Internet appear in my inbox every morning. We truly live in the Age of Selfies.

That’s the title of the latest book by my colleague, Adam MacLeod, a professor of law at Faulkner University Thomas Goode Jones School of Law in Montgomery, Alabama. The Age of Selfies is a quick read with a straightforward argument about the importance of reasonable, principled disagreement to our civic discourse, institutions, and education. Underlying our passionate disagreements about fundamental principles, MacLeod suggests, is an abiding agreement about the reality of right and wrong, good and bad, truth and error. We quarrel over political issues, he claims, because we hold sincere beliefs about what is or is not moral, presupposing that morality is not only existent but knowable. Most of us, anyway, reject nihilism. Effective, constructive disagreement is, therefore, possible among those who realize this central commonality that holds together otherwise incompatible convictions.

“That many of us speak and act as if moral and political questions have right and wrong answers,” says MacLeod, “indicates that, for all of our fractious disagreement, a consensus is emerging that there is moral truth—right and wrong—about questions that occupy our public discourse.” You would be correct if you guessed that the New Natural Law (an Aristotelian-Thomistic approach to jurisprudence popularized by John Finnis and Robert P. George) rests beneath the surface of this seeming optimism. To oversimplify, the new natural lawyers posit that practical reasoning enables us to recognize and pursue ends that are intrinsically good and desirable, and that, moreover, fulfill our rational nature as human beings (“every person you encounter,” explains MacLeod, “is an agent of reason and reasoned choice”). It is only a small step, from there, to propose that sensible human beings of good faith can reason together to achieve workable peace and productive civility regarding even controversial matters involving, say, marriage or abortion.

Are the new natural lawyers correct about human nature? Is the human capacity for reason overstated? Do the horrors of the French Revolution caution against the Cult of Reason? What if David Hume was right that reason is the slave of the passions? What if the mind is inherently limited, its memory only partial and selective and its understanding of truth necessarily circumscribed? What if we see only through a glass darkly even if we follow the light of the world? What if many philosophical positions are merely pretextual rather than genuine? What if they are expounded solely and perversely for political power or personal gain? What if their very terms reject compromise, dissent, or negotiation? What if most people are unreasonable and irrational, motivated more by passion and emotion than by logic and good sense? What if the ordinary response to conflict is anger and outrage rather than patient contemplation? What if hubris is more common than humility? I don’t know the answers to these questions, but, whatever they are, they could diminish the force of MacLeod’s arguments.

Yet they are great and hopeful arguments, set against the fashionable notion that what “we” are is simply “a collection of selfies, which are carefully crafted, externally projected images of individual self-constitution.” A person identifies himself or herself—or itself or they or whatever—with community brands (and the concepts they entail) without subscribing or adhering to the principles, doctrines, or teachings that define and govern that community. So, for instance, one can, today, identify as both a Muslim and a Christian even if those two religions are by their own tenets mutually exclusive. Who are we, the naysayers, to criticize this apparent contradiction if it feels authentic to the person professing it?

When people argue over the meaning of a guiding externality—a religious text, a statute, the language of a constitution, a novel—their interpretive differences are rooted in a common source (the document under consideration). MacLeod calls this common source a “neutral ground.” In a sense, MacLeod’s book is, more or less, an attempt to supply “neutral ground” where it is currently lacking, pointing out where people of differing viewpoints agree about the primacy and reality of morality itself.

When people argue, however, over internalities—that is, purely subjective preferences, emotions, or feelings—there is no common source, no independently measurable basis for assessing the validity or invalidity of the views a person embraces. The fact that a person holds them is supposed to suffice by itself. “The fundamental problem is that, on the whole,” MacLeod submits, “young people have made their moral reasoning thoroughly personal.” Accordingly, “[w]hat matters most to them—the only thing that matters to some of them—is that they are true to themselves.”

The ultimate wrong, according to someone who thinks along these lines, is to be judgmental or discriminating or otherwise unaccepting of the allegedly authentic identity of another. The supposedly non-judgmental person nevertheless believes that some actions are not okay, are out of bounds, or, to employ moral vocabulary, wrong. To condemn a person as judgmental is, after all, to express a judgment, to call someone else wrong. Relativism isn’t at play. To deem someone else’s judgment wrong is to suggest that a different judgment is right.

What is to be done about this muddle? This question is a variation on what MacLeod dubs “The Practical Question”: “What shall I do?” Every thinking human being must ask The Practical Question before acting to fulfill an objective. For starters, we can stop treating the past as a monolithic category of horrible wrongs and instead mine it for the good, the beautiful, and the useful. Rather than dismissing all history outright, we should wrestle with it, searching out examples and analyzing its tensions and contradictions. For an audience of teachers and students, this means working through disagreement and accommodating diverse viewpoints for the sake of clarity and understanding—not because each view is equally strong or valid but because the test of its strength or validity depends upon its being studied, weighed, and refuted.

The nature of rights and duties, the meaning and idea of truth, the concept of sin, the power of indifference—these and other subjects prompt MacLeod into showing that dialogue and conversation break down when, instead of enumerating reasons and arguments in favor of some belief, an adherent simply cites internal preferences as a sort of trump card to end debate. He celebrates private ordering and pluralism as key to self-governance and community harmony absent unwarranted state coercion or government compulsion. “We can,” he avers, “lower the stakes of our public controversies, lower the temperature of our civic discourse, and avoid zero-sum contests over totalizing plans of action if we will simply allow the plural domains of society to do their work.” Such diversity recalls the Catholic doctrine of subsidiarity.

MacLeod’s urgent refrain-of-musts will echo in the thoughts and prayers of sensitive, conscientious readers: “If we are going to get anywhere in our discourse, then we must move beyond stereotypes and personal attacks. We must stop attributing to each other the worst motivations. We must instead seek to understand the reasonable, even admirable, motivations of people with whom we disagree.” This seems, and, I daresay, feels right.

And who knows? Maybe Salma Hayek, browsing her daily Google alerts, will discover her name in this very review, read The Age of Selfies, and then use her celebrity to promote practical reasoning about fundamental rights. A man can dream, anyway.

Students, Keep an Open Mind and Humble Heart in College

In Academia, Communication, Humanities, Pedagogy, Teaching on January 2, 2019 at 6:45 am

Why Universities Must Embrace Free Speech—Or Else

In Academia, America, Arts & Letters, Book Reviews, Books, Communication, Humanities, liberal arts, Liberalism, Pedagogy, Philosophy, Rhetoric & Communication, Scholarship on August 22, 2018 at 6:45 am

This review originally appeared here in The Federalist.

Keith E. Whittington, a professor of politics at Princeton University, calls his latest book, Speak Freely: Why Universities Must Defend Free Speech, a “reminder”—a term suggesting that we’ve forgotten something or that there’s something so important that we shouldn’t forget it. This something is the purpose of the modern university, which is, or should be, a refuge for open dialogue, rigorous debate, and the free exchange of ideas.

Safe spaces, trigger warnings, speaker disinvitations, speech zones, no-platforming, physical assaults against speakers—these are sure signs that some university cultures have become illiberal and intolerant, prioritizing indoctrination, orthodoxy, conformity, narrow-mindedness, censorship, and dogmatism over the unfettered pursuit of knowledge and wide dissemination of ideas.

Universities are not one-size-fits-all. The multiplicity of institutions of higher education in the United States, from community colleges to liberal-arts colleges to state flagship universities, makes generalizations about them impossible. Modern universities, however, are decidedly committed to research on the nineteenth-century German model. Whittington’s chief subject is this modern university, not religiously affiliated colleges guided by a core mission to spread and inspire doctrinal faith through formal education.

This is a very different model from, say, the distinctly Catholic university contemplated by Cardinal John Henry Newman in The Idea of a University, which is predicated on the belief that scientific and philosophical knowledge is intimately tied to the revealed truths of the church. Whittington’s key focus appears to be on those institutions classified as doctoral research universities by the Carnegie Classification of Institutions of Higher Education. The gravest problem at such institutions is their coercive restrictions on speech.

Newly Relevant Free Speech Concerns

“My concern here,” Whittington says, “is with a particular problem on college campuses that is not new but newly relevant,” namely that “we are in danger of giving up on the hard-won freedoms of critical inquiry that have been wrested from figures of authority over the course of a century.” An ascendant intolerance jeopardizes free speech at universities, which have as their principal objective the formation and transmission of knowledge that itself depends upon free speech and inquiry.

To cultivate a liberal atmosphere tolerant of diverse views, universities must make room for marginalized voices and controversial ideas, submit received customs and conventions to continuous and critical examination, and welcome good-faith arguments that challenge cherished cultural norms and undermine accepted wisdom. Only by subjecting their beliefs to sustained scrutiny may scholars sharpen and refine their claims and achieve mutual understanding. Only by protecting the speech of dissenters from the shaming and retaliation of those who hold majority or dominant views may universities nurture the empathy and humility necessary to maintain constructive, scholarly conversations.

“[T]he value of free speech,” submits Whittington, “is closely associated with the core commitments of the university itself. The failure to adequately foster an environment of free speech on campus represents a failure of the university to fully realize its own ideals and aspirations.” More than that, such failure “subverts the very rationale for having a university and hampers the ability of universities to achieve their most basic goals.” To value the university is to value the free speech that characterizes the university’s goal and function.

In four succinct chapters, Whittington maps the history of the modern American university, demonstrating how free speech is integral to its mission and indispensable to the search for knowledge and understanding. The Jeffersonians’ opposition to the Sedition Act, and John Stuart Mill’s case against compelled silence in On Liberty, present seminal defenses of free expression that gave substance to the modern university’s commitment to vigorous deliberation and civil debate.

Universities Must Decide Where They Stand

Whittington shows that the free-speech ideal has always been contested on campus, its concrete manifestations differing from school to school and context to context. The tension, moreover, between protecting provocative speech and providing for student safety isn’t new. University administrators have long struggled to balance the promise of robust speech with the need for security in light of potentially violent backlash to offensive, incendiary utterances.

To those who abuse the system by inviting notorious speakers to campus to shout odious words that lack intellectual content and are meant only to shock and incite, Whittington offers this wisdom: “When we are making decisions about whom to invite to campus to speak, the goal should be neither to stack the deck with our closest allies nor to sprinkle in the most extreme provocateurs. The goal should be to make available to the campus community thoughtful representatives of serious ideas.”

The Charles Murrays of the world might enjoy more campus appearances, and more serious attention, if there were fewer speaking invitations to those grandstanding Milo Yiannopouloses, whose (typically) puerile messages and (typically) sophomoric style lack substantive intellectual content. Rather than Milo, why not invite one of the many conservative scholars who seek with sincerity and integrity to contribute to the sum of knowledge, but have been disenfranchised and dismissed by left-leaning faculty?

It’s not contradictory to celebrate free speech while urging restraint in selecting competent, well-meaning speakers. A dedication to pushing the limits of acceptable discourse is not, after all, the same as a dedication to learning the true and the good. Discerning the difference, however, is a task for the informed audience, not the campus censors. Suppressing foolish and fallacious ideas deprives students of the opportunity to learn what constitutes foolishness and fallaciousness.

Universities must choose: “They must decide whether they are committed to a joint project of learning and the principles and practices that make learning possible. If universities are to operate at the outer boundaries of our state of knowledge and to push those boundaries further outward, they must be places where new, unorthodox, controversial, and disturbing ideas can be raised and scrutinized.”

If universities cannot be counted on to expand the frontiers of knowledge, who or what will? This weighty question should cut across partisan lines and ideological camps and unite those of disparate backgrounds in a common cause: that of human progress and achievement.

A Conversation Between Terry Eagleton and Roger Scruton

In Academia, Arts & Letters, Books, Britain, British Literature, Communication, Conservatism, Creativity, Fiction, History, Humanities, Liberalism, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Politics, Rhetoric, Rhetoric & Communication, Scholarship, The Academy, Western Civilization on September 21, 2016 at 6:45 am

In 2012, the Royal Institution of Great Britain hosted Terry Eagleton and Roger Scruton for an evening of conversation and debate. Here is the footage of that event:

Is Hacking the Future of Scholarship?

In Arts & Letters, Books, Communication, Ethics, Historicism, History, Humanities, Information Design, Property, Scholarship on November 19, 2014 at 8:45 am


This piece originally appeared here in Pacific Standard in 2013.

Most attorneys are familiar with e-discovery, a method for obtaining computer and electronic information during litigation. E-discovery has been around a long time. It has grown more complex and controversial, however, with the rise of new technologies and the growing awareness that just about anything you do online or with your devices can be made available to the public. Emails, search histories, voicemails, instant messages, text messages, call history, music playlists, private Facebook conversations (not just wall posts)—if relevant to a lawsuit, these and other forms of latent evidence, for better or worse, can be exposed, even if you think they’ve been hidden or discarded.

Anyone who has conducted or been involved with e-discovery realizes how much personal, privileged, and confidential information is stored on our devices. When you “delete” files and documents from your computer, they do not go away. They remain embedded in the hard drive; they may become difficult to find, but they’re there. Odds are, someone can access them. Even encrypted files can be traced back to the very encryption keys that created them.

E-discovery has been used to uncover registries and cache data showing that murderers had been planning their crimes, spouses had been cheating, perverts had been downloading illegal images, and employees had been stealing or compromising sensitive company data or destroying intellectual property. Computer forensics was even used to reveal medical documents from Dr. Conrad Murray’s computer during the so-called “Michael Jackson death trial.”


Computer forensics can teach you a lot about a person: the websites he visits, the people he chats with, the rough drafts he abandons, the videos he watches, the advertisements he clicks, the magazines he reads, the news networks he prefers, the places he shops, the profiles he views, the songs he listens to, and so on. It is fair to say that given a laptop hard drive, a forensic expert could nearly piece together an individual’s personality and perhaps come to know more about that person—secret fetishes, guilty pleasures, and criminal activities—than his friends and family do.

In light of this potential access to people’s most private activities, one wonders how long it will be until academics turn to computer forensics for research purposes. This is already being done in scientific and technology fields, which is not surprising because the subject matter is the machine and not the human. But imagine what it would mean for the humanities. If Jefferson had used a computer, perhaps we would know the details of his relationship with Sally Hemings. If we could get ahold of Shakespeare’s iPad, we could learn whether he wrote all those plays by himself. By analyzing da Vinci’s browsing history, we might know which images he studied and which people he interacted with before and during his work on the Mona Lisa—and thus might discover her identity.

There are, of course, government safeguards in place to prevent the abuse of, and unauthorized access to, computer and electronic data: the Wiretap Act, the Pen Registers and Trap and Trace Devices Statute, and the Stored Wire and Electronic Communications Act come to mind. Not just anyone can access everything on another person’s computer, at least not without some form of authorization. But what if researchers could obtain authorization to mine computer and electronic data for the personal and sensitive information of historical figures? What if computer forensics could be used in controlled settings and with the consent of the individual whose electronic data are being analyzed?

Consent, to me, is crucial: It is not controversial to turn up information on a person if he voluntarily authorized you to go snooping, never mind that you might turn up something he did not expect you to find. But under what circumstances could computer forensics be employed on a non-consensual basis? And what sort of integrity does computer or electronic information require and deserve? Is extracting data from a person’s laptop akin to drilling through a precious fresco to search for lost paintings, to excavating tombs for evidence that might challenge the foundations of organized religion and modern civilization, or to exhuming the bodies of dead presidents? Surely not. But why not?

We have been combing through letters by our dead predecessors for some time. Even these, however, were meant for transmission and had, to that end, intended audiences. E-discovery, by contrast, provides access to things never meant to be received, let alone preserved or recorded. It is the tool that comes closest to revealing what an individual actually thinks, not just what he says he thinks, or for that matter, how and why he says he thinks it. Imagine retracing the Internet browsing history of President Obama, Billy Graham, Kenneth Branagh, Martha Nussbaum, Salman Rushdie, Nancy Pelosi, Richard Dawkins, Toni Morrison, Ai Weiwei, or Harold Bloom. Imagine reading the private emails of Bruno Latour, Ron Paul, Pope Francis, Noam Chomsky, Lady Gaga, Roger Scruton, Paul Krugman, Justice Scalia, or Queen Elizabeth II. What would you find out about your favorite novelists, poets, musicians, politicians, theologians, academics, actors, pastors, judges, and playwrights if you could expose what they did when no one else was around, when no audience was anticipated, or when they believed that the details of their activity were limited to their person?

This is another reason why computer and electronic data mining is not like sifting through the notes and letters of a deceased person: having written the notes and letters, a person is aware of their content and can, before death, destroy or revise what might appear unseemly or counter to the legacy he wants to promote. Computer and electronic data, however, contain information that the person probably doesn’t know exists.

More information is good; it helps us to understand our universe and the people in it. The tracking and amassing of computer and electronic data are inevitable; the extent and details of their operation, however, cannot yet be known. We should embrace—although we don’t have to celebrate—the technologies that enable us to produce this wealth of knowledge previously unavailable to scholars, even if they mean, in the end, that our heroes, idols, and mentors are demystified, their flaws and prejudices and conceits brought to light.

The question is, when will we have crossed the line? How much snooping goes too far and breaches standards of decency and respect? It is one thing for a person to leave behind a will that says, in essence, “Here’s my computer. Do what you want with it. Find anything you can and tell your narrative however you wish.” It is quite another thing for a person never to consent to such a search and then to pass away and have his computer scanned for revealing or incriminating data.

It’s hard to say what crosses the line because it’s hard to know where the line should be drawn. As Justice Potter Stewart said of hard-core pornography, “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.” Once scholars begin—and the day is coming—hacking devices to find out more about influential people, the courts and the academic community will be faced with privacy decisions to make. We will have to ask if computer and electronic data are substantially similar to private correspondence such as letters, to balance the need for information with the desire for privacy, to define what information is “private” or “public,” and to honor the requests of those savvy enough to anticipate the consequences of this coming age of research.

Amid this ambiguity, one thing will be certain: Soon we can all join with Princess Margaret in proclaiming, “I have as much privacy as a goldfish in a bowl.” That is good and bad news.

Paul H. Fry on “Semiotics and Structuralism”

In Arts & Letters, Books, Communication, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Rhetoric, Scholarship, Semiotics, Teaching, The Academy, Western Philosophy, Writing on July 16, 2014 at 8:45 am

Below is the seventh installment in the lecture series on literary theory and criticism by Paul H. Fry.  The first six lectures are here, here, here, here, here, and here.

Allen Mendenhall Interviews Daniel J. Kornstein

In America, American History, Arts & Letters, Books, British Literature, Communication, Essays, Humanities, Literature, Oliver Wendell Holmes Jr., Politics, Rhetoric & Communication, Shakespeare, Writing on June 4, 2014 at 8:45 am

Daniel J. Kornstein

Daniel J. Kornstein is a senior partner at the law firm of Kornstein Veisz Wexler & Pollard, LLP, in New York City.  He earned his law degree from Yale Law School in 1973 and has served as the president of the Law and Humanities Institute.  He has authored several books including Loose Sallies, Something Else: More Shakespeare and the Law, Unlikely Muse, Kill All the Lawyers? Shakespeare’s Legal Appeal, Thinking under Fire, and The Music of the Laws.  His writing has appeared in The New York Times, Wall Street Journal, Chicago Tribune, Baltimore Sun, and the Boston Globe.  In 2002, Dan received the Prix du Palais Littéraire from the Law and Literature Society of France.  In 2013, King Michael of Romania awarded him the Order of the Crown of Romania.

AM: Thanks for taking the time to discuss your new book with me, Dan. The name of the book is Loose Sallies, and as you state in your introduction, it’s not about fast women named Sally. For those who haven’t read the introduction or purchased the book yet, could you begin by discussing the book generally and saying something in particular about your chosen genre, the essay?

DJK: Thank you, Allen, for this opportunity. Those of us who occasionally write are, as you know from your own experience, always delighted to have a chance to explain a bit about how and why we scribble. Loose Sallies is a collection of essays written over the past 25 years, mostly about topics of general interest. The first 75 pages are about the drafting of the U.S. Constitution in 1787 and why that remarkable process and its end result are still so important to us today. The rest of the book ranges over a wide variety of topics, from our precious civil liberties to profiles of some famous judges and lawyers to current controversies. It should, I hope, appeal to everyone.

AM: Phillip Lopate has said that the essay is a “diverting” type of literature and that its hallmark is intimacy. You call the essay “intimate, informal and reflective, as if you are sitting at home in your living room or dining room and having a pleasant, sometimes provocative, sometimes stimulating, but always, one hopes, insightful and enlightening conversation.” I agree. The essay is my favorite genre because it’s the genre of the person. You can’t know a person until you’ve met the persona he creates in his essays—and if you don’t write essays, you may not know yourself. Who are your favorite essayists, and what is it about their essays that you find compelling?

DJK: My favorite essayists are the obvious ones: Montaigne, Francis Bacon, Addison & Steele, Hazlitt, Lamb, Orwell, Mencken, Macaulay, Emerson, V. S. Pritchett, E. B. White, Lewis Thomas, George Will, Virginia Woolf, Edmund Wilson, and Joseph Wood Krutch. My favorite living essayists are Lopate and Joseph Epstein, the former editor of The American Scholar magazine. All these writers make their essays compelling by their clarity of thought, uniqueness of expression, and ability to communicate original, stimulating ideas, making us see familiar things in a new light. Epstein, for example, can write on literary personalities as well as personal topics we all think we know about but do not really. Everyone in my pantheon of great essayists is a superb writer with a distinctive and memorable style.

AM: I recently interviewed James Elkins, a law professor at West Virginia University, here on this site, and he talked about lawyer poets and said that “our iconic images of lawyer and of poet are put to the test when we think about one person writing poems and practicing law.” You have something to say about this seeming double life. “Writing,” you say, is “part of my double life. I have a life other than the lawyer’s life I lead on the surface. The two sides—law and writing—reinforce and complement each other.” I’ve heard the phrase “the two worlds” problem used to describe the lawyer who is also a writer. But this doesn’t seem to be a problem for you, does it?

DJK: A lawyer IS a writer. Writing is most of what a lawyer does. To be a good lawyer, one needs to be a good writer. Verbal facility, sensibility to language, and lucid thinking are prerequisites for both. A legal brief and a piece of expository writing have much in common. Both have a point to make to persuade the reader. Both rely on effectively marshaling evidence to demonstrate the correctness of a particular perspective. The topics may differ, but the skill and technique are similar. The problem facing the lawyer-writer is more one of time and energy and desire than anything else. Law is a demanding profession, which means taking time off to do anything else cuts into one’s otherwise free moments. But if you want to write, you make the time.

AM: I’m curious, when did your love of literature begin? Did you have an “aha!” moment, or did the love evolve over time?

DJK: I cannot recall ever not loving literature. My paternal grandfather was a printer at Scribner’s and when I was a little boy he gave me four books by Robert Louis Stevenson that my grandfather had himself set in type in 1907. I gave Treasure Island to my son and Kidnapped to my daughter, and still have the other precious two volumes on my shelves.

I remember my father taking me as a youngster to the Public Library at Fifth Avenue and 42nd Street to get my first library card. In those days, the main building had a circulation department, and my father’s choice for my first library book was, of course, Tom Sawyer, a good choice for a ten-year-old boy.

I remember as a teenager reading as much as I could in addition to books assigned in school. There were nights spent, in classic fashion, with a flashlight under the covers after bed time.

Inspiring teachers helped too.

AM: You’ve written a lot on Shakespeare. How did your fascination with him come about?

DJK: Like most people, I first met Shakespeare in high school English classes. Luckily for me, around the same time New York had a summer program of free Shakespeare in Central Park, which continues to this day. Starting in the summer of my junior year in high school — 1963 — I began to see two of Shakespeare’s plays every summer. It was at one of those performances — Measure for Measure in 1985 — that the passion grabbed me. I was 37 years old and had been practicing law for 12 years. As I sat watching Measure for Measure, I realized for the first time how much the play was about law, and that recognition — the “fascination” you refer to — set me off on a project that would last years. First, I wrote a short essay about Measure for Measure for the New York Law Journal, our daily legal newspaper. Then, months later, I saw a production of The Merchant of Venice and wrote another essay. From there, one thing led to another, and before long, I had the makings of a book.

I reread the plays I had read as a student and read many others for the first time. Then I read as much as I could find about Shakespeare and the law. The result was my 1994 book called Kill All The Lawyers? Shakespeare’s Legal Appeal.

I am still fascinated by Shakespeare. Each time I read or see one of his great plays, I get something new out of it.

AM: Many essays in Loose Sallies concern politics, law, government, and current events. You discuss the Founders, Holmes, Bill Clinton, Hugo Black, Steve Jobs, Ayn Rand—all sorts of people and even some decisions of the U.S. Supreme Court. You manage to do so without coming across as overtly political, polemical, or tendentious. How and why?

DJK: It is a question of style and goal. Every one of the essays has a thesis, some of which may even be controversial. The idea is to persuade your reader to accept your thesis, and that requires care and sensitivity, logic and demonstration, not name-calling or verbal table-pounding. If I am “overtly political, polemical or tendentious,” I will probably not convince anyone who does not already agree with me. A writer has to be smoother and subtler. We live in a country right now riven by political and cultural partisanship. Public controversy today between “red” and “blue” is almost always shrill. A reader tires of it; it becomes almost an assault on our sensibilities. To reach people’s hearts and minds, you have to credit both sides of an issue but explain patiently and show convincingly why you think one side is more correct than another. I am not running for public office so I have no “base” to appeal to. But I can at least try to keep the tone of the debates I engage in civil and pleasant.

AM: Do you consider the essays on these topics literary essays?

DJK: Most of the essays in Loose Sallies are not about so-called “literary” topics. True, one is about the literary style of Supreme Court opinions, and two discuss Justice Holmes’s opinion-writing style. But they are exceptions. So I do not think the essays for the most part are “literary” in that narrow sense. Nor do I think they are “literary” by way of being precious or mannered. I genuinely hope, however, that they are “literary” in the sense of being clear, crisp, well-written statements on a variety of topics of interest to all Americans today.

AM: Thank you for taking the time to do this interview. Loose Sallies has been enjoyable for me. I keep it on my desk in the office so that, when I need a ten-minute break, I can open it and read an essay. I slowly made my way through the entire book in this manner: a break here, a break there, and then, one day, I was finished. I really appreciate all that you have done not just for the law, but for arts and literature. It’s nice to know there are lawyers out there like you.

Paul H. Fry’s “The New Criticism and Other Western Formalisms”

In Academia, American History, American Literature, Arts & Letters, Books, Communication, History, Humanities, Literary Theory & Criticism, Literature, Pedagogy, Philosophy, Poetry, Rhetoric, Scholarship, The Academy, Western Civilization, Western Philosophy, Writing on May 28, 2014 at 8:45 am

Below is the sixth installment in the lecture series on literary theory and criticism by Paul H. Fry.  The first five lectures are here, here, here, here, and here.

Outposts of Culture: Gerald Russello Reviews Jason Harding’s The Criterion

In Academia, Arts & Letters, Book Reviews, Books, Britain, British Literature, Communication, Essays, History, Humanities, Literary Theory & Criticism, Literature, Scholarship, Writing on April 2, 2014 at 8:45 am
Gerald Russello
 
Gerald Russello practices law in New York and edits The University Bookman. He is the author of The Postmodern Imagination of Russell Kirk (University of Missouri Press, 2007).  His articles, essays, and reviews have appeared in National Review, The New Criterion, Crisis Magazine, The American Conservative, Chronicles, The Imaginative Conservative, The American Spectator, City Journal, The Intercollegiate Review, Modern Age, First Things, and many other publications.
 
This review originally appeared here in The University Bookman in 2003.  It is republished here with the express permission of The University Bookman.  The book under review is Jason Harding’s The Criterion: Cultural Politics and Periodical Networks in Inter-War Britain (New York, New York: Oxford University Press, 2002).

 

In the final issue of the Criterion, which appeared in January 1939, T. S. Eliot wrote that “continuity of culture” was the primary responsibility of “the small and obscure papers and reviews.” It was they that would “keep critical thought alive” amidst troubled times. And so it has been, for a century and more. The vitality of the “little magazines” is one of the strongest indicators of a culture’s intellectual level. These journals, typically of small circulation and little revenue, serve a crucial function as the medium for the transmission of ideas among scholars, elites, and the larger population. It is perhaps a sign of our times that so many of our Masters of the Universe choose to endow business schools or fund independent films rather than to support the written word. Many of the journals themselves, unfortunately, have become so obscure and inward-minded that they may no longer be worth the trouble.

The British aptitude for starting small associations of like-minded folk was well expressed by the profusion of little magazines, especially in the nineteenth and early twentieth centuries. This proclivity was to bear further fruit across the Atlantic, where Americans followed the British model. Up until the Second World War, America had a thriving culture of little magazines. That tradition survives, in a somewhat anemic form, in the independent so-called “zines” that clutter the bookshops of progressive enclaves like Manhattan or Berkeley. There have been two recent examples of the differing fates of such journals here in the United States. Lingua Franca was an energetic journal devoted to academic life, which it chronicled in a sharp, intelligent style. After less than four years it went bankrupt and ceased publication, only to be partially revived in an Internet incarnation after being acquired by the Chronicle of Higher Education. On the other end of the scale is Poetry, which recently received a gift of $100 million from a philanthropist whose own poems it had rejected. The gift instantly made the small journal one of the best-endowed cultural institutions in the country.

The Criterion was perhaps the most important of the journals of the last century. The first issue, which appeared in October 1922 and contained (without epigraph or notes) Eliot’s poem The Waste Land, changed Western intellectual life, and it continues to define what an intellectual journal should be. However, study of the Criterion has been subsumed by the focus on Eliot’s development as a poet and thinker. The larger cultural importance of the journal has received insufficient attention. That has now changed. From such an improbable place as the department of foreign languages and literature in Feng Chia University in Taiwan, where Jason Harding is assistant director, comes The Criterion: Cultural Politics and Periodical Networks in Inter-War Britain. It is a work of polished scholarship on the role of the Criterion in British intellectual life.

Harding divides his analysis into three parts. Part I, “Cultural Networks,” deals with the Criterion as one of a number of small intellectual periodicals, such as the Adelphi and New Verse, which appeared in this period. The second section, titled “The Politics of Book Reviewing,” focuses on a number of regular Criterion contributors and their relationship with, and treatment by, Eliot as their editor. The chapters include studies of almost forgotten figures like Bonamy Dobrée and Montgomery Belgion as well as better-known figures such as John Maynard Keynes and the difficult but brilliant Ezra Pound. Harding shows that, while Eliot directed and organized every aspect of the journal, each of the contributors played his own part in establishing the Criterion’s preeminent position.

The final section, “Cultural Politics,” focuses on the purpose of the Criterion as Eliot came to see it in the dark days of the 1930s. As the influence of the journal increased, it became known not only as a showcase of modernism but also as a conduit for what Eliot called “the mind of all Europe” and a defense of the West. The author discusses Eliot’s attempts to persuade major Continental intellectual figures such as Ernst Robert Curtius to contribute to the journal, and his efforts consistently to review foreign periodicals for his British readership.

Harding presents a complex cultural picture in service of his goal of establishing the Criterion as part of “an ongoing cultural conversation, most immediately a dialogue with a shifting set of interlocking periodical structures and networks.” Eliot, as an editor, had to deal not only with his rival journals, but also with his sensitive patron, Lady Rothermere. There were also those occasionally truculent contributors, such as Wyndham Lewis or D. H. Lawrence, who sometimes abandoned the Criterion for other, better-paying reviews.

Among a number of fascinating episodes, Harding recounts here the controversy over classicism and romanticism between Eliot and John Middleton Murry, founder of the Adelphi. Murry launched the first salvo in 1923, claiming that there was no tradition of classicism in England. Although not the subject of the attack, Eliot felt obliged to respond and published in the Criterion the following month his famous defense of classicism, “The Function of Criticism.” Murry and Eliot were to have a limited rematch at the end of the decade over the humanism of Irving Babbitt. Other scholars have examined the substantive merits of their respective positions. Harding’s purpose is rather to show that the literary rivalries among serious journals spurred Eliot, as a writer and editor, to set out his critical and literary vision. They necessarily shaped the kind of journal Eliot was creating.

In his final sections, Harding examines the evidence for Eliot’s supposed anti-Semitic or fascist sympathies and finds it wanting. Under Eliot’s editorship, several writers documented the rise of Nazism in Germany, and the final issue contained a condemnation of Nazi racial theories. Harding concludes: “Given the Criterion’s record on these matters, it is remarkable that recent critics have stigmatized the journal by suggesting that Eliot was sympathetic to the aims and methods of Nazism.” Harding realizes that Eliot’s conversion to Anglicanism and his efforts to “stitch together into some kind of unity the Latin-Christian elements of the otherwise diverse cultures of Western Europe” meant his rejection of the Nazi regime. And even though Eliot was somewhat sympathetic to fascism, that sympathy, as Harding demonstrates, was attenuated and did not cause him to suppress other viewpoints in the Criterion.

Drawing on a wealth of previously unexamined materials and private collections, Harding expands upon our knowledge of Eliot as a major twentieth-century figure. His careful research adds a new dimension not only to Eliot as a thinker and editor, but also to the entire period of British literary journalism.
