
Historians and irony, Part I

We historians have a long, intense, up-and-down relationship with irony, the kind that merits an “it’s complicated” tag. We argue with irony, shout, try going our own separate way– but the final break never comes, and eventually we and irony always wind up back in bed together. Like all stormy relationships, it’s worth some serious thought.  (Note for extra reading:  like pretty much any other historian who discusses irony, I’ve been hugely influenced by the great historian/critic Hayden White— when you have the time, check out his writing.)

Now, historians’ irony doesn’t quite track our standard contemporary uses of the word. It’s not about cliché hipsters saying things they don’t really mean, or about unexpected juxtapositions, like running into your ex at an awkward moment.

No, we historians go for the heavy-hitting version, as developed by the Ancient Greeks and exemplified by their ironist-in-chief Oedipus Rex. In the Greek play, you’ll remember, he’s a respected authority figure hot on the trail of a vicious killer– only to discover that he himself did the terrible deed, plus some other terrible deeds nobody even imagined. Like most of the Greek tragic stars, he thinks he’s in charge but really he’s clueless.

You can see how that kind of irony appeals to historians. After all, we spend a lot of our time studying people who misjudged their command of events– and anyway, we know the long-term story, how events played out after the instigators died. Most of the leaders who got Europe into World War I thought it would last a few weeks and benefit their countries. By 1918 four of the big player-states had been obliterated, and the ricochet damage was only beginning– Stalin, Hitler, the Great Depression, the atomic bomb, and a whole trail of other bad news can all be traced back to 1914.

That’s why our relationship to irony never makes it all the way to the divorce court. It’s basic to what we do.

But there are other sides to the relationship, and that’s where the shouting starts. We historians don’t just confront people’s ignorance of long-term consequences. There’s also the possibility they don’t understand what they’re doing while they’re doing it. That possibility takes lots of forms, and we encounter them in daily life as well as in the history books. There’s the psychological version, as when we explain tough-guy behavior (whether by a seventeenth-century king or twenty-first-century racists) in terms of childhood trauma or crises of masculinity. There’s the financial self-interest version, as when we believe political leaders subconsciously tailor their policies to their career needs.

And then there are the vast impersonal forces versions, what we might call ultra-irony, where historians see individuals as powerless against big processes of social change. That’s how the Russian novelist Leo Tolstoy described the Napoleonic wars, and how the French philosopher Alexis de Tocqueville described the advance of democracy— efforts to stop it just helped speed it up. Marxist and semi-Marxist historians have seen something similar in the great western revolutions. Those fighting tyrannical kings in 1640, 1776, and 1789 didn’t think they were helping establish global capitalism– many hated the whole idea of capitalism– but their policies had that effect all the same.

You can see why historians have such a fraught, high-voltage relationship with ultra-irony interpretations like these. On the one hand, sure– we all know that many social forces are bigger than we are; we laugh at those who try to stop new technologies or restore Victorian sex habits; we know we’re born into socio-cultural systems and can’t just opt out of them.

On the other hand, historical practice rests on evidence, documentation– and where do we find some president or union leader telling us he did it all because his childhood sucked? How do we document vast impersonal forces? Ironic interpretations require pushy readings of the documents– speculation, going beyond what the evidence tells us, inserting our own interpretive frameworks. Nothing makes us historians more jumpy.

There’s a deeper problem as well: interpretations like these diminish human dignity, by telling us that people in the past didn’t know what they were doing or even what they wanted to do. If we accept these interpretations, we deny agency to historical actors, belittle their ideas, dreams, and efforts, mock their honesty and intelligence. We dehumanize history– the human actors are the pawns, the vast impersonal forces run the game.

Those are serious criticisms, and they’ve been around since the nineteenth century.

But the interesting thing is, their persuasive force rises and falls over time. You’ll have a whole generation of historians who find ultra-irony persuasive and helpful; it feels right, and it seems to open up exciting new research questions. Then the tide shifts, and historians become more concerned with agency. They listen closely to historical actors’ own views of who they were and what they were doing.

By and large, the mid-twentieth century fell into Phase 1 of this cycle– it was a time when historians saw irony everywhere and paid lots of attention to big impersonal forces. Marxism was riding high, but so also were the other -isms: Freud-influenced historians saw unconscious drives pushing people to act as they did; Weberians saw the experience of modernization behind political and religious movements. “Underlying causes” were big, and we viewed participants’ own accounts with suspicion– we assumed they didn’t understand their own motives or circumstances.

But that changed in the 1970s, and for the past thirty years we’ve been deep in Phase 2, the no-irony phase. We’re concerned with taking historical actors seriously and with avoiding what a great Marxist historian called “the enormous condescension of posterity.” We believe in “agency”– meaning, from the top to the bottom of the social scale, people can help shape their own destinies.

What does it all mean? I have a few thoughts, but I’ll wait until the next post to lay them out– stay tuned!


Junior oligarchs in America: the Samantha Power case

My last post talked about some ways that power seeps into American intellectual life. It’s a complicated process, with lots of moving parts. But (I argued) it’s unwise to ignore any of those parts, even the ones that look like decorative frills. Ballet companies and English departments are remote from the world of policy think tanks and political candidacies– but funding the former helps the latter function effectively, by creating new spheres of influence and new sympathies in once-critical audiences.

Now I want to consider another example, which has more to say about the receivers of culture patronage. It’s the case of Samantha Power, currently US ambassador to the United Nations and the subject of a lengthy, fascinating New Yorker profile. It’s a case worth thinking about because Power so perfectly embodies the top echelons of American intellectual life. She has two Ivy League degrees, from Yale and Harvard, and she’s a professor at Harvard’s Kennedy School of Government; she writes a lot, three books over the last decade, one a Pulitzer Prize winner, plus a stream of articles and occasional pieces; her husband Cass Sunstein belongs to the same world– he’s an ultra-prominent Harvard Law School professor.

So Power gives us one glimpse into American intellectual life at the center, where it has the greatest potential leverage. As you’ve probably noticed, it’s twenty-seven years and counting since we had a president without at least one degree from either Harvard or Yale; like Samantha Power, George W. Bush has two.

Of course the New Yorker article has lots to say about Power’s intellectual gifts and her capacity for nonstop hard work, but it also says lots about her personal charm; she’s tall, athletic, dresses well, and apparently enjoys the galvanic effect she has on many people she meets. Sure, all these details are there partly because of journalistic sexism (it’s hard to imagine Cass Sunstein getting quite the same appearance report card), but I don’t think that’s the whole story — charm takes multiple forms, and it seems to be a real component of intellectual life at the level Power inhabits. After all, most private universities in the US select for personal qualities as well as for grades and test scores; and as the most selective of the bunch, the Ivies can put extra weight on that side of the application process.

Certainly Power’s charm isn’t just a matter of good looks and athleticism. It also includes a warm, healthy, happy home life. The profile describes at length her attachments to her parents and her story-book wedding; and it describes her husband and kids watching from the gallery as she undergoes her pre-confirmation Senate grilling. Her family’s warm support is a part of what she’s giving us.

Reading all this reminded me of two brief observations about this top-echelon world that I recently encountered– both of them from outsiders, both off-hand remarks, yet both sharp enough that they’ve stayed with me.   One’s from an anonymous blog commenter, apparently a non-Harvard philosophy graduate student or youngish professor (all he tells us is his gender), describing a chance encounter with some apparently Harvard-connected young scholars– whom he sums up as “these happy, wholesome, self-confident, new-vanguard, shiny people from great schools going great places together.”

The other observation comes from the science journalist Daniel Bergner, toward the end of his wonderful book about current research on women’s sexuality. The research he describes includes some pretty out-there experiments, testing physiological responses to porn images, for instance, and it generates some out-there conclusions– for example, that women by nature are just as non-monogamous, sexual, and potentially crazy as men. Toward the end of his inquiries, Bergner realizes that the scientists he’s been tracking tend to work at non-top-echelon institutions; and finally he asks one of them “why I never found myself phoning the psychology departments of Harvard or Yale or Princeton, why I never spent time with their professors, why so few of America’s most elite universities devoted any attention to her field.” The researcher replies that the Ivies just don’t do this kind of thing. Exploring sex in these ways is too weird, too potentially upsetting; it may contribute to happier homes, but it’s just as likely to blow up the whole concept.

All this to say, it’s not just Samantha Power. A certain vision of American-style middle-of-the-road happiness, health, and flourishing seems built into the intellectual world around her. A starker version of that commitment also comes out at the confirmation hearings, when senators ask about some of her early writings, in which she’d occasionally dissed the US. Of course she vigorously backtracks, calling America “the greatest country on earth” and “the most powerful country in the history of the world. Also, the most inspirational;” and she emphasizes that she “would never apologize for America.” The senators are impressed, and Power is confirmed.

Like any political encounter, Power’s confirmation hearing lends itself to multiple interpretations. We can read it as an ambitious political actor doing what’s necessary to reach a position of authority, where she can do some good.  Or we can read it as a textbook demonstration of how the real boundaries on American political discourse get set– namely, by those holding the real political cards, not by Harvard professors.

Or we can conclude that Power’s walk-back expresses beliefs she actually holds. Probably back home in Cambridge she wouldn’t use quite this Red State rhetoric. But the policies she’s advocated throughout her career– first as an intellectual, now as a participant– presuppose more or less this stance: from the outset, she’s endorsed American military intervention in troubled societies, on the assumption that it’s an inspirational force for good. That faith apparently remains unshaken by the disastrous results of our interventions in Afghanistan, Iraq, and Libya– the last a project that Power herself helped design and continues to endorse.

My point is, we don’t need to look for a George Soros or a Koch brother pulling the strings in cases like this. Samantha Power shows us the deeper micro-processes in American intellectual life, processes that attach brilliant young people– “happy, wholesome, self-confident, new-vanguard, shiny people”–to the twenty-first-century American project.

Patrons of the arts

“When bankers get together for dinner, they discuss Art. When artists get together for dinner, they discuss Money.” That’s the Irish playwright Oscar Wilde, speaking to us from around 1900. That was during the world’s previous great Gilded Age, and now that we’re deep into a new one, we humanists need to pay attention, even here in non-artistic, non-imagination-centric corners like the History Department. That’s because Wilde raises one of the basic questions we should be asking about ourselves: what’s our relationship to money and the powerful people who have it?

In wisecrack format, Wilde sums up one of the classic answers. Rich people love the arts, and artists (or historians, or philosophers– you get the idea) need money. Usually we can’t get that money selling our wares to ordinary people, since they have other needs to cover first, like housing, clothing, and food. Like it or not, we’re in a luxury business, selling expensive, delightful add-ons that make life better but aren’t needed to keep it going. We have to sell to the same folks who buy the other luxury products.

Of course the selling is more direct in the art world. Rich patrons interact directly with artists, and sometimes they tell the artist what to produce– a portrait of the kids, a design for a new home, a new opera. In academia, there are intermediaries. Donors give their money to institutions, which then dole it out to individual professors and researchers according to the institutions’ own guidelines and standards. But it’s basically the same process, rich people paying for cultural production.

Usually that doesn’t mean bad art or ideas, au contraire. Many of the rich have had good educations, and anyway they don’t have to care what other people think– they can make the adventurous calls, not just the safe ones. The Rockefellers created New York’s Museum of Modern Art back when modern art seemed crazy and dangerous, and that openness to the new has been a standard pattern since the Renaissance. Check out the seventeenth-century painter Caravaggio for an extreme example. He was gay, violent, and young, and he painted sacred scenes in wild new ways– but he received huge support from all sorts of Catholic big-shots.

But it seems that push always eventually comes to shove, and then the dark sides of artistic/intellectual patronage come into view. I’ve written here already about the case of Steven Salaita, whose appointment the University of Illinois overturned after wealthy donors complained about some of his tweets. And you’ve probably heard about the billionaire Koch brothers, sophisticated and generous patrons of the New York City Ballet and other cultural institutions, who’ve also donated tens of millions to various universities– but with strings attached: in return for the money, at least in one case, they’ve demanded that the university teach ideas congenial to them, and they may have demanded a say in the faculty appointment process.

In other words, the rich aren’t just buying aesthetic pleasure– they’re also investing their money, and like all investors they expect a return.

Now there’s a new example to ponder, more disturbing in that it concerns an especially sympathetic figure– the hedge-fund manager George Soros. Unlike most of the other modern billionaires, Soros has genuine intellectual credibility– before he got rich he did a real PhD, in a hard-core humanities discipline, and he’s used his money to support various admirable causes. He’s even helped create a whole new university in his native Hungary, devoted primarily to the humanities and social sciences.

So to a humanities professor like me, Soros is a good guy, exemplifying the best sides of cultural patronage.

But now Soros has joined the war-pushing business that’s so popular these days, calling for tougher European action in the Ukraine: “Europe is facing a challenge from Russia to its very existence,” he tells us; “the argument that has prevailed in both Europe and the United States is that Putin is no Hitler,” but “these are false hopes derived from a false argument with no factual evidence to support it;” all European resources “ought to be put to work in the war effort,” because “in the absence of unified resistance it is unrealistic to expect that Putin will stop pushing beyond Ukraine when the division of Europe and its domination by Russia is in sight.”

Whatever you may think about the Ukraine situation, there’s a lot here to weird you out. There’s the casual talk of going to war, as if launching a serious European war wouldn’t be one of the all-time human disasters. There’s the full-court demonization of our enemies, as monsters with whom it would be folly–“unrealistic”– to negotiate. We’ve had fifteen years of this kind of rhetoric– has it produced anything but disasters?

And then there’s the strange venue that Soros selected for his call to arms: the New York Review of Books. Most of those who encounter this site will know all about the New York Review, but in case you don’t, it’s the publication that pretty much encapsulates humanities department thinking in the US. Every two weeks, it offers extended reviews of academic books, along with one or two pieces of sophisticated political commentary; professors write most of these, but they write with educated-outsider readers in mind. So people like me read it to learn about the new trends in English or Art History, or about debates on the origins of the American Revolution– it’s a way to get up to at least amateur speed on interesting topics without doing the heavy reading yourself. It’s a virtual coffee house for academics, a place where we all meet up.

Which raises the question, why is a call for European leaders to get tough appearing there, rather than in the Frankfurter Allgemeine Zeitung, Le Monde, or the New York Times?

Now, I have no clue what Soros has in mind with the substance of his warfare talk. Maybe he actually believes the comic book, super-villain-on-the-loose worldview he’s pushing, or maybe he has some money-making irons in the Ukraine fire (iron Maidans, as it were…), or maybe some mix of the two– who knows?

But we can do better guessing about that last question, the why-the-New York Review question. Because whatever else is going on, Soros is broadcasting to an audience made up mostly of us humanities professors and various humanities-adjacent types; he apparently wants us along on his foreign policy crusade. It’s the classic good news/bad news story. The good news: our collective opinion seems to matter in legitimating an enterprise of this kind, perhaps more than most of us realize. We’re worth courting. The bad news: when it comes to cultural patronage, the good guys like Soros give as much thought as anyone else to the returns their investments will bring.

Listening to Trollope

I’m a big fan of the mid-Victorian novelist Anthony Trollope.

For those who haven’t encountered him, Trollope combines in one package just about everything you might have heard about nineteenth-century English fiction. His novels are long, with dozens of characters, most of them prosperous, some of them super-rich. There’s a whole series just dealing with squabbles among Church of England clerics, and another series about a very rich duke. Typically the plots end in marriages, often between relatively poor young women and rich young men. There’s even fox-hunting, described in loving detail– Trollope himself rode and hunted.

So it’s not self-evident that Trollope would grab the attention of a leftist atheist rust-belt professor like me.

He does, though– partly because he’s among the all-time great writing technicians, able to delineate characters, emotions, and scenes in just a few lines, and partly because his sympathies are so deep. He persuades us that we can learn as much about the human condition from Archdeacon Grantly, Marie Goesler, the dukes of Omnium, and their friends as from anyone else in literature. As a bonus, he’s also an inspiration to anyone who writes even part-time. Trollope wrote for three full hours every morning before starting his day job; if he finished a novel before the time was up, he started in on the next one.

But there’s another reason for thinking about Trollope these days, more particularly for thinking about those six Church of England novels. One of their central themes is an experience that we in the twenty-first century university are reliving– the travails of tradition-minded institutions in rapidly-changing societies. Trollope has a lot to teach us about that situation.

To Trollope’s credit, he doesn’t sugarcoat or simplify his lessons. His fictional Church of England mixes good and bad qualities, and so do the individual clergymen who work for it. They do lots of good, but they also bask in their social privileges, and few of them work very hard; some are greedy, and most have moments of envy and anger. Weird inequalities run through the Church, because so many of its practices were set up centuries earlier. As vicar of Framley, Mark Robarts gets a fine income and a beautiful house; as perpetual curate of Hogglestock, the more learned, pious, and hardworking Josiah Crawley can barely feed his family.

Anyone familiar with the American university today can see the parallels. Like Trollope’s Church of England, we have our rich and poor institutions, our underpaid adjuncts and overpaid stars, our vanities and trivial feuds. We do some good, but we don’t do it very efficiently. At times, we cling to medieval arrangements that don’t serve obvious functions today.

The parallels are just as obvious when it comes to the challenges that face Trollope’s Church. In all the Church of England novels, calls for reform loom in the background, and sometimes they hit the characters hard.   Just like our own university reformers, Trollope’s Church-reformers want to make the institution more useful to society. They’re troubled by its inefficiencies and failure to change with the times. They see privileged clergymen who are stuck in the past and don’t contribute to the real world. They see an institution that needs to bring its practices into line with its mission statements.

Sound familiar?

So one lesson from Trollope is that our contemporary debates about the university recycle old battles and old arguments. The talk we hear every day– about how the university needs to serve new social needs, deliver services more efficiently, modernize — is about 150 years old. That doesn’t necessarily make it false or irrelevant. But it does mean it’s not based only on observation and brave new thinking. It’s a trope, as they say in the English Department.

Trollope’s traditionalists don’t have good answers to the criticisms they face. Pretty much all they can come up with is some mix of “we’ve always done it this way,” “God wants it this way,” and “society will fall apart if we shake things up.” Nowadays, we professors mostly avoid the “God wants it this way” answer, but updated versions of the other two are still going strong.

But Trollope’s sympathies are with the traditionalists anyway, despite the feebleness of their self-defense, and over the course of the novels we readers come to look differently at them. He never fully explains this alternate view, and his characters seem unaware of it, but the point eventually comes through: in Trollope’s Church, ideas, preaching, and rituals matter less than the possibility the institution allows for living out certain values.

He illustrates that idea with two of his clergymen, the only 100-percent Christians he comes up with in the whole series of novels. They’re both more or less failures in real life. One is impossibly rigid and impractical, angering everyone he meets; the other is lovable but dithery and not very smart. Neither is much of a minister, though both of them try hard; neither would survive long without the sheltering protection of the Church. But their survival matters, because they walk the Christian walk, exemplifying for their friends what Christianity is really supposed to be. Without them, all we’d have would be the inert Christian texts, a series of empty formulas and content-free injunctions.

Values need to be lived, not just thought or believed in, Trollope tells us, and they can only be lived in the right conditions. Trollope’s Church provides those conditions.

Now, providing a safe space for saints isn’t a big concern for the twenty-first-century university. Over the years I’ve met one or two academic saints, but they’re not a key demographic in my world.

What is basic to my world is another version of Trollope’s encounter between inanimate texts and living human beings. For Trollope, the encounter was about living out the Bible’s demands. For us, it’s about keeping alive the texts and other forms of knowledge that make up civilization.

Of course, like the Bible, our civilizational texts will live on in the libraries and storage drives whether anyone looks at them or not. But they don’t really exist unless they’re read by actual human beings, living in the specific conditions of their own times. Read and grappled with in a serious way, meaning that someone has learned the languages and done all the other legwork that real understanding requires.

In different ways, this applies to every university discipline. Somebody has to do the heavy lifting– learn the technical vocabularies, dates, and names, practice arpeggios, perform experiments, study old manuscripts, travel to weird places.

Just like Trollope’s clerics, we knowers don’t see ourselves or what we’re doing very clearly, and we do a bad job of explaining ourselves to the outside world. When asked to explain all that reading, practicing, and experimenting, we talk about the importance of our “research,” the books we’re writing, the journal articles we’ve published. Then we’re left flat-footed when someone asks whether the world actually needs another new book on Shakespeare’s sonnets or the French nobility.  Why invest so much effort in a book that’s going to reach about six hundred readers?

Trollope points us to the real answer. We may not need the new books, and certainly the books themselves aren’t going to change the world. What we need are the years of studying and the strange kinds of knowledge that go into the books. We need people who walk the cultural walk, and we need quite a few of them if knowledge is going to survive in any meaningful way.

In our world, the university is the only safe haven for those people.

Disciplining the university, the Illinois edition

My last post talked about fear in the contemporary American university, as seen in the specific case of America’s Russia experts. Apparently many of them feel jumpy about expressing non-standard views, and that startled me.   Even before we get tenure, most of us professors enjoy a fair amount of security; anyway, I’d always assumed (naively, it turns out) that full-spectrum discussions were welcome in foreign policy matters, especially those involving nuclear weapons and exotic languages. So I offered a little speculation about what’s going on and where today’s jumpiness fits into the longer history of American intellectual conformity.

Now a new case of professorial fear is in the news, and it gives a more direct look at the mechanisms that can produce it. It’s the case of Steven Salaita, who’d been offered a tenured position at the University of Illinois Urbana-Champaign, only to have the university’s Chancellor and Board of Trustees yank the offer months after it had been accepted.

I won’t go into the details, which have been widely reported in the academic news– you can find the story explained here, and here’s an eloquent comment on it by the historian Natalie Zemon Davis– but the main issue centers on tweeting. In a series of tweets, Salaita expressed in strong language his feelings about Israeli actions in the Gaza Strip, and the Illinois administration decided that his harsh tone crossed a line. He’d been “disrespectful” toward others’ views, and therefore shouldn’t work in UIUC classrooms. It’s relevant to the story that these were just personal opinions– Salaita’s teaching and research have nothing to do with Israel, Islam, Judaism, or Palestine (he’s a specialist in Native American studies). It’s also relevant that big donors and the university’s fundraising office sought to influence the administration’s decision in the matter (as shown here). Whether or not their threats counted, the donors gave it their best shot.

So the minimum Illinois story is that an academic can get into serious trouble for expressing personal opinions, and that university administrators face serious pressure from donors to ensure intellectual conformity. That minimum story is bad enough, and it’s hard not to suspect that there’s worse behind it. The university administration insists it’s not pushing specific views of Israeli policy, and that its only concern is the tone in which dissenting ideas are expressed– but would they have dumped Salaita if he’d said mean things about Vladimir Putin or Urban Meyer?

You don’t need to pursue those suspicions, though, to see the real menace here, namely the civility standard itself. It’s a trap, a way for powerful institutions to enforce intellectual discipline while pretending to encourage discussion. John Stuart Mill explained it all 150 years ago. Mill pointed out that we can’t set the boundaries of “temperate” and “fair” debate; “if the test be offence to those whose opinion is attacked, I think experience testifies that this offence is given whenever the attack is telling and powerful, and that every opponent who pushes them hard, and whom they find it difficult to answer, appears to them, if he shows any strong feeling on the subject, an intemperate opponent.”

In other words, real disagreement is going to include harsh language and hurt feelings, and the cult of civility is a way of preventing disagreement from getting too real.

The Salaita case shows that mechanism of repression working at full steam.

The intellectual in America: an example

Here’s an example of the intellectual’s situation in contemporary America, courtesy of the American historian of Russia Stephen F. Cohen. Cohen is about as well-established a figure as could be imagined (some details here). After thirty years as a professor at Princeton, he now teaches at NYU; he’s published lots of scholarly books and received important honors; he seems to have plenty of money and reasonable access to the media. If anyone should feel secure about expressing opinions, it’s people like Cohen.

So it’s a shocker when he tells us about self-censorship within his well-informed, well-protected milieu. Cohen’s particular concern is American policy toward Russia, an issue on which he has spoken eloquently and courageously, but the details here matter less than the intellectual climate that he describes.

In that climate, he tells us, “some people who privately share our concerns—in the media, universities and think tanks—do not speak out at all. For whatever reason—concern about being stigmatized, about their career, personal disposition—they are silent.” As for young scholars, those “who have more to lose,” Cohen himself urges silence. He reports telling junior colleagues that “‘American dissent in regard to Russia could adversely affect your career. At this stage of life, your first obligation is to your family and thus to your career. Your time to fight lies ahead.’”

This is a seriously depressing account, because Cohen isn’t even talking about outsider radicals, ethnic leaders, or potential “extremists,” the kind of people that the New York City Police Department might put under surveillance (see here and here for examples). He’s only discussing well-trained experts like himself, who work for well-defended, rich institutions. His friends have connections, their opinions fall within the spectrum of American common sense, and the subjects they study have major-league practical relevance. After all, we really don’t want to screw up our relations with another heavily-armed nuclear power. We want to get the story straight, and critical debate contributes to doing that.

Yet fear reigns even in this corner of the academic arcadia. At a minimum, Cohen tells us, university professors wait for tenure before expressing an opinion; until then, they shut up. Many of their elders apparently continue shutting up even after the immediate pressure eases, whether because there are still career steps to climb or for more personal reasons.

In some ways, of course, this is just an updated version of an observation that Alexis de Tocqueville made long ago. “I know of no country,” Tocqueville reported in Democracy in America, “where there prevails, in general, less independence of mind and less true freedom of discussion than in America…. In America, the majority draws a formidable ring around thought. Within those limits, the writer is free; but woe to him if he dares to go outside it.”

But there’s also something more sinister in the story that Cohen tells. Tocqueville believed that American democracy explained the problems he detected. “The tyranny of the majority” (he invented the phrase) ensured that non-conforming opinions wouldn’t be heard, because Americans (metaphorically) voted on their ideas just as they (really) voted on their city councilors. But Cohen and his friends aren’t actually facing the tyranny of the majority. They’re facing instead the readiness of powerful insiders to channel discussion in specific directions, using, among other tools, their leverage over academic institutions. In other words, the old-fashioned forms of power haven’t lost their relevance in our twenty-first-century world– and even historians can feel their effects.


Why study history, Part 2: individual lives, collective knowledge

My last post talked about why we need historical knowledge. (Short version: history tries to see reality whole, with all the details, contradictions, and complexity left in, and we need that kind of thinking — because reality IS complicated, in ways that few academic disciplines acknowledge.)

So far so good, but then we hit the big cloud hanging over history education in 2014. “We” may need historical knowledge, but “we” don’t do the studying or pay the tuition or try to get jobs after finishing college. Individuals do all those things, and individuals have to live with the results. It’s all very nice and uplifting to say that people should study history, but what if there are no jobs for them? Why should students rack up fees and debts if there’s not much waiting for them after graduation?

What follows isn’t exactly an answer to that question; I’m not even sure there really is an answer, in the usual sense of the term. Instead, I present here some thoughts on the question itself, and suggest that we need to place it in larger contexts than we usually do. The “why study history” question, I’m saying, is really a question about how individuals, communities, and knowledge intersect in 2014.

The first step is to recognize the seriousness of the problem. The jobs situation for history graduates isn’t good, and it’s probably getting worse. Back in the good old days, meaning until about 1975, big corporations liked to train their own people, and they welcomed candidates with liberal arts degrees; it was understood that training would cost money, but that was an investment that eventually would pay big dividends. Anyway, liberal arts graduates could always fall back on teaching if business didn’t appeal to them.

Things are different today. Schools at every level are in financial trouble, and they’re not hiring many historians. In the corporate world, job candidates are increasingly expected to show up pre-trained and ready to contribute; no one expects them to stay around long enough for training programs to pay off, so HR departments favor people with career-ready educations, in economics, technology, health sciences, and the like. (See here for an account.) In these circumstances, a history major may be ok for those who don’t have to worry about jobs after graduation, or for those who can treat college as a preparatory course for professional programs like law. It’s not so great for those who need to start paying the bills right away.

In response, historians have publicized all the ways in which history actually is a good preparation for a real career in the real world. And we have some reasons for saying so– history courses teach you to analyze situations and documents, write clearly, think about big pictures, understand other cultures (something worth real money in today’s inter-connected global economy). Most of the history department websites I’ve visited (here for example) include some version of these claims.

The American Historical Association (the history profession’s official collective voice in the US) has taken this approach one step further. With the help of a foundation, it has set up a program (which it calls the Tuning Project) designed to bring college history teaching into closer alignment with employers’ needs, by putting professors in touch with employers and other stakeholders. If professors have a better understanding of what employers want, the hope is, we can better prepare students for the real world and draw more majors into our courses.

But you can see the problem: some parts of a history education give you the skills to work in a big-money corporation, but many others don’t. Some history topics require knowledge that’s hard to acquire and not much practical use in the twenty-first century– the dates of obscure wars, or the dead languages needed to understand some ancient civilizations. Other topics are likely to mark you as a dangerous malcontent. Picture a job seeker showing up at Mega Corporation X (or at the Chicago Board of Education, for that matter) with her senior thesis on union organizing in the 1930s, or the successes of Soviet economic programs, or Allied war crimes in World War II. Whatever her skills of analysis and cultural negotiation, she’s not the kind of candidate HR departments are looking for. She carries intellectual baggage; she looks like trouble.

That thought experiment suggests the possibility that “tuning” the history major actually means changing its content– by cutting out the troublesome (discordant?) elements, those that might upset our conventional wisdoms. Of course, you could argue that “tuning” just applies to teaching, and therefore doesn’t change the basics of historical knowledge. Professors still get to research and write about whatever they like; in their books, they still get to be intellectual adventurers and trouble-makers. But that’s no real answer, because as American institutions currently work, history teaching eventually shapes history research. If history majors aren’t studying unions or war crimes, universities aren’t going to be hiring faculty in those areas either, and those books won’t be written.

That’s bad news, because American society has a strong collective interest in making sure that this kind of knowledge gets produced. All societies need to think about difficult questions and disturbing ideas, for the reasons that John Stuart Mill laid out way back in the 1850s. Societies that fail to do so (he explained) do stupid and immoral things; they fail to develop intellectually or socially; even their economic lives suffer, since the choke-hold of conventional wisdom eventually stifles business too. For Mill, disruptive knowledge was as much a practical as a spiritual need.

But it’s not clear how this collective need is to be met by the American university as it increasingly functions nowadays. As the language of the individualistic free market becomes more prevalent within it, fields of knowledge risk being defined by calculations concerning “the employability of our graduates” (as a document from my own university puts it). Given the pressures that they face, our students are fully justified in focusing on their “employability,” and university faculty have a duty to help them toward it. But that’s not the university’s only duty. It has at least an equal duty to develop knowledge, including especially knowledge untuned to employers’ needs, even antithetical to those needs.

That means that eventually the “why study history” question shades into a political problem. Historical knowledge is a form of collective property, and its health is bound up with other elements of our communal life. In the increasingly privatized universities of our times– privatized in financing, mechanics, and measurements of success–the “why study history” question may not have an answer.