
Listening to Trollope

I’m a big fan of the mid-Victorian novelist Anthony Trollope.

For those who haven’t encountered him, Trollope combines in one package just about everything you might have heard about nineteenth-century English fiction. His novels are long, with dozens of characters, all of them prosperous, some of them super-rich. There’s a whole series just dealing with squabbles among Church of England clerics, and another series about a very rich duke. Typically the plots end in marriages, often between relatively poor young women and rich young men. There’s even fox-hunting, described in loving detail– Trollope himself rode and hunted.

So it’s not self-evident that Trollope would grab the attention of a leftist atheist rust-belt professor like me.

He does, though– partly because he’s among the all-time great writing technicians, able to delineate characters, emotions, and scenes in just a few lines, and partly because his sympathies are so deep. He persuades us that we can learn as much about the human condition from Archdeacon Grantly, Marie Goesler, the dukes of Omnium, and their friends as from anyone else in literature. As a bonus, he’s also an inspiration to anyone who writes even part-time. Trollope wrote for three full hours every morning before starting his day job; if he finished a novel before the time was up, he started in on the next one.

But there’s another reason for thinking about Trollope these days, more particularly for thinking about those six Church of England novels. One of their central themes is an experience that we in the twenty-first century university are reliving– the travails of tradition-minded institutions in rapidly-changing societies. Trollope has a lot to teach us about that situation.

To Trollope’s credit, he doesn’t sugarcoat or simplify his lessons. His fictional Church of England mixes good and bad qualities, and so do the individual clergymen who work for it. They do lots of good, but they also bask in their social privileges, and few of them work very hard; some are greedy, and most have moments of envy and anger. Weird inequalities run through the Church, because so many of its practices were set up centuries earlier. As vicar of Framley, Mark Robarts gets a fine income and a beautiful house; as perpetual curate of Hogglestock, the more learned, pious, and hardworking Josiah Crawley can barely feed his family.

Anyone familiar with the American university today can see the parallels. Like Trollope’s Church of England, we have our rich and poor institutions, our underpaid adjuncts and overpaid stars, our vanities and trivial feuds. We do some good, but we don’t do it very efficiently. At times, we cling to medieval arrangements that don’t serve obvious functions today.

The parallels are just as obvious when it comes to the challenges that face Trollope’s Church. In all the Church of England novels, calls for reform loom in the background, and sometimes they hit the characters hard. Just like our own university reformers, Trollope’s Church-reformers want to make the institution more useful to society. They’re troubled by its inefficiencies and failure to change with the times. They see privileged clergymen who are stuck in the past and don’t contribute to the real world. They see an institution that needs to bring its practices into line with its mission statements.

Sound familiar?

So one lesson from Trollope is that our contemporary debates about the university recycle old battles and old arguments. The talk we hear every day– about how the university needs to serve new social needs, deliver services more efficiently, modernize — is about 150 years old. That doesn’t necessarily make it false or irrelevant. But it does mean it’s not based only on observation and brave new thinking. It’s a trope, as they say in the English Department.

Trollope’s traditionalists don’t have good answers to the criticisms they face. Pretty much all they can come up with is some mix of “we’ve always done it this way,” “God wants it this way,” and “society will fall apart if we shake things up.” Nowadays, we professors mostly avoid the “God wants it this way” answer, but updated versions of the other two are still going strong.

But Trollope’s sympathies are with the traditionalists anyway, despite the feebleness of their self-defense, and over the course of the novels we readers come to look at them differently. He never fully explains this alternate view, and his characters seem unaware of it, but the point eventually comes through: in Trollope’s Church, ideas, preaching, and rituals matter less than the possibility the institution creates for living out certain values.

He illustrates that idea with two of his clergymen, the only 100-percent Christians he comes up with in the whole series of novels. They’re both more or less failures in real life. One is impossibly rigid and impractical, angering everyone he meets; the other is lovable but dithery and not very smart. Neither is much of a minister, though both of them try hard; neither would survive long without the sheltering protection of the Church. But their survival matters, because they walk the Christian walk, exemplifying for their friends what Christianity is really supposed to be. Without them, all we’d have would be the inert Christian texts, a series of empty formulas and content-free injunctions.

Values need to be lived, not just thought or believed in, Trollope tells us, and they can only be lived in the right conditions. Trollope’s Church provides those conditions.

Now, providing a safe space for saints isn’t a big concern for the twenty-first-century university. Over the years I’ve met one or two academic saints, but they’re not a key demographic in my world.

What is basic to my world is another version of Trollope’s encounter between inanimate texts and living human beings. For Trollope, the encounter was about living out the Bible’s demands. For us, it’s about keeping alive the texts and other forms of knowledge that make up civilization.

Of course, like the Bible, our civilizational texts will live on in the libraries and storage drives whether anyone looks at them or not. But they don’t really exist unless they’re read by actual human beings, living in the specific conditions of their own times. Read and grappled with in a serious way, meaning that someone has learned the languages and done all the other legwork that real understanding requires.

In different ways, this applies to every university discipline. Somebody has to do the heavy lifting– learn the technical vocabularies, dates, and names, practice arpeggios, perform experiments, study old manuscripts, travel to weird places.

Just like Trollope’s clerics, we knowers don’t see ourselves or what we’re doing very clearly, and we do a bad job of explaining ourselves to the outside world. When asked to explain all that reading, practicing, and experimenting, we talk about the importance of our “research,” the books we’re writing, the journal articles we’ve published. Then we’re left flat-footed when someone asks whether the world actually needs another new book on Shakespeare’s sonnets or the French nobility.  Why invest so much effort in a book that’s going to reach about six hundred readers?

Trollope points us to the real answer. We may not need the new books, and certainly the books themselves aren’t going to change the world. What we need are the years of studying and the strange kinds of knowledge that go into the books. We need people who walk the cultural walk, and we need quite a few of them if knowledge is going to survive in any meaningful way.

In our world, the university is the only safe haven for those people.

Disciplining the university, the Illinois edition

My last post talked about fear in the contemporary American university, as seen in the specific case of America’s Russia experts. Apparently many of them feel jumpy about expressing non-standard views, and that startled me. Even before we get tenure, most of us professors enjoy a fair amount of security; anyway, I’d always assumed (naively, it turns out) that full-spectrum discussions were welcome in foreign policy matters, especially those involving nuclear weapons and exotic languages. So I offered a little speculation about what’s going on and where today’s jumpiness fits into the longer history of American intellectual conformity.

Now a new case of professorial fear is in the news, and it gives a more direct look at the mechanisms that can produce it. It’s the case of Steven Salaita, who’d been offered a tenured position at the University of Illinois Urbana-Champaign, only to have the university’s Chancellor and Board of Trustees yank the offer months after it had been accepted.

I won’t go into the details, which have been widely reported in the academic news– you can find the story explained here, and here’s an eloquent comment on it by the historian Natalie Zemon Davis– but the main issue centers on tweeting. In a series of tweets, Salaita expressed in strong language his feelings about Israeli actions in the Gaza Strip, and the Illinois administration decided that his harsh tone crossed a line. He’d been “disrespectful” toward others’ views, and therefore shouldn’t work in UIUC classrooms. It’s relevant to the story that these were just personal opinions– Salaita’s teaching and research have nothing to do with Israel, Islam, Judaism, or Palestine (he’s a specialist in Native American studies). It’s also relevant that big donors and the university’s fundraising office sought to influence the administration’s decision in the matter (as shown here). Whether or not their threats counted, the donors gave it their best shot.

So the minimum Illinois story is that an academic can get into serious trouble for expressing personal opinions, and that university administrators face serious pressure from donors to ensure intellectual conformity. That minimum story is bad enough, and it’s hard not to suspect that there’s worse behind it. The university administration insists it’s not pushing specific views of Israeli policy, and that its only concern is the tone in which dissenting ideas are expressed– but would they have dumped Salaita if he’d said mean things about Vladimir Putin or Urban Meyer?

You don’t need to pursue those suspicions, though, to see the real menace here, namely the civility standard itself. It’s a trap, a way for powerful institutions to enforce intellectual discipline while pretending to encourage discussion. John Stuart Mill explained it all 150 years ago. Mill pointed out that we can’t set the boundaries of “temperate” and “fair” debate; “if the test be offence to those whose opinion is attacked, I think experience testifies that this offence is given whenever the attack is telling and powerful, and that every opponent who pushes them hard, and whom they find it difficult to answer, appears to them, if he shows any strong feeling on the subject, an intemperate opponent.”

In other words, real disagreement is going to include harsh language and hurt feelings, and the cult of civility is a way of preventing disagreement from getting too real.

The Salaita case shows that mechanism of repression working at full steam.

The intellectual in America: an example

Here’s an example of the intellectual’s situation in contemporary America, courtesy of the American historian of Russia Stephen F. Cohen. Cohen is about as well-established a figure as could be imagined (some details here). After thirty years as a professor at Princeton, he now teaches at NYU; he’s published lots of scholarly books and received important honors; he seems to have plenty of money and reasonable access to the media. If anyone should feel secure about expressing opinions, it’s people like Cohen.

So it’s a shocker when he tells us about self-censorship within his well-informed, well-protected milieu. Cohen’s particular concern is American policy toward Russia, an issue on which he has spoken eloquently and courageously, but the details here matter less than the intellectual climate that he describes.

In that climate, he tells us, “some people who privately share our concerns” in “the media, universities and think tanks—do not speak out at all. For whatever reason—concern about being stigmatized, about their career, personal disposition—they are silent.” As for young scholars, those “who have more to lose,” Cohen himself urges silence. He reports telling junior colleagues that “‘American dissent in regard to Russia could adversely affect your career. At this stage of life, your first obligation is to your family and thus to your career. Your time to fight lies ahead.’”

This is a seriously depressing account, because Cohen isn’t even talking about outsider radicals, ethnic leaders, or potential “extremists,” the kind of people that the New York City Police Department might put under surveillance (see here and here for examples). He’s only discussing well-trained experts like himself, who work for well-defended, rich institutions. His friends have connections, their opinions fall within the spectrum of American common sense, and the subjects they study have major-league practical relevance. After all, we really don’t want to screw up our relations with another heavily-armed nuclear power. We want to get the story straight, and critical debate contributes to doing that.

Yet fear reigns even in this corner of the academic arcadia. At a minimum, Cohen tells us, university professors wait for tenure before expressing an opinion; until then, they shut up. Many of their elders apparently continue shutting up even after the immediate pressure eases, whether because there are still career steps to climb or for more personal reasons.

In some ways, of course, this is just an updated version of an observation that Alexis de Tocqueville made long ago. “I know of no country,” Tocqueville reported in Democracy in America, “where there prevails, in general, less independence of mind and less true freedom of discussion than in America…. In America, the majority draws a formidable ring around thought. Within those limits, the writer is free; but woe to him if he dares to go outside it.”

But there’s also something more sinister in the story that Cohen tells. Tocqueville believed that American democracy explained the problems he detected. “The tyranny of the majority” (he invented the phrase) ensured that non-conforming opinions wouldn’t be heard, because Americans (metaphorically) voted on their ideas just as they (really) voted on their city councilors. But Cohen and his friends aren’t actually facing the tyranny of the majority. They’re facing instead the readiness of powerful insiders to channel discussion in specific directions, by using among other tools their leverage over academic institutions. In other words, the old-fashioned forms of power haven’t lost their relevance in our twenty-first century world– and even historians can feel their effects.


Why study history, Part 2: individual lives, collective knowledge

My last post talked about why we need historical knowledge. (Short version: history tries to see reality whole, with all the details, contradictions, and complexity left in, and we need that kind of thinking — because reality IS complicated, in ways that few academic disciplines acknowledge.)

So far so good, but then we hit the big cloud hanging over history education in 2014. “We” may need historical knowledge, but “we” don’t do the studying or pay the tuition or try to get jobs after finishing college. Individuals do all those things, and individuals have to live with the results. It’s all very nice and uplifting to say that people should study history, but what if there are no jobs for them? Why should students rack up fees and debts if there’s not much waiting for them after graduation?

What follows isn’t exactly an answer to that question; I’m not even sure there really is an answer, in the usual sense of the term. Instead, I present here some thoughts on the question itself, and suggest that we need to place it in larger contexts than we usually do. The “why study history” question, I’m saying, is really a question about how individuals, communities, and knowledge intersect in 2014.

The first step is to recognize the seriousness of the problem. The jobs situation for history graduates isn’t good, and it’s probably getting worse. Back in the good old days, meaning until about 1975, big corporations liked to train their own people, and they welcomed candidates with liberal arts degrees; it was understood that training would cost money, but that was an investment that eventually would pay big dividends. Anyway, liberal arts graduates could always fall back on teaching if business didn’t appeal to them.

Things are different today. Schools at every level are in financial trouble, and they’re not hiring many historians. In the corporate world, job candidates are increasingly expected to show up pre-trained and ready to contribute; no one expects them to stay around long enough for training programs to pay off, so HR departments favor people with career-ready educations, in economics, technology, health sciences, and the like. (See here for an account.) In these circumstances, a history major may be ok for those who don’t have to worry about jobs after graduation, or for those who can treat college as a preparatory course for professional programs like law. It’s not so great for those who need to start paying the bills right away.

In response, historians have publicized all the ways in which history actually is a good preparation for a real career in the real world. And we have some reasons for saying so– history courses teach you to analyze situations and documents, write clearly, think about big pictures, understand other cultures (something worth real money in today’s inter-connected global economy). Most of the history department websites I’ve visited (here for example) include some version of these claims.

The American Historical Association (the history profession’s official collective voice in the US) has taken this approach one step further. With the help of a foundation, it has set up a program (which it calls the Tuning Project) designed to bring college history teaching into closer alignment with employers’ needs, by putting professors in touch with employers and other stakeholders. If professors have a better understanding of what employers want, the hope is, we can better prepare students for the real world and draw more majors into our courses.

But you can see the problem: some parts of a history education give you the skills to work in a big-money corporation, but many others don’t. Some history topics require knowledge that’s hard to acquire and not much practical use in the twenty-first century– the dates of obscure wars, or the dead languages needed to understand some ancient civilizations. Other topics are likely to mark you as a dangerous malcontent. Picture a job seeker showing up at Mega Corporation X (or at the Chicago Board of Education, for that matter) with her senior thesis on union organizing in the 1930s, or the successes of Soviet economic programs, or Allied war crimes in World War II. Whatever her skills of analysis and cultural negotiation, she’s not the kind of candidate HR departments are looking for. She carries intellectual baggage; she looks like trouble.

That thought experiment suggests the possibility that “tuning” the history major actually means changing its content– by cutting out the troublesome (discordant?) elements, those that might upset our conventional wisdoms. Of course, you could argue that “tuning” just applies to teaching, and therefore doesn’t change the basics of historical knowledge. Professors still get to research and write about whatever they like; in their books, they still get to be intellectual adventurers and trouble-makers. But that’s no real answer, because as American institutions currently work, history teaching eventually shapes history research. If history majors aren’t studying unions or war crimes, universities aren’t going to be hiring faculty in those areas either, and those books won’t be written.

That’s bad news, because American society has a strong collective interest in making sure that this kind of knowledge gets produced. All societies need to think about difficult questions and disturbing ideas, for the reasons that John Stuart Mill laid out way back in the 1850s. Societies that fail to do so (he explained) do stupid and immoral things; they fail to develop intellectually or socially; even their economic lives suffer, since the choke-hold of conventional wisdom eventually stifles business too. For Mill, disruptive knowledge was as much a practical as a spiritual need.

But it’s not clear how this collective need is to be met by the American university as it increasingly functions nowadays. As the language of the individualistic free market becomes more prevalent within it, fields of knowledge risk being defined by calculations concerning “the employability of our graduates” (as a document from my own university puts it). Given the pressures that they face, our students are fully justified in focusing on their “employability,” and university faculty have a duty to help them toward it. But that’s not the university’s only duty. It has at least an equal duty to develop knowledge, including especially knowledge untuned to employers’ needs, even antithetical to those needs.

That means that eventually the “why study history” question shades into a political problem. Historical knowledge is a form of collective property, and its health is bound up with other elements of our communal life. In the increasingly privatized universities of our times– privatized in financing, mechanics, and measurements of success–the “why study history” question may not have an answer.


Why study history, Part 1: The case of the disillusioned history major

Why study history? It’s a good question. Studying history means learning about dead people, by-gone situations, wrong-headed ideas, funny clothes. But that’s not where we live; the knowledge we need for real life deals with the here-and-now, not the over-and-done. That’s always been true, but it’s especially pertinent nowadays, as we face possibilities and troubles that never worried our ancestors. With self-aware, armed robots coming off the assembly lines any day now, should we really be worrying about Louis XIV and Julius Caesar? Wouldn’t that time and energy be better spent on our own problems?

Every historian has to take that question seriously. We get paid to think about the past, and we ask our students to spend at least a few hours a week thinking about it too– we’d better have some answers as to why it’s worth their time and ours.

Here I offer a few of my own answers by describing a case of rejection– the case of a smart guy who encountered history, didn’t like what he found, and turned elsewhere for intellectual sustenance.

The smart guy is Paul Krugman, one of America’s most important intellectuals. Krugman is a rare figure in the United States, both a star professor (having taught economics at MIT and Princeton, he’s now moving to CUNY) and at the same time someone who tries hard to reach ordinary readers, with a regular column in the New York Times and a widely-followed blog. His columns and blog posts show him to be literate, progressive in his politics, and deeply engaged by the real issues of contemporary life.

So for anyone trying to understand where historical study fits in today’s world, it’s interesting to learn that Krugman came to college expecting to major in history. What he found disappointed him. As he explained in a New Yorker profile a few years ago, his idea of history had been shaped by his teenage reading of Isaac Asimov’s Foundation series. Those books show history to be a difference-making intellectual tool. Having carefully studied the past, Asimov’s heroes (a small band of historians armed with math skills and big computers) can see where their society is heading, long before the people around them have a clue.

Not only can the heroes see the deep processes governing their world, they can act on their knowledge. Centuries in advance, they secretly establish mechanisms that will head off the worst of the troubles they’ve diagnosed, and their plans work; trouble emerges just as they’ve predicted, but since they’ve seen it coming, their pre-programmed remedies are effective. They can’t prevent all the disasters, but they do enough to improve billions of lives.

But Krugman discovered as a Yale freshman that university history didn’t match up with the history he’d read about in Asimov. Just the reverse– his history courses seemed to offer only a mass of complexities and uncertainty. They didn’t elucidate those deep processes of change that Asimov’s historian-heroes had grasped, and they offered no hope of guiding society to a better future. “History was too much about what and not enough about why,” the New Yorker article explains. In his disappointment, Krugman switched departments, and in economics he found what history had failed to deliver. Unlike history, he found, economics explored the basic rules by which societies function; it offered the kind of knowledge that could guide effective action. “Suddenly, a simple story made sense of a huge and baffling swath of reality,” is the journalist’s summary of Krugman’s intellectual experience.

Ordinarily, teen reading and freshman disappointments wouldn’t count in assessing an intellectual’s views; we all get a free pass on our adolescent hopes and dreams. But Krugman himself offers Asimov as a key to understanding his grown-up intellectual life as a professional economist. He acknowledges, of course, that even as an economist he can’t attain the predictive clarity that Asimov’s historian-heroes managed, but economics allows him to play something like their role. Using economic reasoning, we can get beneath the surface mess of life, understand its essential mechanisms, and use that knowledge to nudge society’s development in the right directions. Unlike history, economics both explains the world and provides a basis for improving it.

So Paul Krugman’s autobiographical reflections contrast the robust and usable knowledge of economics with the baffling jumble of detail produced by historians. It’s a critique that matters because it starts from assumptions that most historians share. These are the complaints of an intellectual, someone who believes in what universities do, not the standard ignoramus harrumphing about tenured radicals and useless research.

And Krugman’s critique gets at something fundamental about history, and about where it diverges from other kinds of knowledge. He’s right that we historians are interested in everything that reality presents; no other academic discipline has as much invested in exploring the details of life, and none has the same belief that all the details deserve our attention. We fuss about getting the dates right, and we write whole books about individual lives and brief episodes, some famous, many of them previously unknown. We want to tell all the stories that humanity has preserved, even the stories of lost causes and dead-end lives. That’s the historian’s paradox: we study the dead because that’s where we can explore all the dimensions of life.

That absolute commitment to the real, in all its chaotic variety, is the defining characteristic of historical thinking, and Krugman is right to zero in on it.

But he’s wrong in seeing history as “too much about what and not enough about why,” and that mistake is also important. The real difference concerns HOW historians and economists understand causation, not how much importance they attach to asking why things happen. The New Yorker profile makes that clear in summarizing Krugman’s freshman studies of ancient slavery and medieval serfdom: in explaining them, his history teachers talked “about culture and national character and climate and changing mores and heroes and revolts and the history of agriculture and the Romans and the Christians and the Middle Ages and all the rest of it;” his economics teacher explained the same phenomena with one simple calculation about agricultural productivity, population, and land availability.

So in fact Krugman’s complaint wasn’t that history didn’t say “enough about why,” but that it said too much– in his history course there were too many explanations, involving too many areas of life, too disconnected from one another. His economics professor replaced that long list with one causal rule, which applied in all situations. For the young Krugman, it was easy to choose between these two modes of understanding: the simpler explanation was more powerful and more useful– and therefore better. The mature Krugman agrees.

A few years ago, I would have said that we don’t have to choose between these two visions of causation, and that economics and history are just two ways of thinking about the world, each useful for some purposes, each with strengths and weaknesses. Sometimes we need a simple and elegant formula for understanding whatever it is we’re dealing with, and sometimes we need to wallow in the complications. That’s true in our personal lives, and just as true in our thinking about human societies. It’s why universities have lots of departments, and why we push students to take a wide range of courses.

But in today’s world, it’s difficult to stay so tolerant, because economics today is not just one academic area among many or a toolbox of handy techniques for thinking about business. It presents itself instead as a master discipline, a way of thinking that can illuminate all areas of our lives. Its methods and assumptions now show up in education, law, family relations, crime prevention, and pretty much everywhere else. Its terminology is also everywhere, and we scarcely notice anymore when a political leader tells us to apply marketplace solutions to some new area of public life. (For a cogent account of this “economics imperialism,” see here.)

With economic thinking taking over so much of modern life, the “we don’t have to choose” option isn’t really on the table. Others are trying to choose for us, and it’s important to push back — and not to be too polite as we do so.

I’ve chosen Paul Krugman to push back against here, but not because he’s an especially extreme advocate of the economics take-over. On these issues he’s probably the best that mainstream economics has to offer, an economist who seems alert to the arguments, values, and concerns that other ways of thinking offer. But that’s really the point: it’s often the best representatives of a worldview that we should question, not the worst.

And the fundamental question for Paul Krugman is, why should we take seriously any simple story about human affairs, let alone prefer it to a more complicated story? In the dream world that Isaac Asimov cooked up, sure, but in our world of real human beings? Krugman’s college economics teacher could offer a one-step formula that “made sense of a huge and baffling swath of reality”– but fairy tales and paranoiac fantasies do the same thing. In those cases, we all know, “making sense” of a phenomenon isn’t the same as getting it right. Ditto for explaining our own or our friends’ behavior: it’s usually not a good sign when we give a one-sentence answer explaining a divorce or a workplace event.

Why would we apply different standards to a phenomenon like slavery, which brings together so many forms of human motivation: self-interest, racism, anger, sex, religion, culture, tradition, the love of domination, and much more?

We need history for exactly the reasons that the young Paul Krugman found it so frustrating. Reality is complicated, and its complications press in on us, confronting us daily with lusts, angers, and all sorts of other forces that don’t make it into Asimov-style fiction.  For some purposes and in some limited contexts, we can get away with pretending that those forces don’t matter. We can’t pretend for very long, though, before reality starts biting back. Historical study won’t tell us what to do when it does, but at least it will give us a serious idea of what’s out there.