
Ways we live now: the Volkswagen scandal and modern capitalism

With so many crazy things going on these days, who even remembers the Great Volkswagen Scandal of 2015? That’s the one where they caught VW faking pollution performance in its diesel engines– and not just messing with the paperwork, either. VW actually installed a whole extra system in eleven million of its cars, allowing them to detect when they were in the shop for state emissions tests. While hooked up to the inspection machines, the cars ran in low pollution mode and met government standards; once back on the road, they went back to standard performance– which meant sending out forty times as much pollution as in fake-out mode.
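For readers who want the mechanism spelled out, here’s the defeat-device logic reduced to a caricature. Everything in this sketch is a hypothetical illustration– the function names, the detection signals, and the clean round numbers are mine, not VW’s actual code– though the real device did reportedly infer test conditions from cues like steering-wheel movement and standardized speed traces.

```python
# Hypothetical sketch of defeat-device logic -- illustrative only, not
# VW's actual implementation. On a dynamometer the drive wheels spin
# while the steering wheel stays still, and the speed trace follows a
# standardized test cycle; signals like these let software guess it is
# being inspected.

def looks_like_emissions_test(steering_wheel_moving: bool,
                              speed_matches_test_cycle: bool) -> bool:
    """Guess whether the car is on an inspection dynamometer."""
    return (not steering_wheel_moving) and speed_matches_test_cycle

def relative_nox_output(in_test_mode: bool) -> float:
    """Relative NOx emissions: 1x with full controls on, roughly 40x
    (the figure reported for the VW case) with controls dialed back."""
    return 1.0 if in_test_mode else 40.0

# In the shop: wheels spin, steering wheel still -> full controls, passes.
in_shop = looks_like_emissions_test(steering_wheel_moving=False,
                                    speed_matches_test_cycle=True)
assert relative_nox_output(in_shop) == 1.0

# On the road: steering wheel moving, no test cycle -> forty times worse.
on_road = looks_like_emissions_test(steering_wheel_moving=True,
                                    speed_matches_test_cycle=False)
assert relative_nox_output(on_road) == 40.0
```

The point of the sketch is how little code the cheat requires once the detection signals exist– which is also why the scandal’s real scale lay in the engineering and deployment around it, not in the switch itself.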

It’s not exactly in the spirit of blogging to bring up a months-old scandal– but I keep thinking about the VW case, because it says so much about how crazy contemporary capitalism has become. Also, about the craziness of our responses to contemporary capitalism.

As a partial list of what’s so crazy about the case, consider the following:

1) The scale of the operation. It’s a big deal to invent a cheat system and install it in eleven million cars. VW has fallen back on the “it was a few bad apples/rogue executives” defense we hear so often these days, but that’s pretty implausible. An operation like this required research and development, changes to assembly lines, major expenditures, all going on over several years.

2) The Germany thing. Being known for quality manufacturing is a very, very big deal to Germany. Of course the “made in Germany” brand is an economic tool, which helps the country sell its goods worldwide and charge premium prices for them, but it’s also part of the larger German psychology, really a nation-wide brand name. The idea is, Germans do things right, they don’t take short-cuts, their products deserve your trust. It’s the mindset that allows German politicians to lecture Greeks, Italians, and others about being lazy, slovenly tax evaders who can’t generate trade surpluses.

So you might think an iconic German company would hesitate to put all that at risk; even if companies occasionally did so, you’d think, the German authorities themselves would be extra vigilant about misbehavior that threatened the national brand. Apparently not.

3) The objective. Since I’m not a car guy, my handle on the technicalities is weak. But as I understand it, the cars worked just fine in low-pollution mode, they just performed better when spewing (lots) more pollutants– more pep, more power, more responsiveness, better fuel economy. In other words, VW was ready for crime if that’s what it took to make its customers marginally happier.

4) The stakes. Of course, the costs included more than just violating various countries’ laws. There’s also the real-life impact of all that extra pollution– premature deaths (about sixty in the US, so the researchers guess, and many more in Europe, where diesels are more popular), and of course contributing to the destruction of the planet. Think of it as human sacrifice on the altars of high performance and customer satisfaction.

5) Professors to the rescue. The story only came to light because complete outsiders looked into the case. A group of professor/researchers at West Virginia University bought a handful of the cars and did their own tests, out on the open road– only after they published their findings did the relevant regulatory agencies get involved, and only after that did VW itself take steps. (Steps which are still dragging along, by the way– VW still hasn’t replaced or refitted the engines.)

So among other things, the VW Scandal is a lesson about the benefits of tenure and the other protections the American research university provides its researchers. Thousands of people within VW and thousands more outside must have known what was going on, but there were zero whistle-blowers on this one– not surprisingly, when you think about what happens to most corporate whistle-blowers. Nowadays, the universities are the last refuge for this kind of free-range, trouble-making, profit-threatening research. Most everywhere else, raising this kind of question means losing your job.

Now, there’s a tendency to think about cases like this in terms of corporate psychology and ethics. You’ve probably seen stories like this one, which asks what all those people were thinking– how could they have signed off on the VW program, given its immoral elements?  Or if you haven’t seen VW-relevant stories, you’ve seen them about other corporate ethical disasters, like our recent banking scandals. Which is fine, I’m all for moral improvement, but I think that misses the real message of the Great VW Scandal.

Instead, the real message is just how completely capitalist calculation now trumps other possible ways of thinking– like, say, worrying about right and wrong, or about humanity’s survival, or your country’s reputation, or even the long-term health of the corporation itself. It all pretty much follows the Karl Marx script. Capitalism works as total Darwinian war, meaning that all options are on the table, escalation is always possible, and in the long run there are no safety zones. And of course, Marx would have been pleased to hear the rumors that other car companies have been doing the same thing. He’d say, that’s just how capitalism works.

Maybe it’s time to take the old guy a little more seriously.



Military education is to education as… Update!

A few days ago, I posted some thoughts on American military education, as exemplified by our army academy at West Point. It’s a topic that’s worth some attention. Out there in the world, we’ve now got troops in about 150 countries– people in all of them should be wondering how American generals think. Meanwhile here at home the military enjoys amazingly high levels of respect. Apparently 70 percent of us think our military officers want what’s best for the country, as opposed to what’s best for themselves; only 32 percent think that about civil servants, and for members of congress the number drops to 12 percent. Enjoying respect like that, it’s no wonder values and ideas spread outward from our officer corps to the rest of society.

So we need to think hard about how good those ideas and values are likely to be, and my post suggested that they’re not so good. The young people going through West Point don’t get experience in thinking hard about any particular thing, because they’re moving too fast, with only brief stops at all kinds of knowledge service stations, and no time at all for unstructured reflection. It’s no surprise that once they’re out in the field, they aren’t so hot at understanding complicated societies like Afghanistan. As they get older and become opinion-makers on matters of international politics, like David Petraeus, the effects are even worse.

Before I’d even finished writing up my thoughts, a new example popped into the news that should make us worry even more about what the hell goes on at West Point. It’s the case of William Bradford, an assistant professor at West Point whom The Guardian covered a couple of weeks ago. It turned out that Bradford had published an essay calling some American law professors “an Islamist Fifth Column” that the military can legitimately target/kill; the professors’ law schools are fair game too, as long as our soldiers try not to kill any innocent by-standers. Apparently the essay doesn’t name names, but it does supply an estimate of how many fifth-columnists need to be dealt with– about forty nation-wide.

One thing you can say for West Point, they moved fast to end the Bradford scandal– he resigned almost immediately after the Guardian story broke; the academy also explained that he’d written up these opinions before joining its faculty, so the army couldn’t be held responsible for them. In other words, the military’s excuse is that the article was all written up before they hired the guy– meaning, they could have reviewed it, asked some questions, looked at some other candidates.

For all I know that’s what actually happened, and we should take Bradford as expressing out loud what the rest of the military is thinking silently– maybe we should be really scared, rather than just appalled. But my money’s on this being just another example of the speed-obsessed superficiality that my last post described. This is the kind of thing that happens in an organization that doesn’t give its members any unstructured time.

“They also serve who only stand… and think.”

Military education is to education as military music is to music

A few years ago, my daughter got me the book Absolutely American: Four Years at West Point. The amazon.com blurb describes it as “a thrilling portrait of a unique institution and those who make up its ranks,” which is actually not too wildly overstated– at least, it’s a very good book. In classic New Journalism style, the author rents a house near the West Point campus and follows a group of cadets (a small group, but they seem representative) through their whole time there. Also in classic style, the author starts out a hardened wise-guy/big city cynic, and ends up really, really liking the young people he meets. He’s impressed at how demanding the education is, how many challenges the cadets overcome, and how fundamentally decent they are to one another. He’s also impressed at how they grow up in the course of their time there.

Now, we’re a pretty unmilitary family, and the only person I know who’s gone to a service academy is my ex-brother-in-law, who got kicked out after one semester at the Merchant Marine Academy. But like most everyone else, we’re suckers for young-people-growing-up-and-meeting-challenges narratives and for stories about the romantic, “unique institutions” that make the growing-up happen– whether it’s Rugby School, LA’s Garfield High School, or Hogwarts. West Point fits right into the series, making Absolutely a seriously feel-good read.

At least until you start thinking about the details of West Point education, and then you start to realize something weird: these young people barely have time for the washroom, let alone for any thinking about whatever it is they’re learning. Basically, they’re up before dawn (6 am), then running non-stop from class to class, activity to activity until bedtime. There are two or three hours of free time in the course of the day, but mostly that gets used up on sports, room cleaning, and other kinds of prep work. Maybe some of these kids find time in there for getting excited about a weird novel or writing project, or talking about ideas– but they’re much more likely to use their few spare minutes on pure escapism. Anyway, with “almost every facet of life being graded” (in the words of a recent cadet), there’s not much payoff in unstructured activity.

I assume the concept is, when West Point graduates are out fighting our enemies, they’re not going to have time for random reflection or enriching reading. They’ll have to think and decide fast, without the benefit of a good night’s sleep or a research library, and the West Point atmosphere of constant busy-ness is supposed to prepare them for that. Of course they’ll also have to be physically fit, and West Point’s sports requirement (everybody has to play some organized sport) prepares them for that too.

It sounds convincing, but is it really a good idea to have military leaders who’ve basically never had the experience of thinking seriously about something? By which I mean partly unhurried thinking, with time to argue things out and pursue loose ends. At least for the last fifty years, the think-fast-not-deep thing just hasn’t worked very well– as I’ve pointed out before, the American military is on a long losing streak, despite having more firepower than the rest of the world combined.

That’s bad in itself, but it seems to me the real reason for worry is that this West-Point approach is seeping out beyond the military itself, into American society at large. On the one hand, our military leaders aren’t content with the military domain any more; instead they turn up in high civic offices, running the CIA and the like, and in the media, where they hold forth about the state of the world and what we should do about it. Even David Petraeus has recently returned to advice-offering, pushing for various strategies in dealing with ISIS. According to CNN, “many in the foreign policy establishment still seek out his views, so his proposal will no doubt be taken seriously.”

Meanwhile our education reformers sound eager to bring some of the West Point spirit to our beleaguered schools. We hear about the need for frequent testing and clear goals, for both students and teachers; unstructured activity can only derail those objectives. Every so often there’s even a push to bring back school uniforms, though thankfully that seems to have died down for the moment. What hasn’t died down is the sense that young people need to pack in more activities, and whatever gets in the way of those is just an obstacle to good education.

So my suggestion is, thinking about our sorry military score-card isn’t just for military historians or policy geeks. All those wars we’ve lost against weaker opponents suggest that West Point-ism doesn’t even work on the battlefields it was designed for. Why would we expect it to work elsewhere in society? Why are we making students’ lives more constantly busy, rather than less?

Telescopic philanthropy and the modern university

Over the last year, I’ve offered occasional thoughts about the role of philanthropy in today’s world, looking mostly at the billionaires (like Bill Gates and George Soros) doing the giving. It seemed– actually it still does– weird and interesting that these guys would concern themselves with how historians and other educators spend their time, and I offered a few possible explanations.

But lately I’ve been more interested in the other side of that story, meaning the enthusiasm that some professors are starting to show about big-money philanthropy. It’s an enthusiasm that says some interesting things about how the university and its humanist professors fit into the twenty-first-century world.

My example is Peter Singer, who’s probably today’s most visible philanthropy-fan professor. Singer’s a world-famous Australian philosopher, who’s taught at Princeton since 1999 (Wikipedia quotes a colleague describing him as “almost certainly the best-known and most widely read of all contemporary philosophers”). He’s been pushing philanthropy for a long time, but lately it’s become his primary focus, and he’s signed on to a view called “effective altruism.” The basic idea is: all over the world, there are people in life-and-death need, and the rest of us have a duty to do everything we can to rescue them. “Everything we can” includes giving as much as we can, but also being smart about it, by making our dollars go as far as possible (that’s the “effective” side of the equation). That includes selecting charities that function efficiently, with low overhead costs and modest offices, and orienting our altruism to the truly desperate– sending food to starving Africans counts way more than (say) endowing book purchases at the local library.

Most interesting of all, those on the donor side have a duty to organize their own lives for maximum altruistic effectiveness. Singer offers an example from his Princeton classrooms, that of a brilliant young student who (influenced by Singer’s teaching) decided against a fast-track career in academic philosophy and instead went to Wall Street; he reasoned that all the extra money he’d earn there would make him a far more effective altruist. Singer has only praise for this career switch, which will allow the young man “to save a hundred lives” in his first year or two out of school, way more than would have been possible on a professor’s salary. Career choices like these (Singer assures us) form part of “an exciting new movement” that’s sweeping elite universities worldwide; they show philosophy “returning to its Socratic role” of shaking up our ideas about the good life, dramatically transforming students’ lives, and making “the world a better place.”

From all this glossy talk of innovation and excitement, you wouldn’t know there are important criticisms of Singer’s approach, but they’re out there. As numerous observers point out, big philanthropy undermines democratic values, by giving individual donors decision-making power over what society as a whole gets. It reinforces social hierarchies, by dividing the world between big-hearted givers and weakling takers and (as in Singer’s version) by giving extra moral status to big money. It tends to destroy public institutions, as their survival increasingly depends on pleasing a few wealthy donors. And it often has more directly destructive consequences. Sending free American food to Africa sounds great, but if it destroys African farming (how do farmers stay afloat when NGOs distribute free food in the marketplace?) and subsidizes American agribusiness (from whom NGOs purchase the “free” food), is it really helpful? Is this about “saving lives” or “clearing out small farmers so that multi-nationals can step in?”

Singer’s Wall-Street-bound student offers an extreme form of this last problem, because whatever his exact big business role, it would take some deep calculations to know whether his donations counter-balance the harm he may be doing– especially since some of that harm won’t show up for many years. Do his investments contribute to climate change, for instance, or to carcinogenic industrial processes? Do his clients use the money he makes for them to lobby against health and safety regulations, or against old age pensions? It may be two or three generations before we know the costs and benefits, even if we add up only the lives saved and lost.

I’ll leave the list of criticisms there, since I find them so compelling (for a more complete and careful discussion, see this terrific article by the philosopher Matthew Snow). Instead, I want to think a little about what Singer and “effective altruism” tell us about the modern university and its culture.

First, we might note the strange combination of qualities that Singer’s story attributes to the university itself. There’s an element of professorial megalomania in the account– Singer presents himself as a new Socrates, shaking up philosophy and redirecting the polis, and he sees world-improvement starting in university classrooms, rather than among, say, the poor themselves. But there’s also some startling self-abasement in his story, because its real heroes are the wonderful undergraduates just passing through the university, on their way to making money and improving the world. Think of it as an updated version of F. Scott Fitzgerald‘s glamor-Princeton, now featuring ethically sensitized young people rather than Jazz Age partiers. The professors function as their life-coaches, and the other figures in the university– graduate students, librarians, researchers, and the like– don’t make it into the picture.

Then there’s the strange historical shallowness in Singer’s account– meaning, it offers no hint that debate about philanthropy has been raging for about 250 years, and in that time the nay-sayers have landed some pretty good punches. The great nineteenth-century novels are full of philanthropists, most of them horrifying. (Think Charles Dickens’s Mrs. Jellyby, whose “telescopic philanthropy” centers on sending colonists to Africa while her own family sinks into ruin and degradation, or Charlotte Bronte’s Mr. Brocklehurst, bullying impoverished young women.) Even earlier, the French thinker/politician Anne-Robert-Jacques Turgot pointed out how silly and self-serving most charitable donations look to later generations. In the 1960s and 1970s, Michel Foucault explored the structural linkages between philanthropy and power, showing that even the slam-dunk do-gooder projects have come with very heavy baggage.

Now, presumably a smart, learned guy like Peter Singer knows all this, and he may reason that it’s not his job to argue against his own views– that’s for those of us who disagree with him. But whatever full disclosure duties he may have, we can still notice the peculiar firmness of his non-historicity. Here’s another world-class professor implicitly telling the public that the past doesn’t matter, we don’t need to think about it, let’s just focus on the bright, shiny future.

Which leads to a last peculiarity in Singer’s story, the politics. Singer presents himself as a leftist, and he was even a Green Party candidate back in Australia. Clearly he’s desperately concerned about the state of the world today, and really wants to improve people’s lives. Yet when we’ve stripped it down, “effective altruism” looks an awful lot like a celebration of capitalism and the well-paid folks who make it run. It’s a deeply flawed system, seems to be Singer’s position, but it’s not going away, and so we have to work with it.

As I’ve said before, the neo-cons who fume about tenured radicals in the universities can probably relax.

Mysteries of the classroom

What happens when we teach? It’s a more peculiar process than you might think.

To illustrate, here’s one of those True-Stories-That-Are-Also-Parables we writers like so much: My first term in graduate school, I landed in a research seminar taught by an old guy nearing retirement. Apparently he’d been hot stuff back in the 1930s, but in 1968, not so much. He hadn’t published anything in years, and from our fifteen weeks together I can remember only two classroom moments. One came when he hauled out from his desk some photocopies he’d made years earlier, of a seventeenth-century ship-building contract– he thought maybe he could publish the details in a model-ship-building magazine. The other time, he showed up twenty minutes late, wet from the rain, and mad because he couldn’t find on-campus parking. It was like a laboratory demonstration of Clark Kerr’s joke about the duties of a university president– to provide football for the alumni, sex for the undergraduates, and parking for the faculty. There were only four of us students. We crammed into his office for a couple of hours every week, listened to him ramble on, then went home and did our research.

So this was about as hopeless a teaching set-up as you could find– yet the seminar turned out to be quite the big deal for us students. All four of us wrote seminar papers that turned into dissertations, and all the dissertations became respectable university press books; we all got decent academic jobs, though one of us bailed out of the profession (and into law school) before getting tenure.

Every parable needs a moral, so here’s mine: we see part of what goes on in a classroom, but there’s lots we don’t see– invisible forces swirling around, electrical charges sparking and fusing at subatomic levels, multiple temporalities colliding and redirecting one another. Had there been teaching evaluations back then, we’d have given Professor X F-minuses, and the deans would have packed him off to teacher remediation boot camp. But then, check out his “learning outcomes”– meaning, did the course give us students what we signed up to get? That’s the big metric now in vogue among administrators, and according to it Professor X belongs in the Teaching Excellence Hall of Fame.

Now, I understand the objections to making anything much of my story. It’s just one example, and graduate school is a peculiar business; we were bright, proto-professional keeners, not disaffected freshmen. Anyway, it was a long time ago, when the world had more room for ineffectual bumblers like Professor X.

All fair enough, but I’ve encountered less dramatic examples of the Professor X story all my life– bad teachers from whom I learned a lot, certified teaching stars who left me bewildered and/or scared. Of course some of those stars were the obvious fakes, the bombastic performers with nothing to say, but some were the real deal– I just wasn’t ready for what they were offering. I see the same things happening these days among the students I encounter.

Obviously that explains some of the subatomic interactions going on in the classroom. People need different teaching at different moments in their lives. Sometimes a conscientious and brilliant instructor overpowers and discourages; sometimes the chance to feel snooty and superior about your teacher (as we did with Professor X) is just what the doctor ordered– it encourages intellectual adventurism, or just helps you survive a difficult stretch like the first weeks of graduate school. And then there are all the other logics– lucky encounters with the right mix of fellow students or with the right topics and books, no matter who’s teaching them.

But when we’ve said all this, there are still plenty of classroom forces at play whose logics elude us– and some that we don’t even see. Which is just to say, the classroom is a site of human interactions, much like other human interactions, only more so than most: more compact and intense, with more at stake, with more layers, more moving parts.

You don’t have to be inside the university to know this human-interaction model of teaching faces challenges nowadays– from online education, from students who feel they’ve already got too many human interactions in their lives, from shrinking university budgets, from measurement-besotted administrators.

All I can say is, check out those learning outcomes!

Do humanities professors dream of electric sheep?

Over the weekend, our graduate students put on a fantastic one-day conference, and it included a faculty roundtable discussing the digital humanities. The line-up included one super-enthusiast, two moderates, and me as the designated Mr. Negative– which in itself tells you where the window of debate is now located. I mean, I blog, I occasionally tweet, I push my students to consult Wikipedia for the background facts on what we’re studying. Take away digital photography, and I wouldn’t last a week in the archives; take away my morning dose of internet news, and I’m a wreck. The digital revolution has gone awfully far if someone like me gets cast as the voice of caution and doubt.

In the Teaching section here and on my Academia.edu site, I’ll post a cleaned-up version of my formal comments. Here I’ll offer a short version of those, mixed with some thoughts that came to me during the (outstanding) discussion that followed the panel’s presentations.

I won’t go on much about my own super-enthusiast side, except to say it’s real. As it happens, my particular weakness as a scholar coincides with some dramatic strengths of the new digital resources. I’ve always had trouble getting dates and details exactly right, and the old printed reference bibliographies have always just left me depressed and listless– anyway the specialized resources I usually need aren’t even available in the universities where I’ve taught. Think of it as my kinky version (not my only version, I hasten to add…) of a thrill we’re all experiencing these days: suddenly I’ve got a cheap, easy electronic solution to a dark, secret, personal weakness.

But the storm warnings also seem to impress me more than most of my colleagues. For the PG-13, super-scary version, check out the philosopher Tim Mulgan’s Ethics for a Broken World: Imagining Philosophy After Catastrophe. Among many other issues, Mulgan thinks seriously about the reality we all know lurks behind the digital wonderland– namely, it could go poof at any moment, because of a war, a breakdown of the electrical system, evil-super-hackers, an NSA Stuxnet-type operation gone wrong, or dozens of other altogether-possible scenarios.

So Mulgan imagines his post-catastrophe philosophers having to make do with what he calls the Princeton Codex– scrambled bits and pieces of Princeton University’s paper library that survived climate change and its attendant disasters, in roughly the same messed-up way as ancient European literature survived the Dark Ages.

Except for one big difference. Everything from the ancient world at least had a fighting chance of making it through the bad times, and a lot was waiting there for people like Thomas Aquinas and Copernicus to sort through and build on when the dust settled. In 2015 we’re probably already beyond that point. A steadily greater percentage of our knowledge is now preserved only up there in the cloud, and pretty soon it will be most of our knowledge; if it goes, it’s gone for good.

So that’s the Total-Catastrophe worry, but there’s also the Right-Here-Right-Now worry: digital knowledge reshuffles the sociology of knowledge, in some ways for the better, in others for the worse. At this point we don’t know how much worse, but maybe quite a bit.

On the plus side, the digital world gives new reality to old ideals of equality and fraternity. Like everyone else, I now connect directly and easily with scholars all over the world, people I would never have encountered in the old days. And I get to publish my thoughts in places like this without awaiting the approval of editors or reviewers. Sure, the hierarchies and barriers still exist, but they’re way weaker than they used to be.

But as Alexis de Tocqueville explained long ago, the third element in the great French trinity doesn’t necessarily play well with the other two– and Tocqueville would have loved thinking about liberty’s tormented place in the new digital regime. “Tormented,” because our online doings are watched 24/7, by governments, insurance companies, angry teens, employers, and all sorts of others. Real havoc regularly ensues–health coverage rejected, jobs lost, visas denied, legal trouble, personal humiliations.

In the nature of things, life in this new panopticon entails controlling what we say and do, and even what we learn– multiple authorities now monitor our visits to informational websites. It’s the most effective kind of censorship, the kind where we do the real work ourselves, each of us monitoring our own utterances.

That seems to be part of a larger problem, which we’ve barely started wrestling with: digital culture binds us extra-intensely to our late-capitalist social order, not only because the individual bonds are so strong, but also because there are so many of them. Of course we rely on the corporations that supply our computers, browsers, storage, electricity, etc etc etc. But we also find ourselves slotted into mini-capitalist-entrepreneur roles– each of us bloggers now worries about generating traffic, attracting readers, speaking to our audience; nowadays we’re all minor-league versions of the hustlers who produce the Big Bang Theory.

Higher up the food chain, the resemblance gets even creepier. Here’s the former director of a major digital humanities project, a well-established project at a great public university, speaking some years ago about his job: “A main part of Thomas’s role as Director is to write grants, as well as to seek out appropriate public and private agencies, whose interests match the VCDH’s projects. He compares it to finding funds for a venture capital firm.”

So in this world of surveillance, audience-seeking, entrepreneurship, and venture capital, what happens to the humanists’ trouble-making functions, our capacity to raise harsh questions and social criticism?

My own answer is, so far, so good. Anyone who reads these posts will understand how liberating I’ve found the new media. But the storm clouds are there, and they may get very dark, very fast.


Historians and irony, Part II

My last post talked about historians’ irony, which I presented as a way of approaching the past, a tendency not a specific interpretation. Irony-friendly historians tend to see people as having a limited handle on their circumstances, and even on their own intentions. Not knowing the world or ourselves very well, on this view, we humans regularly blunder into tragedy, generating processes we can’t control and outcomes we didn’t want. We don’t know what the fuck we’re doing.

I also suggested that irony of that kind is out of fashion nowadays. Not among all historians, and not 100 percent among any historians– as I said last time, we can never give it up altogether, because we know more than the people we study about how their stories turn out. But historians and irony are mostly on the outs right now, and that counts as something important about our era of historical writing. Open a recent history book, and you’re likely to encounter words like “contingency” and “agency.” Even late in the day, these words tell us, things could have gone differently, and individual decisions made a real difference. These words also tell us not to condescend to people in the past– not to view them as the helpless puppets of bigger forces, not to dismiss their efforts, hopes, and ideas, good and bad alike.

Things were REALLY different back in the mid-twentieth century, and they were still mostly different in the mid-seventies, when I got my PhD. In those days, the talk was all about long-term processes, societal changes, and the blindness of historical actors, and you found that talk pretty much everywhere in the profession, among Marxists and Freudians on the political left, modernization theorists and demographers in the middle, political historians on the right. These scholars mostly hated each other, but they agreed on a basic interpretive stance: big forces trumped individual wills.

So what happened? How did the history biz go from mainly-ironic to mainly-non-ironic? The question matters, because it touches on the ideological functions of history knowledge in our times. Mainly-ironic and mainly-non-ironic histories provide different lessons about how the world works.

Of course, some of the change just reflects our improving knowledge of the past. We talk more nowadays about contingency because we know so much more about the details of political change. We talk more about the agency of the downtrodden because we’ve studied them so much more closely– now we know that serfs, slaves, women, and other oppressed groups had their own weapons of small-scale resistance, even amidst terrible oppression. They couldn’t overturn the systems that enclosed them, but they could use what powers they had to carve out zones of relative freedom, in which they could live on their own terms.

And then, there’s what you might call the generational dialectic. Like most other intellectuals, we historians tend to fight with our intellectual parents– so if the mid-twentieth-century historians were all into big impersonal forces and long-term processes, it’s not surprising their successors looked to poke holes in their arguments, by pointing out all the contingencies and agency that the previous generation had missed. That’s one of the big ways our kind of knowledge advances, through criticism and debate. (For a discussion of this process as it works in a neighboring discipline, see here.)

So there are plenty of reasons internal to the history profession that help account for irony’s ebb– and that’s without even mentioning the decay of Marxism, Freudianism, and all those other -isms that tried to explain individual behavior in terms of vast impersonal forces. Almost nobody finds those explanatory theories as persuasive as people once did, in the history department or anywhere else.

But having said all that, we’re left with an uncomfortable chronological juxtaposition: the historians’ turn to mainly-non-irony coincided with the circa-1980 neo-liberal turn in society at large, the cultural revolution symbolized by Margaret Thatcher in Britain and Ronald Reagan in the US. There’s a substantive juxtaposition as well: while we historians have been rediscovering agency among the downtrodden and freedom of maneuver among political actors, neo-liberal ideology has stressed individuals’ creativity and resourcefulness, their capacity to achieve happiness despite the structures that seem to imprison them. Unleashing market forces, getting people off welfare, reducing individuals’ reliance on public resources– these all start from the presumption that people have agency. They know what they’re doing, and they should be allowed to do it.

In other words, Edward Thompson’s warnings against “the enormous condescension of posterity” weirdly foreshadow various neo-con one-liners about how social programs and collective goods condescend to the disadvantaged. (For an example, check out George Will and George W. Bush talking about cultural “condescension.”)

Which of course is a pretty ironic thought, given that Thompson was a committed political activist and brilliant Marxist theorist. But if it could happen in the 1950s, it can happen now: intellectuals who hate each other and disagree on many specifics can nonetheless be teaching the same basic ideological lessons.

To me this suggests it may be time to rethink concepts like contingency and agency, or at least re-regulate our dosages. Maybe our alertness to agency has diminished our sensitivity to tragedy, to the ways in which circumstances really can entrap and grind down both individuals and whole communities. Maybe we need to think more about the long chains connecting specific political actions and constricting everyone’s freedom.

Maybe we historians need to stop being so damned optimistic!


Historians and irony, Part I

We historians have a long, intense, up-and-down relationship with irony, the kind that merits an “it’s complicated” tag. We argue with irony, shout, try going our own separate way– but the final break never comes, and eventually we and irony always wind up back in bed together. Like all stormy relationships, it’s worth some serious thought.  (Note for extra reading:  like pretty much any other historian who discusses irony, I’ve been hugely influenced by the great historian/critic Hayden White— when you have the time, check out his writing.)

Now, historians’ irony doesn’t quite track our standard contemporary uses of the word. It’s not about clichéd hipsters saying things they don’t really mean, or about unexpected juxtapositions, like running into your ex at an awkward moment.

No, we historians go for the heavy-hitting version, as developed by the Ancient Greeks and exemplified by their ironist-in-chief, Oedipus. In the Greek play, you’ll remember, he’s a respected authority figure hot on the trail of a vicious killer– only to discover that he himself did the terrible deed, plus some other terrible deeds nobody even imagined. Like most of the Greek tragic stars, he thinks he’s in charge but really he’s clueless.

You can see how that kind of irony appeals to historians. After all, we spend a lot of our time studying people who misjudged their command of events– and anyway, we know the long-term story, how events played out after the instigators died. Most of the leaders who got Europe into World War I thought it would last a few weeks and benefit their countries. By 1918 four of the big player-states had been obliterated, and the ricochet damage was only beginning– Stalin, Hitler, the Great Depression, the atomic bomb, and a whole trail of other bad news can all be traced back to 1914.

That’s why our relationship to irony never makes it all the way to the divorce court. It’s basic to what we do.

But there are other sides to the relationship, and that’s where the shouting starts. We historians don’t just confront people’s ignorance of long-term consequences. There’s also the possibility they don’t understand what they’re doing while they’re doing it. That possibility takes lots of forms, and we encounter them in daily life as well as in the history books. There’s the psychological version, as when we explain tough-guy behavior (whether by a seventeenth-century king or twenty-first-century racists) in terms of childhood trauma or crises of masculinity. There’s the financial self-interest version, as when we believe political leaders subconsciously tailor their policies to their career needs.

And then there are the vast impersonal forces versions, what we might call ultra-irony, where historians see individuals as powerless against big processes of social change. That’s how the Russian novelist Leo Tolstoy described the Napoleonic wars, and how the French philosopher Alexis de Tocqueville described the advance of democracy– efforts to stop it just helped speed it up. Marxist and semi-Marxist historians have seen something similar in the great western revolutions. Those fighting tyrannical kings in 1640, 1776, and 1789 didn’t think they were helping establish global capitalism– many hated the whole idea of capitalism– but their policies had that effect all the same.

You can see why historians have such a fraught, high-voltage relationship with ultra-irony interpretations like these. On the one hand, sure– we all know that many social forces are bigger than we are; we laugh at those who try to stop new technologies or restore Victorian sex habits; we know we’re born into socio-cultural systems and can’t just opt out of them.

On the other hand, historical practice rests on evidence, documentation– and where do we find some president or union leader telling us he did it all because his childhood sucked? How do we document vast impersonal forces? Ironic interpretations require pushy readings of the documents– speculation, going beyond what the evidence tells us, inserting our own interpretive frameworks. Nothing makes us historians more jumpy.

There’s a deeper problem as well: interpretations like these diminish human dignity, by telling us that people in the past didn’t know what they were doing or even what they wanted to do. If we accept these interpretations, we deny agency to historical actors, belittle their ideas, dreams, and efforts, mock their honesty and intelligence. We dehumanize history– the human actors are the pawns, the vast impersonal forces run the game.

Those are serious criticisms, and they’ve been around since the nineteenth century.

But the interesting thing is, their persuasive force rises and falls over time. You’ll have a whole generation of historians who find ultra-irony persuasive and helpful; it feels right, and it seems to open up exciting new research questions. Then the tide shifts, and historians become more concerned with agency. They listen closely to historical actors’ own views of who they were and what they were doing.

By and large, the mid-twentieth century fell into Phase 1 of this cycle– it was a time when historians saw irony everywhere and paid lots of attention to big impersonal forces. Marxism was riding high, but so also were the other -isms: Freud-influenced historians saw unconscious drives pushing people to act as they did; Weberians saw the experience of modernization behind political and religious movements. “Underlying causes” were big, and we viewed participants’ own accounts with suspicion– we assumed they didn’t understand their own motives or circumstances.

But that changed in the 1970s, and for the past thirty years we’ve been deep in Phase 2, the no-irony phase. We’re concerned with taking historical actors seriously and with avoiding what a great Marxist historian called “the enormous condescension of posterity.” We believe in “agency”– meaning, from the top to the bottom of the social scale, people can help shape their own destinies.

What does it all mean? I have a few thoughts, but I’ll wait until the next post to lay them out– stay tuned!

Junior oligarchs in America: the Samantha Power case

My last post talked about some ways that power seeps into American intellectual life. It’s a complicated process, with lots of moving parts. But (I argued) it’s unwise to ignore any of those parts, even the ones that look like decorative frills. Ballet companies and English departments are remote from the world of policy think tanks and political candidacies– but funding the ones helps the others function effectively, by creating new spheres of influence and new sympathies in once-critical audiences.

Now I want to consider another example, which has more to say about the receivers of culture patronage. It’s the case of Samantha Power, currently US ambassador to the United Nations and the subject of a lengthy, fascinating New Yorker profile. It’s a case worth thinking about because Power so perfectly embodies the top echelons of American intellectual life. She has two Ivy League degrees, from Yale and Harvard, and she’s a professor at Harvard’s Kennedy School of Government; she writes a lot, three books over the last decade, one a Pulitzer Prize winner, plus a stream of articles and occasional pieces; her husband Cass Sunstein belongs to the same world– he’s an ultra-prominent Harvard Law School professor.

So Power gives us one glimpse into American intellectual life at the center, where it has the greatest potential leverage. As you’ve probably noticed, it’s twenty-seven years and counting since we had a president without at least one degree from either Harvard or Yale; like Samantha Power, George W. Bush has two.

Of course the New Yorker article has lots to say about Power’s intellectual gifts and her capacity for nonstop hard work, but it also says lots about her personal charm; she’s tall, athletic, dresses well, and apparently enjoys the galvanic effect she has on many people she meets. Sure, all these details are there partly because of journalistic sexism (it’s hard to imagine Cass Sunstein getting quite the same appearance report card), but I don’t think that’s the whole story — charm takes multiple forms, and it seems to be a real component of intellectual life at the level Power inhabits. After all, most private universities in the US select for personal qualities as well as for grades and test scores; and as the most selective of the bunch, the Ivies can put extra weight on that side of the application process.

Certainly Power’s charm isn’t just a matter of good looks and athleticism. It also includes a warm, healthy, happy home life. The profile describes at length her attachments to her parents and her story-book wedding; and it describes her husband and kids watching from the gallery as she undergoes her pre-confirmation Senate grilling. Her family’s warm support is a part of what she’s giving us.

Reading all this reminded me of two brief observations about this top-echelon world that I recently encountered– both of them from outsiders, both off-hand remarks, yet both sharp enough that they’ve stayed with me.   One’s from an anonymous blog commenter, apparently a non-Harvard philosophy graduate student or youngish professor (all he tells us is his gender), describing a chance encounter with some apparently Harvard-connected young scholars– whom he sums up as “these happy, wholesome, self-confident, new-vanguard, shiny people from great schools going great places together.”

The other observation comes from the science journalist Daniel Bergner, toward the end of his wonderful book about current research on women’s sexuality.  The research he describes includes some pretty out-there experiments, testing physiological responses to porn images, for instance, and it generates some out-there conclusions, for instance, that women by nature are just as non-monogamous, sexual, and potentially crazy as men. Toward the end of his inquiries, Bergner realizes that the scientists he’s been tracking tend to work at non-top-echelon institutions; and finally he asks one of them “why I never found myself phoning the psychology departments of Harvard or Yale or Princeton, why I never spent time with their professors, why so few of America’s most elite universities devoted any attention to her field.”  The researcher replies that the Ivies just don’t do this kind of thing. Exploring sex in these ways is too weird, too potentially upsetting; it may contribute to happier homes, but it’s just as likely to blow up the whole concept.

All this to say, it’s not just Samantha Power. A certain vision of American-style middle-of-the-road happiness, health, and flourishing seems built into the intellectual world around her. A starker version of that commitment also comes out at the confirmation hearings, when senators ask about some of her early writings, in which she’d occasionally dissed the US. Of course she vigorously backtracked, calling America “the greatest country on earth” and “the most powerful country in the history of the world. Also, the most inspirational;” and she emphasizes that she “would never apologize for America.” The senators are impressed, and Power is confirmed.

Like any political encounter, Power’s confirmation hearing lends itself to multiple interpretations. We can read it as an ambitious political actor doing what’s necessary to reach a position of authority, where she can do some good.  Or we can read it as a textbook demonstration of how the real boundaries on American political discourse get set– namely, by those holding the real political cards, not by Harvard professors.

Or we can conclude that Power’s walk-back expresses beliefs she actually holds. Probably back home in Cambridge she wouldn’t use quite this Red State rhetoric. But the policies she’s advocated throughout her career– first as an intellectual, now as a participant– presuppose more or less this stance: from the outset, she’s endorsed American military intervention in troubled societies, on the assumption that it’s an inspirational force for good. That faith apparently remains unshaken by the disastrous results of our interventions in Afghanistan, Iraq, and Libya– the last a project that Power herself helped design and continues to endorse.

My point is, we don’t need to look for a George Soros or a Koch brother pulling the strings in cases like this. Samantha Power shows us the deeper micro-processes in American intellectual life, processes that attach brilliant young people– “happy, wholesome, self-confident, new-vanguard, shiny people”–to the twenty-first-century American project.

Patrons of the arts

“When bankers get together for dinner, they discuss Art. When artists get together for dinner, they discuss Money.” That’s the Irish playwright Oscar Wilde, speaking to us from around 1900. That was during the world’s previous great Gilded Age, and now that we’re deep into a new one, we humanists need to pay attention, even here in non-artistic, non-imagination-centric corners like the History Department. That’s because Wilde raises one of the basic questions we should be asking about ourselves: what’s our relationship to money and the powerful people who have it?

In wisecrack format, Wilde sums up one of the classic answers. Rich people love the arts, and artists (or historians, or philosophers– you get the idea) need money. Usually we can’t get that money selling our wares to ordinary people, since they have other needs to cover first, like housing, clothing, and food. Like it or not, we’re in a luxury business, selling expensive, delightful add-ons that make life better but aren’t needed to keep it going. We have to sell to the same folks who buy the other luxury products.

Of course the selling is more direct in the art world. Rich patrons interact directly with artists, and sometimes they tell the artist what to produce– a portrait of the kids, a design for a new home, a new opera. In academia, there are intermediaries. Donors give their money to institutions, which then dole it out to individual professors and researchers according to the institutions’ own guidelines and standards. But it’s basically the same process, rich people paying for cultural production.

Usually that doesn’t mean bad art or ideas, au contraire. Many of the rich have had good educations, and anyway they don’t have to care what other people think– they can make the adventurous calls, not just the safe ones. The Rockefellers created New York’s Museum of Modern Art back when modern art seemed crazy and dangerous, and that openness to the new has been a standard pattern since the Renaissance. Check out the seventeenth-century painter Caravaggio for an extreme example. He was gay, violent, and young, and he painted sacred scenes in wild new ways– but he received huge support from all sorts of Catholic big-shots.

But it seems that push always eventually comes to shove, and then the dark sides of artistic/intellectual patronage come into view. I’ve written here already about the case of Steven Salaita, whose appointment the University of Illinois overturned after wealthy donors complained about some of his tweets. And you’ve probably heard about the billionaire Koch brothers, sophisticated and generous patrons of the New York City Ballet and other cultural institutions, who’ve also donated tens of millions to various universities– but with strings attached: in return for the money, at least in one case, they’ve demanded that the university teach ideas congenial to them, and they may have demanded a say in the faculty appointment process.

In other words, the rich aren’t just buying aesthetic pleasure– they’re also investing their money, and like all investors they expect a return.

Now there’s a new example to ponder, more disturbing in that it concerns an especially sympathetic figure– the hedge-fund manager George Soros. Unlike most of the other modern billionaires, Soros has genuine intellectual credibility– before he got rich he did serious graduate work in philosophy, a hard-core humanities discipline, and he’s used his money to support various admirable causes. He’s even helped create a whole new university in his native Hungary, devoted primarily to the humanities and social sciences.

So to a humanities professor like me, Soros is a good guy, exemplifying the best sides of cultural patronage.

But now Soros has joined the war-pushing business that’s so popular these days, calling for tougher European action in the Ukraine: “Europe is facing a challenge from Russia to its very existence,” he tells us; “the argument that has prevailed in both Europe and the United States is that Putin is no Hitler,” but “these are false hopes derived from a false argument with no factual evidence to support it;” all European resources “ought to be put to work in the war effort,” because “in the absence of unified resistance it is unrealistic to expect that Putin will stop pushing beyond Ukraine when the division of Europe and its domination by Russia is in sight.”

Whatever you may think about the Ukraine situation, there’s a lot here to weird you out. There’s the casual talk of going to war, as if launching a serious European war wouldn’t be one of the all-time human disasters. There’s the full-court demonization of our enemies, as monsters with whom it would be folly–“unrealistic”– to negotiate. We’ve had fifteen years of this kind of rhetoric– has it produced anything but disasters?

And then there’s the strange venue that Soros selected for his call to arms: the New York Review of Books. Most of those who encounter this site will know all about the New York Review, but in case you don’t, it’s the publication that pretty much encapsulates humanities department thinking in the US. Every two weeks, it offers extended reviews of academic books, along with one or two pieces of sophisticated political commentary; professors write most of these, but they write with educated-outsider readers in mind. So people like me read it to learn about the new trends in English or Art History, or about debates on the origins of the American Revolution– it’s a way to get up to at least amateur speed on interesting topics, without doing the heavy reading yourself. It’s a virtual coffee house for academics, where we all meet up.

Which raises the question, why is a call for European leaders to get tough appearing there, rather than in the Frankfurter Allgemeine Zeitung, Le Monde, or the New York Times?

Now, I have no clue what Soros has in mind with the substance of his warfare talk. Maybe he actually believes the comic book, super-villain-on-the-loose worldview he’s pushing, or maybe he has some money-making irons in the Ukraine fire (iron Maidans, as it were…), or maybe some mix of the two– who knows?

But we can do better guessing about that last question, the why-the-New York Review question. Because whatever else is going on, Soros is broadcasting to an audience made up mostly of us humanities professors and various humanities-adjacent types; he apparently wants us along on his foreign policy crusade. It’s the classic good news/bad news story. The good news: our collective opinion seems to matter in legitimating an enterprise of this kind, perhaps more than most of us realize. We’re worth courting. The bad news: when it comes to cultural patronage, the good guys like Soros give as much thought as anyone else to the returns their investments will bring.