
Sex, money, and the refusal of historical knowledge

My last post explored a baffling feature of our twenty-first-century world. Here we are deep into an era of hyper-free-market-ism, with all sorts of unexpected items coming up for sale; we’ve already brought capitalist entrepreneurship to our battlefields, and thoughtful writers are pushing for an open market in body parts like kidneys. And then, we’re also in a mostly post-puritan era, having mainstreamed various sex practices that horrified our parents, plus some they never even imagined.

So you’d think a free-market, anything-goes society like ours would look tolerantly on the sale of sex– yet in the zone where the market meets the sexualized body, it’s actually an age of crack-downs and high-intensity moral crusading. Countries that used to permit prostitution have reversed course and are working to stomp it out. Big philanthropic organizations whip up fears of sex trafficking, and their billboards pop up in towns across North America. Articles denounce old movies like “Pretty Woman” for glamorizing prostitution— in fact Google auto-completes the phrase before you’ve finished typing, and it generates 8,600 results.

Apparently the world has changed pretty significantly since 1990, when “Pretty Woman” was a box-office smash.

A situation like this (you might think) pretty much screams for the application of historical knowledge. After all, since the 1970s historians have produced a ton of terrific research on sex work itself and on all sorts of adjacent topics. We historians also think a lot about moments when values and practices undergo rapid change, as they seem to be doing these days.

But you wouldn’t know any of this from browsing the contemporary news about sex work; in fact you’d never guess historians ever got within ten-foot-pole reach of such topics. Most journalists don’t bother adding picturesque historical examples to their stories about the contemporary scene, and they apparently NEVER ask historians for any larger perspectives.

In other words, it’s another case that shows just how determined our society is not to learn from history. And by “our society,” I mostly don’t mean the evangelical yahoo sector, for whom history’s just a distraction; what counts for them are God’s eternal injunctions and prohibitions. No, the significant refusals come from the liberal-minded, humanistically-educated sorts who shape our policy discourses. Many of these folks must have studied some history in college– surely some were even history majors?– but you wouldn’t know it from the way they talk.

As an extreme example, consider the Harvard-educated New York Times columnist Nicholas Kristof, who’s been a key player in the recent anti-sex-trafficking movement. Kristof likes on-the-scene journalistic interventions, and he’s visited about 150 countries (he pretty much exemplifies the modern idea that keeping busy trumps thinking and reading about things). In the course of these jaunts, he’s rescued young prostitutes in Cambodia and India, by paying off brothel keepers, and he’s accompanied police raids on brothels in Thailand. The results have appeared not only in his Times columns and blog, but also in a documentary movie, and they’ve made him an international celebrity. Another prominent journalist describes him as “the Indiana Jones of our generation of journalists.”

Of course these exploits have also included occasional pratfalls. Kristof was an enthusiastic booster for a “former prostitute” whose story proved fake, and he’s been shown to have used poor judgment in some other cases. Perhaps as a result, his Wikipedia page currently (September 27, 2015) makes no reference to his sex-trafficking stories.

But maybe pratfalls and mistakes are inevitable in investigations like these. What’s much weirder is the historical shadow that follows Kristof’s efforts. Because at least in his prostitution articles, Kristof reenacts almost flawlessly a famous nineteenth-century journalistic scoop. The historian Judith Walkowitz shows us the nineteenth-century original in her wonderful book City of Dreadful Delight, which examines sex and gender practices in late Victorian London. She devotes a couple of chapters to the reporter W. T. Stead, who braved the dangers of London slums, uncovered their vast networks of sex trafficking and child prostitution, and triumphantly rescued one girl for £5.

At least that was his story, as he published it in the newspaper he edited. In fact (as Walkowitz’s patient research shows), just about every element in the story had been massaged, trimmed, and rebuilt so as to fit into prefabricated literary boxes. Many of those boxes came from nineteenth-century melodrama, whose stock characters reappear in Stead’s account: there’s an innocent girl under threat, an impoverished family that’s too weak to defend her, and depraved, scheming rich men. And then there’s the journalist himself, as lone hero outsider fighting an evil system– partly by saving a specific girl, partly by shining journalistic light on the world’s dark corners.

Nowadays journalists travel farther to reach the world’s dark corners, but otherwise how different are Kristof’s stories? The brightly-colored characters, the plot line, and the dénouement are the same; just like Stead, Kristof even specifies the dollar amounts he shelled out in his rescue operations– it’s the kind of detail that added zing to a story in 1885 and still does today.

You might say, so what? Kristof’s not a scholar, and he’s not obliged to footnote what previous authorities said about his subject. Isn’t he performing a valuable service by drawing attention to evils and suffering? Why make a big deal about his recycled narratives?

The answer is, because it’s just stupid to base our understanding of the world and the people in it on the simplest categories of nineteenth-century fiction. Sex work may or may not be a bad thing, which may or may not deserve repression.  But it’s something real people undertake, responding to their real circumstances and actively choosing among their real, not-so-great options. Recycling nineteenth-century narratives in the twenty-first century guarantees we won’t even see those realities, let alone understand them or respond sensibly.

Historians like Walkowitz can help us see these realities in our own world– but only if the power players start listening.

The drones club

Everybody’s heard of drones, right? They’re the latest Big Thing in western war making– pilotless-but-armed aircraft that circle for hours without refueling, allowing armies to gather information about once-inaccessible territories and attack enemies without warning. The machines are human-controlled in distant command centers, using advanced information and communications technology, but apparently they’re becoming more autonomous, with new AI capabilities. So far only the US has actually killed anyone with them. But other countries are getting into the game, and everyone seems to agree that drones are the new face of battle.

All of which raises some questions.

For now I’ll pass quickly over the most basic– namely, are the people pursuing these projects completely insane? I mean, they never heard of Terminator, Skynet, and all the other dystopian killer-robot scenarios? Who doesn’t know these stories end in tears?

I won’t say more about that side of things here, not because it’s unimportant but because it needs zero thought– any kid with video-streaming knows the score, even if the well-educated Serious People running our public institutions don’t.

But some historically-informed reflection may help in a different way, by sharpening our understanding of the pre-apocalypse arc of the drone story– meaning, how things are likely to play out before we reach a full-frontal Terminator fiasco. That’s because historians have studied the life-cycles of other super-weapons, and we can say something about where this particular instance is heading.

Historical thinking is especially worthwhile here because even the anti-drone camp seems to buy into a basic idea about them– namely, that they fundamentally change the nature of war itself. Historians have encountered that belief in numerous contexts, and up to now it’s been wrong every single time.

As an example, here’s the always-admirable-and-usually-right Ted Rall, arguing that drones put an end to war’s character as a duel between adversaries. That’s what the great war theorist Carl von Clausewitz thought war was, but now (so say Rall and others) it’s closer to a manhunt, a one-sided encounter between predator and prey, because those who have the drones operate in such complete safety, hundreds or thousands of miles away from the killing scene. “The armed drone … unambiguously allows the state to kill anyone and everyone with impunity, without the slightest physical risk whatsoever.”

Rall himself is against this mode of killing, but among the Serious People his anti-drone objections transmogrify into pro-drone justifications. You’ve almost certainly heard some of them: drones save American lives, allow the precise elimination of bad guys, and actually reduce the bloodshed and mayhem of war. Do you want another World War I, with millions of boots on the ground, or robot surgical strikes that allow most people to go about their lives and don’t wreck whole societies?  Far from being embarrassed about it, the US actually advertises the manhunt image of modern warfare– our top killer-drone is called the Predator.

Which is why historians have to step up and point out that no, actually war doesn’t change its essential nature, and Clausewitz still applies here in the twenty-first century. We’ve had a long succession of super-weapons– the machine guns, tanks, airplanes, and submarines of World War I, the radios and strategic bombing of World War II, the napalm and B-52s of Vietnam, to cite just some twentieth-century examples. Each time there’s talk of military revolution, new rules of the game, and the new technologies briefly tip the military balance to the side that first invented them. Then the other side adapts one way or another, and the essential nature of military conflict reasserts itself. It’s still a contest of wills and intelligence, and it still centers on hurting the other side enough that they yield.

I mean, in case you missed it: in the last fifty years, the low-tech North Vietnamese, Afghans, and Iraqi insurgents all defeated the ultra-high-tech Americans. They won mostly because the wars in question mattered way more to them than to us.

Now, in most of the big twentieth-century wars, military adaptation meant imitation– the other side started making its own tanks/submarines/atomic bombs. But the Vietnam-Afghanistan-Iraq examples show the deeper reality beneath these technological arms races. In a real war, each side does what it thinks it has to do to win, and that doesn’t necessarily mean keeping up with technology fads. It might mean abandoning a great city to the enemy, as the Russians did in 1812, leaving Napoleon to freeze and eventually get the hell out; it might mean suicidal missions like Vietnam’s 1968 Tet Offensive, designed to demoralize the American public that was paying the bills; it can mean roadside bombings and terrorism, as in the Algerian and Iraq Wars, or just hunkering down, as both sides did against the strategic bombing of World War II– it killed hundreds of thousands on both sides, but apparently did nothing to shorten the war.

We don’t yet know the specifics of how the drone super-weapon story will unfold. But we’ve got a long historical record telling us there’s no war in which one side gets total impunity, none that isn’t partly a test of wills– and none in which both sides don’t end up hurting.  Historically, the side that wins is the side that’s willing to take more of that hurt, not the one with the most toys.

Maybe we’ll avoid the killer robot apocalypse, but we don’t get a free pass on the nature of war.


Historians and irony, Part II

My last post talked about historians’ irony, which I presented as a way of approaching the past, a tendency not a specific interpretation. Irony-friendly historians tend to see people as having a limited handle on their circumstances, and even on their own intentions. Not knowing the world or ourselves very well, on this view, we humans regularly blunder into tragedy, generating processes we can’t control and outcomes we didn’t want. We don’t know what the fuck we’re doing.

I also suggested that irony of that kind is out of fashion nowadays. Not among all historians, and not 100 percent among any historians– as I said last time, we can never give it up altogether, because we know more than the people we study about how their stories turn out. But historians and irony are mostly on the outs right now, and that counts as something important about our era of historical writing. Open a recent history book, and you’re likely to encounter words like “contingency” and “agency.” Even late in the day, these words tell us, things could have gone differently, and individual decisions made a real difference. These words also tell us not to condescend to people in the past– not to view them as the helpless puppets of bigger forces, not to dismiss their efforts, hopes, and ideas, good and bad alike.

Things were REALLY different back in the mid-twentieth century, and they were still mostly different in the mid-seventies, when I got my PhD. In those days, the talk was all about long-term processes, societal changes, and the blindness of historical actors, and you found that talk pretty much everywhere in the profession, among Marxists and Freudians on the political left, modernization theorists and demographers in the middle, political historians on the right. These scholars mostly hated each other, but they agreed on a basic interpretive stance: big forces trumped individual wills.

So what happened? How did the history biz go from mainly-ironic to mainly-non-ironic? The question matters, because it touches on the ideological functions of history knowledge in our times. Mainly-ironic and mainly-non-ironic histories provide different lessons about how the world works.

Of course, some of the change just reflects our improving knowledge of the past. We talk more nowadays about contingency because we know so much more about the details of political change. We talk more about the agency of the downtrodden because we’ve studied them so much more closely– now we know that serfs, slaves, women, and other oppressed groups had their own weapons of small-scale resistance, even amidst terrible oppression. They couldn’t overturn the systems that enclosed them, but they could use what powers they had to carve out zones of relative freedom, in which they could live on their own terms.

And then, there’s what you might call the generational dialectic. Like most other intellectuals, we historians tend to fight with our intellectual parents– so if the mid-twentieth-century historians were all into big impersonal forces and longterm processes, it’s not surprising their successors looked to poke holes in their arguments, by pointing out all the contingencies and agency that the previous generation had missed. That’s one of the big ways our kind of knowledge advances, through criticism and debate. (For a discussion of this process as it works in a neighboring discipline, see here.)

So there are plenty of reasons internal to the history profession that help account for irony’s ebb– and that’s without even mentioning the decay of Marxism, Freudianism, and all those other -isms that tried to explain individual behavior in terms of vast impersonal forces. Almost nobody finds those explanatory theories as persuasive as we once did, in the history department or anywhere else.

But having said all that, we’re left with an uncomfortable chronological juxtaposition: the historians’ turn to mainly-non-irony coincided with the circa-1980 neo-liberal turn in society at large, the cultural revolution symbolized by Margaret Thatcher in Britain and Ronald Reagan in the US. There’s a substantive juxtaposition as well: while we historians have been rediscovering agency among the downtrodden and freedom of maneuver among political actors, neo-liberal ideology has stressed individuals’ creativity and resourcefulness, their capacity to achieve happiness despite the structures that seem to imprison them. Unleashing market forces, getting people off welfare, reducing individuals’ reliance on public resources– these all start from the presumption that people have agency. They know what they’re doing, and they should be allowed to do it.

In other words, Edward Thompson’s warnings against “the enormous condescension of posterity” weirdly foreshadow various neo-con one-liners about how social programs and collective goods condescend to the disadvantaged. (For an example, check out George Will and George W. Bush talking about cultural “condescension.”)

Which of course is a pretty ironic thought, given that Thompson was a committed political activist and brilliant Marxist theorist. But if it could happen in the 1950s, it can happen now: intellectuals who hate each other and disagree on many specifics can nonetheless be teaching the same basic ideological lessons.

To me this suggests it may be time to rethink concepts like contingency and agency, or at least re-regulate our dosages. Maybe our alertness to agency has diminished our sensitivity to tragedy, to the ways in which circumstances really can entrap and grind down both individuals and whole communities. Maybe we need to think more about the long chains connecting specific political actions and constricting everyone’s freedom.

Maybe we historians need to stop being so damned optimistic!


Why study history, Monday update

Ta-Nehisi Coates has a terrific essay in The Atlantic about last week’s great National Prayer Breakfast Controversy.

Apparently Barack Obama had been asked to address this annual confab of Christian power-players, and in his remarks he suggested that today’s Islamic fundamentalists aren’t uniquely barbarous, crazy, or evil; in the course of history, Christians– even American Christians!!– had also occasionally killed, kidnapped, looted, and enslaved in the name of their God. Naturally, outrage ensued, with the Best In Class award going to a Virginia politician: Obama’s remarks, he explained afterward, were “the most offensive I’ve ever heard a president make.”

Now, I’m assuming readers here don’t need me to underline the ridiculousness of such talk. (On the other hand, the normally sensible Christian Science Monitor presents this as a debate with two more or less legitimate sides, so maybe the underlining is needed; if so, take it as given.)

But the great NPB Controversy does offer a worthwhile reminder of something else, which has been a big theme on this blog: our society needs historical study, and it needs the specific kinds of historical study that historians undertake. Meaning, it’s not enough just to look to the past for data about economic trends, or battlefield success, or the values that led some societies to develop stable democratic governments.

Economists, generals, and political scientists do all those things, sometimes usefully– but we also need more. We need the whole past, complete with its losers and victims, its crimes and craziness, its miseries.  We especially need to know about our own crimes, follies, and victims, as well as those of other people. That’s the kind of thing that historians dig up.

We need that kind of knowledge for the most obvious ethical and practical reasons. Even we non-Christians know it’s just wrong to view ourselves as fundamentally superior to other peoples, immune to their criminalities and fanaticisms. It also doesn’t work, as we Americans ought to have learned from our last fifteen years of foreign policy disasters. If we don’t want to learn humility for its own sake or to honor Jesus, how about we do it to avoid another fifteen years of expensive, bloody, planet-wrecking military failure?

The great NPB Controversy also illustrates a second thread running through these posts: we have a collective need for history of this kind, but we can’t expect private individuals to meet it, not unless we provide them way more collective support. The outraged prayer breakfasters didn’t hesitate to trash The Most Powerful Man On Earth ™–are they really going to hesitate to trash untenured academics saying the same thing in stronger terms?

That’s why we historians make a big mistake when we defend our enterprise in terms of its career-making benefits, all those skills that are supposed to get our students good jobs and bright futures. Those benefits exist, but along with them come career-endangering risks; the good jobs aren’t going to go to young people who’ve been accused of negativism by some southern politician or angry blogger.  Individualizing the virtues and rewards of historical study means drifting toward feel-good, accentuate-the-positive histories, the kind that will please employers– and it means that history will lose the central place it might have had in national debates.

The great NPB Controversy shows us that feel-good historical culture is already here– it’s time to push back.

On Not-Learning-From-History, 1: we’ve got a problem

In my teaching, I usually tell students to be suspicious about “lessons from history.” The past is complicated, I say. Situations differ, and our knowledge about them is always imperfect; usually it’s downright lousy. The main lessons of history concern human ignorance, I usually say; we just don’t know enough about the past to draw lessons from it about the future.

But lately I’ve changed my tune. It’s not that there are no lessons from history, I’ve come to think– actually there are plenty. It’s just that most of them are so obvious and straightforward that we historians take them for granted, as base-line common sense that doesn’t need talking about. That silence wouldn’t be a problem, except that many people– smart, educated, high-minded people– either haven’t learned those low-level lessons from history, or have somehow unlearned them.

So I’ve started to wonder how that Not-Learning-From-History happens. I’ve come to think it’s a complicated and interesting process, which deserves some attention. I’ll have more to say about the process in the next few weeks. Here, I just want to suggest some dimensions of the problem itself– mainly, that it’s really big.

The kind of history lesson I have in mind comes from the realm of political leadership. Of course I’m thinking a lot about leadership these days, since we’ve apparently entered a new “he’s the next Hitler” phase, this one featuring Vladimir Putin of Russia. (Unless you’ve been away for the last decade or two, you’ll know we’ve had plenty of other next-Hitlers recently.) The idea is that Putin (following the original Hitler pattern) exhibits a mix of demonic ambition, irrational violence, and masterful control over his helpless subjects. So he’s dangerous and has to be stopped now, before he gets going on his project of world domination; ignoring him will only lead to more trouble.

Here’s where the historian’s lessons ought to come in, because in the real history books even Hitler himself didn’t fit the “next Hitler” pattern. Nobody does, partly because (history teaches) political leaders are never all-powerful puppetmasters; even dictators need cooperation from millions of subjects to get anything done, and those millions of subjects are getting something that’s making them cooperate. History also teaches that all societies have real collective interests, which their leaders usually try to advance one way or another.

Taking those lessons seriously doesn’t mean denying the role of individuals in history, and it doesn’t say anything about international conflicts. Collisions of societal interests can be violent, and sometimes there’s no way to compromise among them; individuals– both political leaders and others– have often shaped their countries’ development.

But history does teach that it’s stupid to treat any leaders as demons, Marvel-style super-villains, or lunatics.  It’s just as stupid to think of them as societal cancers, whose surgical removal will allow the social body to return to healthy growth. That’s just not how societies work.

And yet to many influential people in Washington, London, and elsewhere, that stupidity apparently counts as common sense. It’s not just talk, either. A significant amount of recent military action has centered on “taking out” various leadership groups, “decapitating” regimes, all that sort of thing– meaning that the super-villain idea of government is actually shaping what really happens in the real world. If we just get rid of whichever next-Hitler we’re currently focusing on, the idea goes, things will start to go right in Afghanistan/Iraq/Iran/Syria/Ukraine and the dozens of other places our foreign policy touches.

So that’s a first take on the Not-Learning-From-History problem. We apparently have smart, highly-educated, powerful people who haven’t absorbed the simplest lessons that history can teach. I mean, we’re not talking about a failure to understand long footnotes on obscure topics. Even the basics aren’t getting through.

It’s not clear to me what’s going on, but it’s something we ought to try to understand.

Why study history, Part 2: individual lives, collective knowledge

My last post talked about why we need historical knowledge. (Short version: history tries to see reality whole, with all the details, contradictions, and complexity left in, and we need that kind of thinking — because reality IS complicated, in ways that few academic disciplines acknowledge.)

So far so good, but then we hit the big cloud hanging over history education in 2014. “We” may need historical knowledge, but “we” don’t do the studying or pay the tuition or try to get jobs after finishing college. Individuals do all those things, and individuals have to live with the results. It’s all very nice and uplifting to say that people should study history, but what if there are no jobs for them? Why should students rack up fees and debts if there’s not much waiting for them after graduation?

What follows isn’t exactly an answer to that question; I’m not even sure there really is an answer, in the usual sense of the term. Instead, I present here some thoughts on the question itself, and suggest that we need to place it in larger contexts than we usually do. The “why study history” question, I’m saying, is really a question about how individuals, communities, and knowledge intersect in 2014.

The first step is to recognize the seriousness of the problem. The jobs situation for history graduates isn’t good, and it’s probably getting worse. Back in the good old days, meaning until about 1975, big corporations liked to train their own people, and they welcomed candidates with liberal arts degrees; it was understood that training would cost money, but that was an investment that eventually would pay big dividends. Anyway, liberal arts graduates could always fall back on teaching if business didn’t appeal to them.

Things are different today. Schools at every level are in financial trouble, and they’re not hiring many historians. In the corporate world, job candidates are increasingly expected to show up pre-trained and ready to contribute; no one expects them to stay around long enough for training programs to pay off, so HR departments favor people with career-ready educations, in economics, technology, health sciences, and the like. (See here for an account.) In these circumstances, a history major may be ok for those who don’t have to worry about jobs after graduation, or for those who can treat college as a preparatory course for professional programs like law. It’s not so great for those who need to start paying the bills right away.

In response, historians have publicized all the ways in which history actually is a good preparation for a real career in the real world. And we have some reasons for saying so– history courses teach you to analyze situations and documents, write clearly, think about big pictures, understand other cultures (something worth real money in today’s inter-connected global economy). Most of the history department websites I’ve visited (here for example) include some version of these claims.

The American Historical Association (the history profession’s official collective voice in the US) has taken this approach one step farther. With the help of a foundation, it has set up a program (which it calls the Tuning Project) designed to bring college history teaching into closer alignment with employers’ needs, by putting professors in touch with employers and other stake-holders. If professors have a better understanding of what employers want, the hope is, we can better prepare students for the real world and draw more majors into our courses.

But you can see the problem: some parts of a history education give you the skills to work in a big-money corporation, but many others don’t. Some history topics require knowledge that’s hard to acquire and not much practical use in the twenty-first century– the dates of obscure wars, or the dead languages needed to understand some ancient civilizations. Other topics are likely to mark you as a dangerous malcontent. Picture a job seeker showing up at Mega Corporation X (or at the Chicago Board of Education, for that matter) with her senior thesis on union organizing in the 1930s, or the successes of Soviet economic programs, or Allied war crimes in World War II. Whatever her skills of analysis and cultural negotiation, she’s not the kind of candidate HR departments are looking for. She carries intellectual baggage; she looks like trouble.

That thought experiment suggests the possibility that “tuning” the history major actually means changing its content– by cutting out the troublesome (discordant?) elements, those that might upset our conventional wisdoms. Of course, you could argue that “tuning” just applies to teaching, and therefore doesn’t change the basics of historical knowledge. Professors still get to research and write about whatever they like; in their books, they still get to be intellectual adventurers and trouble-makers. But that’s no real answer, because as American institutions currently work, history teaching eventually shapes history research. If history majors aren’t studying unions or war crimes, universities aren’t going to be hiring faculty in those areas either, and those books won’t be written.

That’s bad news, because American society has a strong collective interest in making sure that this kind of knowledge gets produced. All societies need to think about difficult questions and disturbing ideas, for the reasons that John Stuart Mill laid out way back in the 1850s. Societies that fail to do so (he explained) do stupid and immoral things; they fail to develop intellectually or socially; even their economic lives suffer, since the choke-hold of conventional wisdom eventually stifles business too. For Mill, disruptive knowledge was as much a practical as a spiritual need.

But it’s not clear how this collective need is to be met by the American university as it increasingly functions nowadays. As the language of the individualistic free market becomes more prevalent within it, fields of knowledge risk being defined by calculations concerning “the employability of our graduates” (as a document from my own university puts it). Given the pressures that they face, our students are fully justified in focusing on their “employability,” and university faculty have a duty to help them toward it. But that’s not the university’s only duty. It has at least an equal duty to develop knowledge, including especially knowledge untuned to employers’ needs, even antithetical to those needs.

That means that eventually the “why study history” question shades into a political problem. Historical knowledge is a form of collective property, and its health is bound up with other elements of our communal life. In the increasingly privatized universities of our times– privatized in financing, mechanics, and measurements of success–the “why study history” question may not have an answer.




Why study history, Part 1: The case of the disillusioned history major

Why study history? It’s a good question. Studying history means learning about dead people, by-gone situations, wrong-headed ideas, funny clothes. But that’s not where we live; the knowledge we need for real life deals with the here-and-now, not the over-and-done. That’s always been true, but it’s especially pertinent nowadays, as we face possibilities and troubles that never worried our ancestors. With self-aware, armed robots coming off the assembly lines any day now, should we really be worrying about Louis XIV and Julius Caesar? Wouldn’t that time and energy be better spent on our own problems?

Every historian has to take that question seriously. We get paid to think about the past, and we ask our students to spend at least a few hours a week thinking about it too– we’d better have some answers as to why it’s worth their time and ours.

Here I offer a few of my own answers by describing a case of rejection– the case of a smart guy who encountered history, didn’t like what he found, and turned elsewhere for intellectual sustenance.

The smart guy is Paul Krugman, one of America’s most important intellectuals. Krugman is a rare figure in the United States, both a star professor (having taught economics at MIT and Princeton, he’s now moving to CUNY) and at the same time someone who tries hard to reach ordinary readers, with a regular column in the New York Times and a widely-followed blog. His columns and blog posts show him to be literate, progressive in his politics, and deeply engaged by the real issues of contemporary life.

So for anyone trying to understand where historical study fits in today’s world, it’s interesting to learn that Krugman came to college expecting to major in history. What he found disappointed him. As he explained in a New Yorker profile a few years ago, his idea of history had been shaped by his teenage reading of Isaac Asimov’s Foundation series. Those books show history to be a difference-making intellectual tool. Having carefully studied the past, Asimov’s heroes (a small band of historians armed with math skills and big computers) can see where their society is heading, long before the people around them have a clue.

Not only can the heroes see the deep processes governing their world, they can act on their knowledge. Centuries in advance, they secretly establish mechanisms that will head off the worst of the troubles they’ve diagnosed, and their plans work; trouble emerges just as they’ve predicted, but since they’ve seen it coming, their pre-programmed remedies are effective. They can’t prevent all the disasters, but they do enough to improve billions of lives.

But Krugman discovered as a Yale freshman that university history didn’t match up with the history he’d read about in Asimov. Just the reverse– his history courses seemed to offer only a mass of complexities and uncertainty. They didn’t elucidate those deep processes of change that Asimov’s historian-heroes had grasped, and they offered no hope of guiding society to a better future. “History was too much about what and not enough about why,” the New Yorker article explains. In his disappointment, Krugman switched departments, and in economics he found what history had failed to deliver. Unlike history, he found, economics explored the basic rules by which societies function; it offered the kind of knowledge that could guide effective action. “Suddenly, a simple story made sense of a huge and baffling swath of reality,” is the journalist’s summary of Krugman’s intellectual experience.

Ordinarily, teen reading and freshman disappointments wouldn’t count in assessing an intellectual’s views; we all get a free pass on our adolescent hopes and dreams. But Krugman himself offers Asimov as a key to understanding his grown-up intellectual life as a professional economist. He acknowledges, of course, that even as an economist he can’t attain the predictive clarity that Asimov’s historian-heroes managed, but economics allows him to play something like their role. Using economic reasoning, we can get beneath the surface mess of life, understand its essential mechanisms, and use that knowledge to nudge society’s development in the right directions. Unlike history, economics both explains the world and provides a basis for improving it.

So Paul Krugman’s autobiographical reflections contrast the robust and useable knowledge of economics with the baffling jumble of detail produced by historians. It’s a critique that matters because it starts from assumptions that most historians share. These are the complaints of an intellectual, someone who believes in what universities do, not the standard ignoramus haroomphing about tenured radicals and useless research.

And Krugman’s critique gets at something fundamental about history, and about where it diverges from other kinds of knowledge. He’s right that we historians are interested in everything that reality presents; no other academic discipline has as much invested in exploring the details of life, and none has the same belief that all the details deserve our attention. We fuss about getting the dates right, and we write whole books about individual lives and brief episodes, some famous, many of them previously unknown. We want to tell all the stories that humanity has preserved, even the stories of lost causes and dead-end lives. That’s the historian’s paradox: we study the dead because that’s where we can explore all the dimensions of life.

That absolute commitment to the real, in all its chaotic variety, is the defining characteristic of historical thinking, and Krugman is right to zero in on it.

But he’s wrong in seeing history as “too much about what and not enough about why,” and that mistake is also important. The real difference concerns HOW historians and economists understand causation, not how much importance they attach to asking why things happen. The New Yorker profile makes that clear in summarizing Krugman’s freshman studies of ancient slavery and medieval serfdom: in explaining them, his history teachers talked “about culture and national character and climate and changing mores and heroes and revolts and the history of agriculture and the Romans and the Christians and the Middle Ages and all the rest of it;” his economics teacher explained the same phenomena with one simple calculation about agricultural productivity, population, and land availability.

So in fact Krugman’s complaint wasn’t that history didn’t say “enough about why,” but that it said too much– in his history course there were too many explanations, involving too many areas of life, too disconnected from one another. His economics professor replaced that long list with one causal rule, which applied in all situations. For the young Krugman, it was easy to choose between these two modes of understanding: the simpler explanation was more powerful and more useful– and therefore better. The mature Krugman agrees.

A few years ago, I would have said that we don’t have to choose between these two visions of causation, and that economics and history are just two ways of thinking about the world, each useful for some purposes, each with strengths and weaknesses. Sometimes we need a simple and elegant formula for understanding whatever it is we’re dealing with, and sometimes we need to wallow in the complications. That’s true in our personal lives, and just as true in our thinking about human societies. It’s why universities have lots of departments, and why we push students to take a wide range of courses.

But in today’s world, it’s difficult to stay so tolerant, because economics today is not just one academic area among many or a toolbox of handy techniques for thinking about business. It presents itself instead as a master discipline, a way of thinking that can illuminate all areas of our lives. Its methods and assumptions now show up in education, law, family relations, crime prevention, and pretty much everywhere else. Its terminology is also everywhere, and we scarcely notice anymore when a political leader tells us to apply marketplace solutions to some new area of public life. (For a cogent account of this “economics imperialism,” see here.)

With economic thinking taking over so much of modern life, the “we don’t have to choose” option isn’t really on the table. Others are trying to choose for us, and it’s important to push back — and not to be too polite as we do so.

I’ve chosen Paul Krugman to push back against here, but not because he’s an especially extreme advocate of the economics take-over. On these issues he’s probably the best that mainstream economics has to offer, an economist who seems alert to the arguments, values, and concerns that other ways of thinking offer. But that’s really the point: it’s often the best representatives of a worldview that we should question, not the worst.

And the fundamental question for Paul Krugman is, why should we take seriously any simple story about human affairs, let alone prefer it to a more complicated story? In the dream world that Isaac Asimov cooked up, sure, but in our world of real human beings? Krugman’s college economics teacher could offer a one-step formula that “made sense of a huge and baffling swath of reality”– but fairy tales and paranoiac fantasies do the same thing. In those cases, we all know, “making sense” of a phenomenon isn’t the same as getting it right. Ditto for explaining our own or our friends’ behavior: it’s usually not a good sign when we give a one-sentence answer explaining a divorce or a workplace event.

Why would we apply different standards to a phenomenon like slavery, which brings together so many forms of human motivation: self-interest, racism, anger, sex, religion, culture, tradition, the love of domination, and much more?

We need history for exactly the reasons that the young Paul Krugman found it so frustrating. Reality is complicated, and its complications press in on us, confronting us daily with lusts, angers, and all sorts of other forces that don’t make it into Asimov-style fiction. For some purposes and in some limited contexts, we can get away with pretending that those forces don’t matter. We can’t pretend for very long, though, before reality starts biting back. Historical study won’t tell us what to do when it does, but at least it will give us a serious idea of what’s out there.