Tag Archives: history

Historians and progress, some further thoughts

A few weeks ago, I offered some thoughts about the tangled, mixed-up relationship we historians have with the idea of progress. My own entangled feelings include love for our modern gadgets and means of communication; worries about climate change, atomic war, and all the other killer threats that lurk right around the corner; sadness at the human connections we’ve lost in the last few decades; delight in the easing of the puritanism, sexism, and racism that still ruled in the 1950s; fear that those achievements are about to be rolled back. And then, like most historians these days, I take cultural relativism seriously– meaning, in thinking about societies, my starting assumption is that they’re all about equally successful in organizing themselves. Can you even have an idea of progress without the belief that some ways of living are just plain better than others?

My post talked about the strangeness of doing history in this post-idea-of-progress world. Of course we can still write about the past, and do it very well. But can we believe in the significance of what we’re doing? If we don’t believe the world is going anywhere in particular, does it matter all that much where it was a few hundred years ago? I quoted the historian E. H. Carr, who thought you actually couldn’t do history without believing in progress. A fair number of our students seem to agree with him– they’re very interested in the recent past, whose connections to their own lives they can see, but they have no sense that what happened during (say) the sixteenth century shaped their own lives today.

Thinking about that question of historians and progress got me thinking more than I usually do about why our sense of progress is so weak these days. I’ve come to think it’s a more interesting question than we usually imagine.

I mean, we’ve got all the usual suspects that explain cultural pessimism neatly lined up, the kind of forces that historians like to cite to explain (for instance) Europe’s dark mood after World War I. Just in the last two decades, we’ve had wars, economic crises, genocides, destabilizing scientific discoveries. But the interesting thing is, these forces for doubt loomed even larger for previous generations, without cutting into their belief that the world was moving vigorously forward. My parents’ generation lived through the Great Depression, World War II, the Holocaust, atomic threats– our next-door neighbor even built himself a backyard bomb shelter. Yet back in 1950s-ville everyone took “you can’t stop progress” as a slam-dunk life principle. It didn’t just come from the enthusiasts, either. People who hated the progress they were seeing shared the basic belief that it was inevitable.

So maybe we should consider the possibility that the external shocks– the wars, crises, and collective crimes– aren’t the whole story, maybe not even the main story. Maybe this is an area where we should highlight human agency, and look to political choices, made by identifiable groups of people, instead of big outside forces.

To see that side of things, think about what’s happening to today’s children and young adults. Almost nobody nowadays believes their kids will live as well as they do, and it’s likely things will be even worse for the following generation. We believe that because we see the mechanisms in action, in all those recently-installed measures that screw over the younger generations: we’ve ended low-cost higher education, crapified the public schools, shoved older folks’ health-care costs onto the healthy and young, ended workplace protections, created the monstrous student loan empires, and on and on.

So it’s not surprising we don’t believe in progress, despite all the miraculous inventions of the last few years– if you think life is going to be worse for the next generation than it is for us, that pretty much defines not believing in progress. But that disbelief isn’t a side effect of Heisenberg’s Uncertainty Principle or awareness of the Holocaust or the Global Economic Crisis of 2008 or the rise of China. It’s actually because we seem not to want progress in its most basic form, that of making young people better off than we are.

There’s a glimmer of hope in that conclusion, since collective decisions like these can be reversed. But this line of thought also raises a big historical question that nobody seems to be asking these days: why on earth are we mistreating young people like this?

Historians and progress

A few weeks ago, my seminar discussed E. H. Carr’s What is History? It’s another of those old books that I often ask my students to read, on grounds that voices from the past can shake up our understanding of the present. They tell us how the world was understood by smart people who didn’t share our assumptions about it.

Of course What is History? isn’t a real antique like The Communist Manifesto, which I also push on my students, for the same reason. What is History? came out a mere fifty-odd years ago, in 1961. But the issues it raises are stunningly relevant today, and I don’t know of anything that covers the problems of historical knowledge so well. Problems like, how can we know anything about the past, since it’s over and done with? and, how are we supposed to distinguish between important issues that are worth studying and pointless trivia? and, what’s a historical explanation, anyway? With only a little massaging, a lot of Carr’s 1961 wisdom can sound like it comes from a post-modern theorist in the English department.

But there’s one issue where Carr seems to speak to us from a distant planet: he was a firm believer in progress. In fact, one of his chapters is titled “History As Progress,” and he meant it. “A society which has lost belief in its capacity to progress in the future will quickly cease to concern itself with progress in the past” is one of his lines. Another is, “Nor do I know how, without some such conception of progress, society can survive.”

Does anyone nowadays have that touchingly innocent belief that the world is actually getting better? Does anyone even think we know where the world is headed, whether for better or for worse? In that sense, an infinite gap stretches between us and 1961.

It’s pretty tempting to explain Carr’s optimism in biographical terms. He grew up just before the twentieth century’s great calamities, and from day 1 he belonged to the gentlemen’s-club-style British elite: fancy schools, a top degree at Cambridge (in classics, no less), work in the Foreign Office and London journalism, and high-level academic positions. He had radical opinions, and viewed the Soviet Union with what today seems insane enthusiasm. But even that had a place in the British establishment of his era. It didn’t prevent him from writing editorials for the London Times or holding a fellowship at Balliol College, Oxford, right at the top of the establishment.

Sheltered in all those ways, you might think, of course Carr could take an Olympian view of the historical troubles around him, as mere speed bumps on the way to a brighter future. But then you look at some of the other important historical writing from his era, and an awful lot of it has that same sense of forward marching. My favorite example comes from the French historian Lucien Febvre, writing in France in 1942— possibly the single darkest year in the twentieth century. World War II had been going for three years, the Holocaust was beginning in earnest, the Germans occupied France, smart people still thought Hitler would win. In spite of which, Febvre’s whole work was predicated on the contrast between dark, confused, and frightened pre-modern societies and modern societies like his own, which enjoyed the benefits of science, rationality, and electric lighting. At least “in normal times,” as Febvre put it.  (For the specifics, you can look at my book Lost Worlds, which examines Febvre and other French social history types.)

Could there be a more poignant confession of faith in progress? Examples like Febvre make me think it’s the era that explains Carr’s faith in progress, not his posh social niche.

But more important, it’s made me wonder whether he wasn’t on to something in linking that faith to a certain vision of history itself, maybe even to history with certain ambitions– if only because the strong historical vision of mid-twentieth-century writing is so hard to find in the history we write today. Of course we still have lots of wonderful historical writing, but we’ve mainly given up trying to connect our discoveries about the past with our visions of the future– perhaps because so few of us imagine we can see that future with any clarity. The great historical works these days tend to be microscopic in focus, detailed examinations of moments, individuals, and practices.

So the Carr example makes it seem we historians face a bad choice. We can keep to a dubious idea of progress and use it to shape our historical thinking, or we can write history that’s disconnected from how we think the world is going– in other words, we can write fragments, with no conviction that these fit into some larger pattern, or that there actually is a larger pattern. Was Carr correct in thinking that we can’t write meaningful history without those convictions?

Sex, money, and the refusal of historical knowledge

My last post explored a baffling feature of our twenty-first-century world. Here we are deep into an era of hyper-free-market-ism, with all sorts of unexpected items coming up for sale; we’ve already brought capitalist entrepreneurship to our battlefields, and thoughtful writers are pushing for an open market in body parts like kidneys. And then, we’re also in a mostly post-puritan era, having mainstreamed various sex practices that horrified our parents, plus some they never even imagined.

So you’d think a free-market, anything-goes society like ours would look tolerantly on the sale of sex– yet in the zone where the market meets the sexualized body, it’s actually an age of crack-downs and high-intensity moral crusading. Countries that used to permit prostitution have reversed course and are working to stomp it out. Big philanthropic organizations whip up fears of sex trafficking, and their billboards pop up in towns across North America. Articles denounce old movies like “Pretty Woman” for glamorizing prostitution— in fact Google auto-completes the phrase before you’ve finished typing, and it generates 8,600 results.

Apparently the world has changed pretty significantly since 1990, when “Pretty Woman” was a box-office smash.

A situation like this (you might think) pretty much screams for the application of historical knowledge. After all, since the 1970s historians have produced a ton of terrific research on sex work itself and on all sorts of adjacent topics. We historians also think a lot about moments when values and practices undergo rapid change, as they seem to be doing these days.

But you wouldn’t know any of this from browsing the contemporary news about sex work; in fact you’d never guess historians ever got within ten-foot-pole reach of such topics. Most journalists don’t bother adding picturesque historical examples to their stories about the contemporary scene, and they apparently NEVER ask historians for any larger perspectives.

In other words, it’s another case that shows just how determined our society is not to learn from history. And by “our society,” I mostly don’t mean the evangelical yahoo sector, for whom history’s just a distraction; what counts for them are God’s eternal injunctions and prohibitions. No, the significant refusals come from the liberal-minded, humanistically-educated sorts who shape our policy discourses. Many of these folks must have studied some history in college– surely some were even history majors?– but you wouldn’t know it from the way they talk.

As an extreme example, consider the Harvard-educated New York Times columnist Nicholas Kristof, who’s been a key player in the recent anti-sex-trafficking movement. Kristof likes on-the-scene journalistic interventions, and he’s visited about 150 countries (he pretty much exemplifies the modern idea that keeping busy trumps thinking and reading about things). In the course of these jaunts, he’s rescued young prostitutes in Cambodia and India, by paying off brothel keepers, and he’s accompanied police raids on brothels in Thailand. The results have appeared not only in his Times columns and blog, but also in a documentary movie, and they’ve made him an international celebrity. Another prominent journalist describes him as “the Indiana Jones of our generation of journalists.”

Of course these exploits have also included occasional pratfalls. Kristof was an enthusiastic booster for a “former prostitute” whose story proved fake, and he’s been shown to have used poor judgment in some other cases. Perhaps as a result, his Wikipedia page currently (September 27, 2015) makes no reference to his sex-trafficking stories.

But maybe pratfalls and mistakes are inevitable in investigations like these. What’s much weirder is the historical shadow that follows Kristof’s efforts. Because at least in his prostitution articles, Kristof reenacts almost flawlessly a famous nineteenth-century journalistic scoop. The historian Judith Walkowitz shows us the nineteenth-century original in her wonderful book City of Dreadful Delight, which examines sex and gender practices in late Victorian London. She devotes a couple of chapters to the reporter W. T. Stead, who braved the dangers of London slums, uncovered their vast networks of sex trafficking and child prostitution, and triumphantly rescued one girl for £5.

At least that was his story, as he published it in the newspaper he edited. In fact (as Walkowitz’s patient research shows), just about every element in the story had been massaged, trimmed, and rebuilt so as to fit into prefabricated literary boxes. Many of those boxes came from nineteenth-century melodrama, whose stock characters reappear in Stead’s account: an innocent girl under threat; an impoverished family too weak to defend her; depraved, scheming rich men. And then there’s the journalist himself, as lone hero outsider fighting an evil system– partly by saving a specific girl, partly by shining journalistic light on the world’s dark corners.

Nowadays journalists travel farther to reach the world’s dark corners, but otherwise how different are Kristof’s stories? The brightly-colored characters, the plot line, and the dénouement are the same; just like Stead, Kristof even specifies the dollar amounts he shelled out in his rescue operations– it’s the kind of detail that added zing to a story in 1885 and still does today.

You might say, so what? Kristof’s not a scholar, and he’s not obliged to footnote what previous authorities said about his subject. Isn’t he performing a valuable service by drawing attention to evils and suffering? Why make a big deal about his recycled narratives?

The answer is, because it’s just stupid to base our understanding of the world and the people in it on the simplest categories of nineteenth-century fiction. Sex work may or may not be a bad thing, and it may or may not deserve repression. But it’s something real people undertake, responding to their real circumstances and actively choosing among their real, not-so-great options. Recycling nineteenth-century narratives in the twenty-first century guarantees we won’t even see those realities, let alone understand them or respond sensibly.

Historians like Walkowitz can help us see these realities in our own world– but only if the power players start listening.

Historians and irony, Part II

My last post talked about historians’ irony, which I presented as a way of approaching the past– a tendency, not a specific interpretation. Irony-friendly historians tend to see people as having a limited handle on their circumstances, and even on their own intentions. Not knowing the world or ourselves very well, on this view, we humans regularly blunder into tragedy, generating processes we can’t control and outcomes we didn’t want. We don’t know what the fuck we’re doing.

I also suggested that irony of that kind is out of fashion nowadays. Not among all historians, and not 100 percent among any historians– as I said last time, we can never give it up altogether, because we know more than the people we study about how their stories turn out. But historians and irony are mostly on the outs right now, and that counts as something important about our era of historical writing. Open a recent history book, and you’re likely to encounter words like “contingency” and “agency.” Even late in the day, these words tell us, things could have gone differently, and individual decisions made a real difference. These words also tell us not to condescend to people in the past– not to view them as the helpless puppets of bigger forces, not to dismiss their efforts, hopes, and ideas, good and bad alike.

Things were REALLY different back in the mid-twentieth century, and they were still mostly different in the mid-seventies, when I got my PhD. In those days, the talk was all about long-term processes, societal changes, and the blindness of historical actors, and you found that talk pretty much everywhere in the profession, among Marxists and Freudians on the political left, modernization theorists and demographers in the middle, political historians on the right. These scholars mostly hated each other, but they agreed on a basic interpretive stance: big forces trumped individual wills.

So what happened? How did the history biz go from mainly-ironic to mainly-non-ironic? The question matters, because it touches on the ideological functions of history knowledge in our times. Mainly-ironic and mainly-non-ironic histories provide different lessons about how the world works.

Of course, some of the change just reflects our improving knowledge of the past. We talk more nowadays about contingency because we know so much more about the details of political change. We talk more about the agency of the downtrodden because we’ve studied them so much more closely– now we know that serfs, slaves, women, and other oppressed groups had their own weapons of small-scale resistance, even amidst terrible oppression. They couldn’t overturn the systems that enclosed them, but they could use what powers they had to carve out zones of relative freedom, in which they could live on their own terms.

And then, there’s what you might call the generational dialectic. Like most other intellectuals, we historians tend to fight with our intellectual parents– so if the mid-twentieth-century historians were all into big impersonal forces and longterm processes, it’s not surprising their successors looked to poke holes in their arguments, by pointing out all the contingencies and agency that the previous generation had missed. That’s one of the big ways our kind of knowledge advances, through criticism and debate. (For a discussion of this process as it works in a neighboring  discipline, see here.)

So there are plenty of reasons internal to the history profession that help account for irony’s ebb– and that’s without even mentioning the decay of Marxism, Freudianism, and all those other -isms that tried to explain individual behavior in terms of vast impersonal forces. Almost nobody finds those explanatory theories as persuasive as we once did, in the history department or anywhere else.

But having said all that, we’re left with an uncomfortable chronological juxtaposition: the historians’ turn to mainly-non-irony coincided with the circa-1980 neo-liberal turn in society at large, the cultural revolution symbolized by Margaret Thatcher in Britain and Ronald Reagan in the US. There’s a substantive juxtaposition as well: while we historians have been rediscovering agency among the downtrodden and freedom of maneuver among political actors, neo-liberal ideology has stressed individuals’ creativity and resourcefulness, their capacity to achieve happiness despite the structures that seem to imprison them. Unleashing market forces, getting people off welfare, reducing individuals’ reliance on public resources– these all start from the presumption that people have agency. They know what they’re doing, and they should be allowed to do it.

In other words, Edward Thompson’s warnings against “the enormous condescension of posterity” weirdly foreshadow various neo-con one-liners about how social programs and collective goods condescend to the disadvantaged. (For an example, check out George Will and George W. Bush talking about cultural “condescension.”)

Which of course is a pretty ironic thought, given that Thompson was a committed political activist and brilliant Marxist theorist. But if it could happen in the 1950s, it can happen now: intellectuals who hate each other and disagree on many specifics can nonetheless be teaching the same basic ideological lessons.

To me this suggests it may be time to rethink concepts like contingency and agency, or at least re-regulate our dosages. Maybe our alertness to agency has diminished our sensitivity to tragedy, to the ways in which circumstances really can entrap and grind down both individuals and whole communities. Maybe we need to think more about the long chains connecting specific political actions and constricting everyone’s freedom.

Maybe we historians need to stop being so damned optimistic!


Historians and irony, Part I

We historians have a long, intense, up-and-down relationship with irony, the kind that merits an “it’s complicated” tag. We argue with irony, shout, try going our own separate way– but the final break never comes, and eventually we and irony always wind up back in bed together. Like all stormy relationships, it’s worth some serious thought.  (Note for extra reading:  like pretty much any other historian who discusses irony, I’ve been hugely influenced by the great historian/critic Hayden White— when you have the time, check out his writing.)

Now, historians’ irony doesn’t quite track our standard contemporary uses of the word. It’s not about cliché hipsters saying things they don’t really mean, or about unexpected juxtapositions, like running into your ex at an awkward moment.

No, we historians go for the heavy-hitting version, as developed by the Ancient Greeks and exemplified by their ironist-in-chief Oedipus Rex. In the Greek play, you’ll remember, he’s a respected authority figure hot on the trail of a vicious killer– only to discover that he himself did the terrible deed, plus some other terrible deeds nobody even imagined. Like most of the Greek tragic stars, he thinks he’s in charge but really he’s clueless.

You can see how that kind of irony appeals to historians. After all, we spend a lot of our time studying people who misjudged their command of events– and anyway, we know the long-term story, how events played out after the instigators died. Most of the leaders who got Europe into World War I thought it would last a few weeks and benefit their countries. By 1918 four of the big player-states had been obliterated, and the ricochet damage was only beginning– Stalin, Hitler, the Great Depression, the atomic bomb, and a whole trail of other bad news can all be traced back to 1914.

That’s why our relationship to irony never makes it all the way to the divorce court. It’s basic to what we do.

But there are other sides to the relationship, and that’s where the shouting starts. We historians don’t just confront people’s ignorance of long-term consequences. There’s also the possibility they don’t understand what they’re doing while they’re doing it. That possibility takes lots of forms, and we encounter them in daily life as well as in the history books. There’s the psychological version, as when we explain tough-guy behavior (whether by a seventeenth-century king or twenty-first-century racists) in terms of childhood trauma or crises of masculinity. There’s the financial self-interest version, as when we believe political leaders subconsciously tailor their policies to their career needs.

And then there are the vast impersonal forces versions, what we might call ultra-irony, where historians see individuals as powerless against big processes of social change. That’s how the Russian novelist Leo Tolstoy described the Napoleonic wars, and how the French philosopher Alexis de Tocqueville described the advance of democracy— efforts to stop it just helped speed it up. Marxist and semi-Marxist historians have seen something similar in the great western revolutions. Those fighting tyrannical kings in 1640, 1776, and 1789 didn’t think they were helping establish global capitalism– many hated the whole idea of capitalism– but their policies had that effect all the same.

You can see why historians have such a fraught, high-voltage relationship with ultra-irony interpretations like these. On the one hand, sure– we all know that many social forces are bigger than we are; we laugh at those who try to stop new technologies or restore Victorian sex habits; we know we’re born into socio-cultural systems and can’t just opt out of them.

On the other hand, historical practice rests on evidence, documentation– and where do we find some president or union leader telling us he did it all because his childhood sucked? How do we document vast impersonal forces? Ironic interpretations require pushy readings of the documents– speculation, going beyond what the evidence tells us, inserting our own interpretive frameworks. Nothing makes us historians more jumpy.

There’s a deeper problem as well: interpretations like these diminish human dignity, by telling us that people in the past didn’t know what they were doing or even what they wanted to do. If we accept these interpretations, we deny agency to historical actors, belittle their ideas, dreams, and efforts, mock their honesty and intelligence. We dehumanize history– the human actors are the pawns; the vast impersonal forces run the game.

Those are serious criticisms, and they’ve been around since the nineteenth century.

But the interesting thing is, their persuasive force rises and falls over time. You’ll have a whole generation of historians who find ultra-irony persuasive and helpful; it feels right, and it seems to open up exciting new research questions. Then the tide shifts, and historians become more concerned with agency. They listen closely to historical actors’ own views of who they were and what they were doing.

By and large, the mid-twentieth century fell into Phase 1 of this cycle– it was a time when historians saw irony everywhere and paid lots of attention to big impersonal forces. Marxism was riding high, but so also were the other -isms: Freud-influenced historians saw unconscious drives pushing people to act as they did; Weberians saw the experience of modernization behind political and religious movements. “Underlying causes” were big, and we viewed participants’ own accounts with suspicion– we assumed they didn’t understand their own motives or circumstances.

But that changed in the 1970s, and for the past thirty years we’ve been deep in Phase 2, the no-irony phase. We’re concerned with taking historical actors seriously and with avoiding what a great Marxist historian called “the enormous condescension of posterity.” We believe in “agency”– meaning, from the top to the bottom of the social scale, people can help shape their own destinies.

What does it all mean? I have a few thoughts, but I’ll wait until the next post to lay them out– stay tuned!

Why study history, Monday update

Ta-Nehisi Coates has a terrific essay in The Atlantic about last week’s great National Prayer Breakfast Controversy.

Apparently Barack Obama had been asked to address this annual confab of Christian power-players, and in his remarks he suggested that today’s Islamic fundamentalists aren’t uniquely barbarous, crazy, or evil; in the course of history, Christians– even American Christians!!– had also occasionally killed, kidnapped, looted, and enslaved in the name of their God. Naturally, outrage ensued, with the Best In Class award going to a Virginia politician: Obama’s remarks, he explained afterward, were “the most offensive I’ve ever heard a president make.”

Now, I’m assuming readers here don’t need me to underline the ridiculousness of such talk. (On the other hand, the normally sensible Christian Science Monitor presents this as a debate with two more or less legitimate sides, so maybe the underlining is needed; if so, take it as given.)

But the great NPB Controversy does offer a worthwhile reminder of something else, which has been a big theme on this blog: our society needs historical study, and it needs the specific kinds of historical study that historians undertake. Meaning, it’s not enough just to look to the past for data about economic trends, or battlefield success, or the values that led some societies to develop stable democratic governments.

Economists, generals, and political scientists do all those things, sometimes usefully– but we also need more. We need the whole past, complete with its losers and victims, its crimes and craziness, its miseries.  We especially need to know about our own crimes, follies, and victims, as well as those of other people. That’s the kind of thing that historians dig up.

We need that kind of knowledge for the most obvious ethical and practical reasons. Even we non-Christians know it’s just wrong to view ourselves as fundamentally superior to other peoples, immune to their criminalities and fanaticisms. It also doesn’t work, as we Americans ought to have learned from our last fifteen years of foreign policy disasters. If we don’t want to learn humility for its own sake or to honor Jesus, how about we do it to avoid another fifteen years of expensive, bloody, planet-wrecking military failure?

The great NPB Controversy also illustrates a second thread running through these posts: we have a collective need for history of this kind, but we can’t expect private individuals to meet it, not unless we provide them way more collective support. The outraged prayer breakfasters didn’t hesitate to trash The Most Powerful Man On Earth™– are they really going to hesitate to trash untenured academics saying the same thing in stronger terms?

That’s why we historians make a big mistake when we defend our enterprise in terms of its career-making benefits, all those skills that are supposed to get our students good jobs and bright futures. Those benefits exist, but along with them come career-endangering risks; the good jobs aren’t going to go to young people who’ve been accused of negativism by some southern politician or angry blogger.  Individualizing the virtues and rewards of historical study means drifting toward feel-good, accentuate-the-positive histories, the kind that will please employers– and it means that history will lose the central place it might have had in national debates.

The great NPB Controversy shows us that feel-good historical culture is already here– it’s time to push back.

More thoughts about Bill Gates and Big History

My last post commented on the enthusiasm and money that Bill Gates has been pouring into Big History, a way of teaching history that focuses on very, very long-term processes of change. There I mostly talked about the institutional sides of the story– what it means to have one not-very-well-informed rich guy making decisions about what everyone else should learn.

Here I want to talk content. I want to ask about the messages conveyed in a Big History approach to the past and the background assumptions that it seems to embody.

But before going any farther, readers should probably glance back at the consumer warning that’s at the top of this Opinions section. It explains that the opinions here are just that, opinions, not scholarship or value-neutral reporting, and that’s double extra true when it comes to Big History. I haven’t read up on the details or tried to see all the arguments in its favor. I haven’t looked into the pedagogy side either. It may be that Big History works great in classrooms full of teenagers– we’d still want to know whether it was worth teaching in the first place.

So today we’re skipping the nuances and subtleties, and getting straight to Big History’s Big Implications. What would it mean to make a Big History perspective the foundation of young people’s understanding of the past? David Christian, whose ideas so inspired Bill Gates, describes the intent as providing “a clear vision of humanity as a whole.” In a Guardian article, Gates himself is quoted as saying that the approach will help students “understand what it means to be human.” So what kind of answer is he funding?

One answer is, it’s a vision in which human beings don’t count for too much. In the Gates-funded version of Big History, we’re a speed bump on a long highway. We humans only showed up recently; relatively speaking, we’re not going to be here much longer, and the rest of the universe will get along just fine after we’re gone.

We also don’t have too much influence while we’re here, because so much of “what it means to be human” was fixed long ago: first by the geology, chemistry, and biology of the earth we inhabit, then by our earliest neuro-wiring as humans, for things like language and community life.

Within those parameters, there’s not much room for difference or transformation– the gaps separating us 21st-century Americans from, say, ancient Egyptians count for much less than the basics we share. Seen within the 250,000-year history of humanity, Aristophanes, Shakespeare, and Amy Heckerling might as well be the same person. Ditto for Confucius, Thomas Aquinas, Mary Shelley, Karl Marx, and Rosalind Franklin.

You get my drift: Big History sure sounds like a training in resignation to all the inevitabilities that have built up over the last few hundred thousand years, not to mention the millions of years before we humans arrived. The changes that matter are bound up with enormous processes that we can’t do much about, and whatever we humans can achieve doesn’t match up against all that we can’t change. Bringing fast food workers’ wages up to $15 from the current $8?  Does that issue really amount to a hill of beans from the Big History perspective? Workers and activists should save themselves a lot of heartbreak and just accept the world as it is.

Is it unkind to suggest that a billionaire in today’s America might think that’s a great lesson to teach?


Billionaire History Man

Talk about weird news: last weekend, the New York Times reported that Bill Gates has developed an enthusiasm for history. Not just as bedtime reading, either. Through his foundation, he’s begun pouring money into history teaching, in the hope of making history classes more interesting and more useful to America’s high schoolers. It’s all part of a bigger plan, apparently. More interested students will be better students, they’ll stay in school, get good jobs, not sink into drugs and despair, and help bring back the productive middle-class America that Gates grew up in.

The lightbulb moment apparently came during an early morning home treadmill session. Not wanting to waste that time, Gates likes to absorb improving material while he jogs, and that morning it was a Teaching Company lecture about “Big History,” by an Australian history professor named David Christian. Gates was blown away. Meetings followed, lesson plans were developed, financing was promised, and now Big History is being taught at a growing number of high schools, public and private alike.

Here I won’t say much about the substance of Big History. The key thing seems to be that it divides all time into eight stages, with the appearance of Homo sapiens constituting stage 6 and the invention of agriculture stage 7. That leaves for stage 8 everything we usually think of as “history”– you know, Greeks and Romans, Confucius, the rise of Islam, slavery, industrial revolutions, African empires, American, French, Haitian, Russian, Chinese, and Vietnamese revolutions, the American Civil War, the Holocaust, that kind of thing. Fitting all that into one-eighth of a high school semester (about ten days, by my count) must make for some lively teaching.
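(For the curious, that “ten days” is back-of-the-envelope arithmetic, and it rests on assumptions of mine, nothing official: a sixteen-week semester meeting five days a week, with each of the eight stages getting equal class time:

\[
\frac{16 \text{ weeks} \times 5 \text{ days/week}}{8 \text{ stages}} = 10 \text{ class days for all of recorded history}
\]

Stretch the semester to eighteen weeks, or give stage 8 a bigger share, and the number shifts a little– but not enough to change the punchline.)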

So Big History’s content is plenty worth discussing, but for now I want just to say a little about Bill Gates’s involvement in it.

I’ll start with the obvious stuff. First, schools need money, and it’s a Good Thing that a billionaire wants to give it to them. Second, it’s Not A Good Thing that one billionaire gets to decide what millions of children learn, with add-on implications for hundreds of thousands of future teachers. Because if millions of high schoolers have to learn Gates’s version of history, an awful lot of college students will have to do the same if they want careers in education.

Third, it’s bad news that a semi-retired billionaire is getting his ideas about history from DVD lectures and TED talks. The whole story would still be creepy, but at least it would have been comforting to learn that Gates’s flash of insight came from a summer spent reading Edward Gibbon, Fernand Braudel, Natalie Davis, Jonathan Spence, and William Cronon. If billionaires are going to redesign American education, can’t they at least do some homework? Have real books become that difficult for them?

So there’s plenty here to get someone like me riled up. But there’s another angle to the story that deserves some thought, and that’s the strange spectacle of a billionaire tech oligarch concerning himself with history in the first place. Of course there are the obvious explanations, and they pop up often in the comments on the Times website. There’s the warm-hearted philanthropy explanation: American schools aren’t doing all that well, and Gates is at least trying to fix them. And there’s the capitalism explanation: there’s money to be made in the education business, from selling books, programs, and other gear to a vast captive market. (Believers in the second hypothesis will note that the story itself comes from Andrew Ross Sorkin, a Times business writer who runs their semi-independent Dealbook blog, and who has on occasion served as a conduit for big-business opinion. The Times apparently didn’t involve its education writers in the story, and Professor Christian is the only living historian that it quotes.)

But maybe the particular motives don’t matter very much in a case like this– maybe the big fact is just the depth of Gates’s involvement in what is basically a cultural debate. He’s not just giving money or selling gadgets to schools– he’s pushing one vision of history and criticizing others, using philanthropy to shape what does and doesn’t count as history knowledge; and because of the big dollars involved, his intervention doesn’t just concern the target audience of high school students–it’s about teachers, potential teachers, and their teachers too.

We historians often worry that what we do is irrelevant to society at large, but Bill Gates is here to teach us otherwise– apparently power players are thinking about our enterprise. Big History at least has that Big Message for us.

Why study history, Part 2: individual lives, collective knowledge

My last post talked about why we need historical knowledge. (Short version: history tries to see reality whole, with all the details, contradictions, and complexity left in, and we need that kind of thinking — because reality IS complicated, in ways that few academic disciplines acknowledge.)

So far so good, but then we hit the big cloud hanging over history education in 2014. “We” may need historical knowledge, but “we” don’t do the studying or pay the tuition or try to get jobs after finishing college. Individuals do all those things, and individuals have to live with the results. It’s all very nice and uplifting to say that people should study history, but what if there are no jobs for them? Why should students rack up fees and debts if there’s not much waiting for them after graduation?

What follows isn’t exactly an answer to that question; I’m not even sure there really is an answer, in the usual sense of the term. Instead, I present here some thoughts on the question itself, and suggest that we need to place it in larger contexts than we usually do. The “why study history” question, I’m saying, is really a question about how individuals, communities, and knowledge intersect in 2014.

The first step is to recognize the seriousness of the problem. The jobs situation for history graduates isn’t good, and it’s probably getting worse. Back in the good old days, meaning until about 1975, big corporations liked to train their own people, and they welcomed candidates with liberal arts degrees; it was understood that training would cost money, but that was an investment that eventually would pay big dividends. Anyway, liberal arts graduates could always fall back on teaching if business didn’t appeal to them.

Things are different today. Schools at every level are in financial trouble, and they’re not hiring many historians. In the corporate world, job candidates are increasingly expected to show up pre-trained and ready to contribute; no one expects them to stay around long enough for training programs to pay off, so HR departments favor people with career-ready educations, in economics, technology, health sciences, and the like. (See here for an account.) In these circumstances, a history major may be OK for those who don’t have to worry about jobs after graduation, or for those who can treat college as a preparatory course for professional programs like law. It’s not so great for those who need to start paying the bills right away.

In response, historians have publicized all the ways in which history actually is a good preparation for a real career in the real world. And we have some reasons for saying so– history courses teach you to analyze situations and documents, write clearly, think about big pictures, understand other cultures (something worth real money in today’s inter-connected global economy). Most of the history department websites I’ve visited (here for example) include some version of these claims.

The American Historical Association (the history profession’s official collective voice in the US) has taken this approach one step further. With the help of a foundation, it has set up a program (which it calls the Tuning Project) designed to bring college history teaching into closer alignment with employers’ needs, by putting professors in touch with employers and other stakeholders. If professors have a better understanding of what employers want, the hope is, we can better prepare students for the real world and draw more majors into our courses.

But you can see the problem: some parts of a history education give you the skills to work in a big-money corporation, but many others don’t. Some history topics require knowledge that’s hard to acquire and not much practical use in the twenty-first century– the dates of obscure wars, or the dead languages needed to understand some ancient civilizations. Other topics are likely to mark you as a dangerous malcontent. Picture a job seeker showing up at Mega Corporation X (or at the Chicago Board of Education, for that matter) with her senior thesis on union organizing in the 1930s, or the successes of Soviet economic programs, or Allied war crimes in World War II. Whatever her skills of analysis and cultural negotiation, she’s not the kind of candidate HR departments are looking for. She carries intellectual baggage; she looks like trouble.

That thought experiment suggests the possibility that “tuning” the history major actually means changing its content– by cutting out the troublesome (discordant?) elements, those that might upset our conventional wisdoms. Of course, you could argue that “tuning” just applies to teaching, and therefore doesn’t change the basics of historical knowledge. Professors still get to research and write about whatever they like; in their books, they still get to be intellectual adventurers and trouble-makers. But that’s no real answer, because as American institutions currently work, history teaching eventually shapes history research. If history majors aren’t studying unions or war crimes, universities aren’t going to be hiring faculty in those areas either, and those books won’t be written.

That’s bad news, because American society has a strong collective interest in making sure that this kind of knowledge gets produced. All societies need to think about difficult questions and disturbing ideas, for the reasons that John Stuart Mill laid out way back in the 1850s. Societies that fail to do so (he explained) do stupid and immoral things; they fail to develop intellectually or socially; even their economic lives suffer, since the choke-hold of conventional wisdom eventually stifles business too. For Mill, disruptive knowledge was as much a practical as a spiritual need.

But it’s not clear how this collective need is to be met by the American university as it increasingly functions nowadays. As the language of the individualistic free market becomes more prevalent within it, fields of knowledge risk being defined by calculations concerning “the employability of our graduates” (as a document from my own university puts it). Given the pressures that they face, our students are fully justified in focusing on their “employability,” and university faculty have a duty to help them toward it. But that’s not the university’s only duty. It has at least an equal duty to develop knowledge, including especially knowledge untuned to employers’ needs, even antithetical to those needs.

That means that eventually the “why study history” question shades into a political problem. Historical knowledge is a form of collective property, and its health is bound up with other elements of our communal life. In the increasingly privatized universities of our times– privatized in financing, mechanics, and measurements of success–the “why study history” question may not have an answer.


Why study history, Part 1: The case of the disillusioned history major

Why study history? It’s a good question. Studying history means learning about dead people, bygone situations, wrong-headed ideas, funny clothes. But that’s not where we live; the knowledge we need for real life deals with the here-and-now, not the over-and-done. That’s always been true, but it’s especially pertinent nowadays, as we face possibilities and troubles that never worried our ancestors. With self-aware, armed robots coming off the assembly lines any day now, should we really be worrying about Louis XIV and Julius Caesar? Wouldn’t that time and energy be better spent on our own problems?

Every historian has to take that question seriously. We get paid to think about the past, and we ask our students to spend at least a few hours a week thinking about it too– we’d better have some answers as to why it’s worth their time and ours.

Here I offer a few of my own answers by describing a case of rejection– the case of a smart guy who encountered history, didn’t like what he found, and turned elsewhere for intellectual sustenance.

The smart guy is Paul Krugman, one of America’s most important intellectuals. Krugman is a rare figure in the United States, both a star professor (having taught economics at MIT and Princeton, he’s now moving to CUNY) and at the same time someone who tries hard to reach ordinary readers, with a regular column in the New York Times and a widely-followed blog. His columns and blog posts show him to be literate, progressive in his politics, and deeply engaged by the real issues of contemporary life.

So for anyone trying to understand where historical study fits in today’s world, it’s interesting to learn that Krugman came to college expecting to major in history. What he found disappointed him. As he explained in a New Yorker profile a few years ago, his idea of history had been shaped by his teenage reading of Isaac Asimov’s Foundation series. Those books show history to be a difference-making intellectual tool. Having carefully studied the past, Asimov’s heroes (a small band of historians armed with math skills and big computers) can see where their society is heading, long before the people around them have a clue.

Not only can the heroes see the deep processes governing their world, they can act on their knowledge. Centuries in advance, they secretly establish mechanisms that will head off the worst of the troubles they’ve diagnosed, and their plans work; trouble emerges just as they’ve predicted, but since they’ve seen it coming, their pre-programmed remedies are effective. They can’t prevent all the disasters, but they do enough to improve billions of lives.

But Krugman discovered as a Yale freshman that university history didn’t match up with the history he’d read about in Asimov. Just the reverse– his history courses seemed to offer only a mass of complexities and uncertainty. They didn’t elucidate the deep processes of change that Asimov’s historian-heroes had grasped, and they offered no hope of guiding society to a better future. “History was too much about what and not enough about why,” the New Yorker article explains. In his disappointment, Krugman switched departments, and in economics he found what history had failed to deliver. Unlike history, economics explored the basic rules by which societies function; it offered the kind of knowledge that could guide effective action. “Suddenly, a simple story made sense of a huge and baffling swath of reality,” is the journalist’s summary of Krugman’s intellectual experience.

Ordinarily, teen reading and freshman disappointments wouldn’t count in assessing an intellectual’s views; we all get a free pass on our adolescent hopes and dreams. But Krugman himself offers Asimov as a key to understanding his grown-up intellectual life as a professional economist. He acknowledges, of course, that even as an economist he can’t attain the predictive clarity that Asimov’s historian-heroes managed, but economics allows him to play something like their role. Using economic reasoning, we can get beneath the surface mess of life, understand its essential mechanisms, and use that knowledge to nudge society’s development in the right directions. Unlike history, economics both explains the world and provides a basis for improving it.

So Paul Krugman’s autobiographical reflections contrast the robust and usable knowledge of economics with the baffling jumble of detail produced by historians. It’s a critique that matters because it starts from assumptions that most historians share. These are the complaints of an intellectual, someone who believes in what universities do, not the standard ignoramus harrumphing about tenured radicals and useless research.

And Krugman’s critique gets at something fundamental about history, and about where it diverges from other kinds of knowledge. He’s right that we historians are interested in everything that reality presents; no other academic discipline has as much invested in exploring the details of life, and none has the same belief that all the details deserve our attention. We fuss about getting the dates right, and we write whole books about individual lives and brief episodes, some famous, many of them previously unknown. We want to tell all the stories that humanity has preserved, even the stories of lost causes and dead-end lives. That’s the historian’s paradox: we study the dead because that’s where we can explore all the dimensions of life.

That absolute commitment to the real, in all its chaotic variety, is the defining characteristic of historical thinking, and Krugman is right to zero in on it.

But he’s wrong in seeing history as “too much about what and not enough about why,” and that mistake is also important. The real difference concerns HOW historians and economists understand causation, not how much importance they attach to asking why things happen. The New Yorker profile makes that clear in summarizing Krugman’s freshman studies of ancient slavery and medieval serfdom: in explaining them, his history teachers talked “about culture and national character and climate and changing mores and heroes and revolts and the history of agriculture and the Romans and the Christians and the Middle Ages and all the rest of it;” his economics teacher explained the same phenomena with one simple calculation about agricultural productivity, population, and land availability.

So in fact Krugman’s complaint wasn’t that history didn’t say “enough about why,” but that it said too much– in his history course there were too many explanations, involving too many areas of life, too disconnected from one another. His economics professor replaced that long list with one causal rule, which applied in all situations. For the young Krugman, it was easy to choose between these two modes of understanding: the simpler explanation was more powerful and more useful– and therefore better. The mature Krugman agrees.

A few years ago, I would have said that we don’t have to choose between these two visions of causation, and that economics and history are just two ways of thinking about the world, each useful for some purposes, each with strengths and weaknesses. Sometimes we need a simple and elegant formula for understanding whatever it is we’re dealing with, and sometimes we need to wallow in the complications. That’s true in our personal lives, and just as true in our thinking about human societies. It’s why universities have lots of departments, and why we push students to take a wide range of courses.

But in today’s world, it’s difficult to stay so tolerant, because economics today is not just one academic area among many or a toolbox of handy techniques for thinking about business. It presents itself instead as a master discipline, a way of thinking that can illuminate all areas of our lives. Its methods and assumptions now show up in education, law, family relations, crime prevention, and pretty much everywhere else. Its terminology is also everywhere, and we scarcely notice anymore when a political leader tells us to apply marketplace solutions to some new area of public life. (For a cogent account of this “economics imperialism,” see here.)

With economic thinking taking over so much of modern life, the “we don’t have to choose” option isn’t really on the table. Others are trying to choose for us, and it’s important to push back — and not to be too polite as we do so.

I’ve chosen Paul Krugman to push back against here, but not because he’s an especially extreme advocate of the economics take-over. On these issues he’s probably the best that mainstream economics has to offer, an economist who seems alert to the arguments, values, and concerns that other ways of thinking offer. But that’s really the point: it’s often the best representatives of a worldview that we should question, not the worst.

And the fundamental question for Paul Krugman is, why should we take seriously any simple story about human affairs, let alone prefer it to a more complicated story? In the dream world that Isaac Asimov cooked up, sure, but in our world of real human beings? Krugman’s college economics teacher could offer a one-step formula that “made sense of a huge and baffling swath of reality”– but fairy tales and paranoiac fantasies do the same thing. In those cases, we all know, “making sense” of a phenomenon isn’t the same as getting it right. Ditto for explaining our own or our friends’ behavior: it’s usually not a good sign when we give a one-sentence answer explaining a divorce or a workplace event.

Why would we apply different standards to a phenomenon like slavery, which brings together so many forms of human motivation: self-interest, racism, anger, sex, religion, culture, tradition, the love of domination, and much more?

We need history for exactly the reasons that the young Paul Krugman found it so frustrating. Reality is complicated, and its complications press in on us, confronting us daily with lusts, angers, and all sorts of other forces that don’t make it into Asimov-style fiction.  For some purposes and in some limited contexts, we can get away with pretending that those forces don’t matter. We can’t pretend for very long, though, before reality starts biting back. Historical study won’t tell us what to do when it does, but at least it will give us a serious idea of what’s out there.