My man Clausewitz

Some weeks ago, I described my admiration for the mid-Victorian novelist Anthony Trollope. I fall way outside Trollope’s target demographic, which was conservative, Church-of-England-style Christians. But I find myself re-reading him often, and learning from him. It’s been a lesson in the limited importance of literary intentions, both authors’ and readers’. We don’t know what books are going to matter to us, just as authors don’t know whom they’re going to reach, or how.

Today, I want to discuss another literary enthusiasm I’ve recently developed, which has surprised me just as much: it’s for Carl von Clausewitz, the early nineteenth-century Prussian military philosopher.

Clausewitz was a theorist who also walked the walk. He joined the Prussian army at age twelve, and for the next twenty years he fought in all its wars against revolutionary and Napoleonic France– the biggest, bloodiest wars Europe had seen up to that time. But he made his superiors jumpy, and they eventually parked him in the Prussian military academy, where he taught future officers, honed his theories, and worked away at his enormous book On War. It still wasn’t done when he died, but his devoted widow assembled the pieces, and it became an instant classic. It’s still taught at military colleges around the world, including our own West Point.

Even if you haven’t read Clausewitz, you’ve probably heard some of the snappy phrases he invented, like “the fog of war” and “war is the continuation of politics by other means.” There are dozens of other one-liners that aren’t as well known but ought to be. In fact he was something of a literary genius– he carries you along as you read, and you find yourself reading longer stretches of the text than you’d planned. Like other great writers, he forces you to look at the world in new ways.

That literary oomph turns out to be more common than you might expect among history’s great generals, and Clausewitz himself explains why: war “may appear to be uncomplicated,” he tells us, but actually it “cannot be waged with distinction except by men of outstanding intellect.” To make his point, he tosses in some startling comparisons. In some ways, he says, the good commander resembles the poets, painters, scholars, and intellectuals. Like them, he has to use imagination and insight into the human condition, as well as the specific skills and disciplines of his art.

Clausewitz’s reasons get to the heart of his ideas about war– namely, that it’s a really, really complicated business, which even the geniuses can’t fully master. The mediocrities don’t stand a chance.

Sure, he tells us, from a distance “everything looks simple: the knowledge required does not look remarkable,” the strategic options look obvious; anyone with a good map can figure out how best to encircle a city or cut off opposing troops. But the reality is unimaginably complex, because it involves thousands or millions of individual human beings, all acting on the basis of their own emotions and will, all enduring maximum stress. The physical environment poses its own difficulties. Simple acts become complicated in the smoke, mud, dust, and exhaustion of combat; geography takes on strange new shapes; chance events assume enormous importance. As he puts it in another of his sharp formulations: “War is the realm of chance. No other human activity gives it greater scope: no other has such incessant and varied dealings with this intruder.”    (The quotations come from On War, in the spectacular translation put together by Michael Howard and Peter Paret.)

In these circumstances, Clausewitz’s commander is on a quest for knowledge, trying to find the truth when “all information and assumptions are open to doubt, and with chance at work everywhere.” Courage amidst dangers, training, equipment, faith in the mission– in war all those count, but the indispensable qualities are intellectual: “first, an intellect that, even in the darkest hour, retains some glimmerings of the inner light which leads to truth; and second, the courage to follow this faint light wherever it may lead.” For Clausewitz, truth about situations and the people involved in them is the ultimate war-making tool.  That’s why the commander needs elements of the humanist’s mindset.

There’s lots more to Clausewitz, of course, some of which maybe I’ll write about in the next few weeks. But for now let’s stop and think a minute about how his vision of military knowledge fits with what we encounter here today, in twenty-first-century America.

Because we also have lots of ideas about war. We ought to, because over the last generation war has been the main constant in American life. The War in Afghanistan has now lasted longer than the Trojan War, and twice as long as World War II. Some retired general pops up on TV pretty much every night, and most weeks you can find op-eds by thoughtful experts pushing for American military intervention somewhere in the world. Most of us don’t go to war ourselves, but we’ve come to view war-making as a normal part of American political life.

We can do that partly because our American ideas about war differ so wildly from Clausewitz’s. He talked about chance, uncertainty, inadequate information, and the need for imaginative brilliance to get at the reality of any military situation. We describe war instead as knowable, predictable, and manageable. Our favorite war terminology is medical– we speak of surgical strikes and interventions; we describe our enemies as cancerous growths that need to be excised; we call many of our interventions humanitarian acts, life-saving missions. And in modern war as in modern medicine, we’ve got technologies that Clausewitz never dreamed of: drones, night vision goggles, computers– these allow our soldiers to overcome war’s information gaps. Of course technology doesn’t eliminate all uncertainty. Unexpected problems still arise on the battlefield, as they do at the hospital– but now we can address them effectively.

So given that we live in a different technological world, is Clausewitz basically a museum piece, or is he someone we should be listening to?  How seriously should we take a voice from the horse-drawn, muzzle-loading era?

One reason for listening is that Clausewitz gives us the voice of a hardened Prussian officer, who’d fought in high-level battles, both victories and defeats, without losing his faith in either war or the army. When he tells us about the unknowability of war, he’s talking as a believer, not a pacifist dreamer or bleeding-heart do-gooder. He doesn’t doubt the value of war– he just wants us to know what it really is.

The other reason concerns us, not Clausewitz. The brutal fact is, American conventional wisdom about war doesn’t look so good these days. We’ve got the biggest, best-equipped army in the world, but we’re on a fifty-year losing streak– against a series of much weaker enemies. (Ok, we looked impressive against Grenada and Panama, but you get the point.)  Our humanitarian interventions have typically made situations worse, not better.

Maybe it’s time to rethink our approach to this most serious of human activities– and we could do worse than starting with Clausewitz.

More thoughts about Bill Gates and Big History

My last post commented on the enthusiasm and money that Bill Gates has been pouring into Big History, a way of teaching history that focuses on very, very long-term processes of change. There I mostly talked about the institutional sides of the story– what it means to have one not-very-well-informed rich guy making decisions about what everyone else should learn.

Here I want to talk content. I want to ask about the messages conveyed in a Big History approach to the past and the background assumptions that it seems to embody.

But before going any farther, readers should probably glance back at the consumer warning that’s at the top of this Opinions section. It explains that the opinions here are just that, opinions, not scholarship or value-neutral reporting, and that’s double extra true when it comes to Big History. I haven’t read up on the details or tried to see all the arguments in its favor. I haven’t looked into the pedagogy side either. It may be that Big History works great in classrooms full of teenagers– we’d still want to know whether it was worth teaching in the first place.

So today we’re skipping the nuances and subtleties, and getting straight to Big History’s Big Implications. What would it mean to make a Big History perspective the foundation of young people’s understanding of the past? David Christian, whose ideas so inspired Bill Gates, describes the intent as providing “a clear vision of humanity as a whole.” In a Guardian article, Gates himself is quoted as saying that the approach will help students “understand what it means to be human.” So what kind of answer is he funding?

One answer is, it’s a vision in which human beings don’t count for too much. In the Gates-funded version of Big History, we’re a speed bump on a long highway. We humans only showed up recently; relatively speaking, we’re not going to be here much longer, and the rest of the universe will get along just fine after we’re gone.

We also don’t have too much influence while we’re here, because so much of “what it means to be human” was fixed long ago: first by the geology, chemistry, and biology of the earth we inhabit, then by our earliest neuro-wiring as humans, for things like language and community life.

Within those parameters, there’s not much room for difference or transformation– the gaps separating us 21st-century Americans from, say, ancient Egyptians count for much less than the basics we share. Seen within the 250,000-year history of humanity, Aristophanes, Shakespeare, and Amy Heckerling might as well be the same person. Ditto for Confucius, Thomas Aquinas, Mary Shelley, Karl Marx, and Rosalind Franklin.

You get my drift: Big History sure sounds like a training in resignation to all the inevitabilities that have built up over the last few hundred thousand years, not to mention the millions of years before we humans arrived. The changes that matter are bound up with enormous processes that we can’t do much about, and whatever we humans can achieve doesn’t match up against all that we can’t change. Bringing fast food workers’ wages up to $15 from the current $8?  Does that issue really amount to a hill of beans from the Big History perspective? Workers and activists should save themselves a lot of heartbreak and just accept the world as it is.

Is it unkind to suggest that a billionaire in today’s America might think that’s a great lesson to teach?

The intellectual in America: an example

Here’s an example of the intellectual’s situation in contemporary America, courtesy of the American historian of Russia Stephen F. Cohen. Cohen is about as well-established a figure as could be imagined (some details here). After thirty years as a professor at Princeton, he now teaches at NYU; he’s published lots of scholarly books and received important honors; he seems to have plenty of money and reasonable access to the media. If anyone should feel secure about expressing opinions, it’s people like Cohen.

So it’s a shocker when he tells us about self-censorship within his well-informed, well-protected milieu. Cohen’s particular concern is American policy toward Russia, an issue on which he has spoken eloquently and courageously, but the details here matter less than the intellectual climate that he describes.

In that climate, he tells us, “some people who privately share our concerns” in “the media, universities and think tanks—do not speak out at all. For whatever reason—concern about being stigmatized, about their career, personal disposition—they are silent.” As for young scholars, those “who have more to lose,” Cohen himself urges silence. He reports telling junior colleagues that “‘American dissent in regard to Russia could adversely affect your career. At this stage of life, your first obligation is to your family and thus to your career. Your time to fight lies ahead.’”

This is a seriously depressing account, because Cohen isn’t even talking about outsider radicals, ethnic leaders, or potential “extremists,” the kind of people that the New York City Police Department might put under surveillance (see here and here for examples). He’s only discussing well-trained experts like himself, who work for well-defended, rich institutions. His friends have connections, their opinions fall within the spectrum of American common sense, and the subjects they study have major-league practical relevance. After all, we really don’t want to screw up our relations with another heavily-armed nuclear power. We want to get the story straight, and critical debate contributes to doing that.

Yet fear reigns even in this corner of the academic arcadia. At a minimum, Cohen tells us, university professors wait for tenure before expressing an opinion; until then, they shut up. Many of their elders apparently continue shutting up even after the immediate pressure eases, whether because there are still career steps to climb or for more personal reasons.

In some ways, of course, this is just an updated version of an observation that Alexis de Tocqueville made long ago. “I know of no country,” Tocqueville reported in Democracy in America, “where there prevails, in general, less independence of mind and less true freedom of discussion than in America…. In America, the majority draws a formidable ring around thought. Within those limits, the writer is free; but woe to him if he dares to go outside it.”

But there’s also something more sinister in the story that Cohen tells. Tocqueville believed that American democracy explained the problems he detected. “The tyranny of the majority” (he invented the phrase) ensured that non-conforming opinions wouldn’t be heard, because Americans (metaphorically) voted on their ideas just as they (really) voted on their city councilors. But Cohen and his friends aren’t actually facing the tyranny of the majority. They’re facing instead the readiness of powerful insiders to channel discussion in specific directions, using, among other tools, their leverage over academic institutions. In other words, the old-fashioned forms of power haven’t lost their relevance in our twenty-first-century world– and even historians can feel their effects.