The rumor was he’d killed an Iraqi soldier with his bare hands. Or maybe bashed his head in with a radio. Something to that effect. Either way, during inspections at Officer Candidates School, the Marine Corps version of boot camp for officers, he was the Sergeant Instructor who asked the hardest, the craziest questions. No softballs. No, “Who’s the Old Man of the Marine Corps?” or “What’s your first general order?” The first time he paced down the squad bay, all of us at attention in front of our racks, he grilled the would-be infantry guys with, “Would it bother you, ordering men into an assault where you know some will die?” and the would-be pilots with, “Do you think you could drop a bomb on an enemy target, knowing you might also kill women and kids?”
When he got to me, down at the end, he unloaded one of his more involved hypotheticals. “All right candidate. Say you think there’s an insurgent in a house and you call in air support, but then when you walk through the rubble there’s no insurgents, just this dead Iraqi civilian with his brains spilling out of his head, his legs still twitching and a little Iraqi kid at his side asking you why his father won’t get up. So. What are you going to tell that Iraqi kid?”
Amid all the playacting of OCS—screaming “Kill!” with every movement during training exercises, singing cadences about how tough we are, about how much we relish violence—this felt like a valuable corrective. In his own way, that Sergeant Instructor was trying to clue us in to something few people give enough thought to when they sign up: joining the Marine Corps isn’t just about exposing yourself to the trials and risks of combat—it’s also about exposing yourself to moral risk.
I never had to explain to an Iraqi child that I’d killed his father. As a public affairs officer, working with the media and running an office of Marine journalists, I was never even in combat. And my service in Iraq came during a time when things seemed to be getting better. But that period was just one small part of the disastrous war I chose to have a stake in. “We all volunteered,” my friend Elliot Ackerman, a five-tour Marine veteran, said to me once. “I chose it and I kept choosing it. There’s a sort of sadness associated with that.”
As a former Marine, I’ve watched the unraveling of Iraq with a sense of grief, rage, and guilt. As an American citizen, I’ve felt the same, though when I try to trace the precise lines of responsibility of a civilian versus a veteran, I get all tangled up. The military ethicist Martin Cook claims there is an “implicit moral contract between the nation and its soldiers,” which seems straightforward, but as the mission of the military has morphed and changed, it’s hard to see what that contract consists of. A decade after I joined the Marines, I’m left wondering what obligations I incurred as a result of that choice, and what obligations I share with the rest of my country toward our wars and to the men and women who fight them. What, precisely, was the bargain that I struck when I raised my hand and swore to defend my country against all enemies, foreign and domestic?
It was somewhat surprising (to me, anyway, and certainly to my parents) that I wound up in the Marines. I wasn’t from a military family. My father had served in the Peace Corps; my mother was working in international medical development. If you’d asked me what I wanted to do, post-college, I would have told you I wanted to become a career diplomat, like my maternal grandfather. I had no interest in going to war.
Operation Desert Storm was the first major world event to make an impression on me—though to my seven-year-old self the news coverage showing grainy videos of smart bombs unerringly finding their targets made those hits seem less a victory of soldiers than a triumph of technology. The murky, muddy conflicts in Mogadishu and the Balkans registered only vaguely. War, to my mind, meant World War II, or Vietnam. The first I thought of as an epic success, the second as a horrific failure, but both were conflicts capable of capturing the attention of our whole society. Not something struggling for air-time against a presidential sex scandal.
So I didn’t get my ideas about war from the news, from the wars actually being fought during my teenage years. I got my ideas from books.
Reading novels like Joseph Heller’s Catch-22, or Tim O’Brien’s The Things They Carried, I learned to see war as pointless suffering, absurdity, a spectacle of man’s inhumanity to man. Yet narrative nonfiction told me something different, particularly the narrative nonfiction about World War II, a genre really getting off the ground in the late-90s and early aughts. Perhaps this was a belated result of the Gulf War, during which the military seemed to have shaken off its post-Vietnam malaise and shown that, yes, goddamn it, we can win something, and win it good. Books like Stephen Ambrose’s Band of Brothers and Tom Brokaw’s The Greatest Generation went hand-in-hand with movies like Saving Private Ryan to present a vision of remarkable heroism in a world that desperately needed it.
In short, my novels and my histories were sending very mixed signals. War was either pointless hell, or it was the shining example of American exceptionalism. In middle school, I’d read Ambrose’s Citizen Soldiers, about the European Theater in World War II. More than anything else, it was the title that stayed with me, the notion of service in a grand cause as the extension of citizenship. I never bothered to consider that the mix of draftees and volunteers who served in World War II wasn’t so different from the mix of draftees and volunteers who served in Vietnam, or that the atrocities committed in that war were no less horrific than those committed in Vietnam, though no one was likely to write a best-selling book about Vietnam entitled Citizen Soldiers. The title appealed to me. Deeply. But I didn’t see any grand causes in the 1990s, just a series of messy, limited engagements. Of course, in the history of American warfare, from the Indian Wars to the Philippines to the Banana Wars, it was the grand causes that were the anomalies, not the brushfire wars at the edge of empire.
Then 9/11 happened. We all have our stories of where we were that day. Mine is that I was in the woods, hiking the Appalachian Trail. As my little group of hikers scrambled over the rough paths we kept running into people telling stories of planes hitting the World Trade Center. It sounded preposterous, the sort of rumor that could easily spread in an isolated place, in the days before everybody had a smartphone. But we kept hearing the story, in ever more detail, until it became clear—particularly for those of us from New York—that we had to leave the woods.
I can’t say that I joined the military because of 9/11. Not exactly. By the time I got around to it, the main U.S. military effort had shifted to Iraq, a war I’d supported, though one I never associated with al-Qaida or Osama bin Laden. But without 9/11, we might not have been at war there, and if we hadn’t been at war, I wouldn’t have joined.
It was a strange time to make the decision, or at least, it seemed strange to many of my classmates and professors. I raised my hand and swore my oath of office on May 11, 2005. It was a year and a half after Saddam Hussein’s capture. The weapons of mass destruction had not been found. The insurgency was growing. It wasn’t just the wisdom of the invasion that was in doubt, but also the competence of the policymakers. Then-Secretary of Defense Donald Rumsfeld had been proven wrong about almost every major post-invasion decision, from troop levels to post-war reconstruction funds. Anybody paying close attention could tell that Iraq was spiraling into chaos, and the once jubilant public mood about our involvement in the war, with over 70 percent of Americans in 2003 nodding along in approval, was souring. But the potential for failure, and the horrific cost in terms of human lives that failure would entail, only underscored for me why I should do my part. This was my grand cause, my test of citizenship.
Citizen-soldiers versus “base hirelings”
The highly professional all-volunteer force I joined, though, wouldn’t have fit with the Founding Fathers’ conception of citizen-soldiers. They distrusted standing armies: Alexander Hamilton thought Congress should vote every two years “upon the propriety of keeping a military force on foot”; James Madison claimed “armies kept up under the pretext of defending, have enslaved the people”; and Thomas Jefferson suggested the Greeks and Romans were wise “to put into the hands of their rulers no such engine of oppression as a standing army.”
They wanted to rely on “the people,” not on professionals. According to the historian James Thomas Flexner, at the outset of the Revolutionary War George Washington had grounded his military thinking on the notion that “his virtuous citizen-soldiers would prove in combat superior, or at least equal, to the hireling invaders.” This was an understandably attractive belief for a group of rebellious colonists with little military experience. The historian David McCullough tells us that the average American Continental soldier viewed the British troops as “hardened, battle-scarred veterans, the sweepings of the London and Liverpool slums, debtors, drunks, common criminals and the like, who had been bullied and beaten into mindless obedience.”
Even lower in their eyes were the Hessian troops the British had hired to fight the colonists, who were commanded by Lieutenant-General Leopold Philip von Heister. A veteran of many campaigns, von Heister had crankily sailed over from England, touched shore, “called for hock and swallowed large potations to the health of his friends,” and then, apparently, set out trying to kill Americans.
There’s a long tradition of distrust for mercenaries, from Aristotle claiming they “turn cowards … when the danger puts too great a strain on them” to Machiavelli arguing they’re “useless and dangerous … disunited, ambitious and without discipline, unfaithful, valiant before friends, cowardly before enemies,” and the colonists would likely have agreed with such assessments. Mercenaries were at the bottom of the hierarchy of military excellence, citizen-soldiers at the top. We can see this view reflected in George Washington’s message to his soldiers before the first major engagement of the Revolutionary War, the Battle of Long Island:
Remember, officers and Soldiers, that you are Freemen … Remember how your Courage and Spirit have been despised, and traduced by your cruel invaders, though they have found by dear experience at Boston, Charlestown and other places, what a few brave men contending in their own land, and in best of causes can do, against base hirelings and mercenaries.
This was in August 1776, and Washington’s 19,000 men were about to see whether their civic virtues would triumph over British military skill. The American line stretched out across central Brooklyn, with British troops advancing from the south and the east. Though there was skirmishing during the day on August 26, the real fighting began the next morning when a column of Hessians marched up Battle Pass, in modern-day Prospect Park.
What followed was a disaster. In the unkind phrasing of historian W.J. Wood, “Washington and his commanders … performed like ungifted amateurs,” and that’s exactly how the Hessian mercenaries viewed them. “The rebels had a very advantageous position in the wood,” wrote one Hessian soldier, “but when we attacked them courageously in their hiding-places, they ran, as all mobs do.” Colonel Heinrich von Heeringen, the commander of a Hessian regiment, wrote, “The riflemen were mostly spitted to the trees with bayonets. These frightful people deserve pity rather than fear.” And looking over those he’d captured, von Heeringen sneered, “among the prisoners are many so-called colonels, lieutenant-colonels, majors, and other officers, who, however, are nothing but mechanics, tailors, shoe-makers, wig-makers, barbers, etc. Some of them were soundly beaten by our people, who would by no means let such persons pass for officers.”
It was a rough education for Washington. At the close of the war he would submit to Congress his Sentiments on a Peace Establishment, which noted that “Altho’ a large standing Army in time of Peace hath ever been considered dangerous to the liberties of a Country, yet a few Troops, under certain circumstances, are not only safe, but indispensably necessary.” Congress, however, rejected the idea of even a modest standing army for the nation, its only concession being to keep one standing regiment and a battery of artillery. The rest of the new nation’s defense would rely mostly on state militias. Hence the Second Amendment. This idealistic vision of militias as a bulwark of democracy would soon face a harsh reality check.
In this case, it was not the British, but the Western Confederacy of American Indians who’d give the Americans their comeuppance. Mixed units of American regulars and militiamen had been fighting these tribes throughout the early 1790s. The first campaign, led by General Josiah Harmar, was meant “to chastise the Indian Nations who have of late been so troublesome.” Today, the campaign is known as Harmar’s Defeat, which tells you all you really need to know about whether or not that happened. The individual battles within that campaign don’t have much better titles. There’s Hardin’s Defeat, Hartshorn’s Defeat, the Battle of Pumpkin Fields. This last doesn’t sound so bad, until you learn that it supposedly got its name not because it was fought in a pumpkin field, but because the steam from the scalped skulls of militiamen reminded the victorious American Indians of squash steaming in the autumn air.
Harmar was succeeded by General Arthur St. Clair, who, though rather old, rather fat, and afflicted with gout, set out with “sanguine expectations that a severe blow might be given to the savages yet.” His poorly trained, undisciplined men engaged an equal-sized force at the Battle of the Wabash in November 1791, also known by the considerably more evocative title, the Battle of a Thousand Slain. What followed was the worst military disaster of U.S. history. Of St. Clair’s 920 troops, 632 were killed and 264 wounded, a casualty rate of just over 97 percent. Congress, finally conceding that professionalism did count for something, bowed to the creation of a standing army beyond absolute bare bones.
Of course, the creation of the Army hardly ended the complicated relationship Americans had with professional soldiers. When we come to the Civil War, the first war in which we instituted a national draft, none other than Ulysses S. Grant would call the professional soldiers who’d manned the Army prior to the war “men who could not do as well in any other occupation.” Naturally, he was not talking about his own men, fine citizen-soldiers who “risked life for a principle … often men of social standing, competence, or wealth and independence of character.” It took a grand cause, then, like the Civil War, for military service to count as a civic virtue.