The first step in avoiding future quagmires is to acknowledge the hazards of high-level decision-making. Fiercely ambitious and self-confident people, Presidents and their advisors like to think of themselves as in control—of events, of outcomes, of consequences. But what, really, is in their control and what is not? More than they know—or care to admit—they are like mountaineers on the upper reaches of Mount Everest whose fixation on the summit can dull them to the dangers of their surroundings. In both instances, the vistas are breathtaking. But the air is thin, the winds are strong, the hidden crevasses are deep, and if one slips and falls, the slopes are steep. Both environments are notoriously unforgiving of mistakes and misjudgments.

The demands of high-level decision-making are intensified by “information overload” under pressure of circumstances. Social psychologists Jacob Jacoby of New York University and Carol A. Kohn and Donald E. Speller of Purdue University showed that increases in information load cause decision-makers to pay less attention to relevant data. This leads them to examine only a small proportion of available information, making it less likely that they will attend to critical facts.

On any given day, President Lyndon Johnson and his advisors dealt not just with Vietnam (as President John F. Kennedy and his advisors had dealt not just with Cuba), but troubles in Europe, China, India, and the Middle East—all of which had to be dealt with expeditiously. They confronted a myriad of problems, moreover, with incomplete information and a rapidly ticking clock, which gave them limited time for reflection. Crises, with all of their uncertainties, unknowns, and risks, had to be solved now. Such circumstances often led to reactive little decisions when creative big decisions were imperative.

There is no way to change the frenetic nature of high-level decision-making. Being bombarded with a multitude of pressing issues, day in and day out, is par for the course in the West Wing of the White House. All of this discourages the self-awareness and self-reflection necessary to offset heuristics and biases the human mind adopts when seeking solutions to problems. When faced with issues, we rarely deviate from past approaches, becoming entrenched in our own point of view and overconfident in our assumptions. A powerful antidote is to regularly ask oneself, “What if I’m wrong in clinging to my assumptions without re-examining them and in reaching conclusions without questioning them?”

Harnessing cognitive diversity is another part of the answer. One important step is to enlist a range of thinkers in the collaborative task of crafting creative solutions by networking minds to tap varied perspectives. “It’s the difference in how we think, what perspectives we bring to a problem . . . that, when combined, unlock breakthrough results,” notes Amy Wilkinson of Stanford’s Graduate School of Business. She cites the paradigmatic example of Bletchley Park, the British code-breaking center during World War II that assembled an improbable mix of crossword-puzzlers, cryptographers, engineers, linguists, and mathematicians to break the Enigma Code protecting Nazi military communications, thereby saving thousands of Allied lives.

Brainstorming can help because it separates imagination from judgment, the creative act from the evaluative one. “A brainstorming session is designed to produce as many ideas as possible to solve the problem at hand,” Fisher and Ury write. “The key ground rule is to postpone all criticism and evaluation of ideas. The group simply invents ideas without pausing to consider whether they are good or bad, realistic or unrealistic. With those inhibitions removed, one idea should stimulate another like firecrackers setting off one another.”

Having invented the widest possible range of options, decision-makers can then choose among alternatives for action. Prior to the famously successful July 1976 raid on Entebbe Airport in Uganda that ended a hostage crisis, Israeli Defense Minister Shimon Peres convened what one member called a “fantasy council” that brought together creative thinkers to consider every known option, boldly imagine others, and game out all scenarios, no matter how fanciful. Daring thinking—envisioning the unimagined—led to innovation and success. Yet it only did so because Peres saw value in gathering unconventional perspectives, and because he was both humble enough to know he might be wrong and confident enough to recognize that changing one’s mind when presented with radical ideas that contradicted his assumptions was a sign of strength.

Even when not harried by circumstances, decision-makers generally lack the capacity to think outside of hardened molds, recognize the parameters of a situation early on, and ponder consequences in a systematic and probing way. No one person can synthesize all the information around us, and even very bright people fall into mental ruts and cling to old strategies. One way to overcome these limits is to seek outside expertise from varying disciplines—for example, historians, linguists, and cultural anthropologists—whose analysis is unencumbered by the political and bureaucratic constraints of insider status, where the norm is not to rock the boat and where fear of being fired or exiled inhibits advisors from pushing decision-makers.

The sources of regional expertise, moreover, are more abundant than they were fifty years ago. America’s increasingly diverse population means that today’s Vietnam specialists, for example, are more likely to be Vietnamese-Americans, which can mean deeper, richer insights and thus better advice. There is no disputing the value of knowledge and understanding that comes from integrating expertise from across different fields. Having such outside expertise at hand requires supporting the education and recruitment of such experts early on, before problems become acute and far less tractable. Creating a bench of available outside experts is not cheap, but the cost of doing so is a tiny fraction of the costs of war. And timing is crucial. During the Vietnam War, Washington began utilizing a cadre of outside experts on Southeast Asia after it had plunged into the conflict—firefighters, rung only after the embers had become a conflagration. Much the same proved true in 2001 and 2003: Washington did not think to cultivate a cohort of outside experts on Afghan and Iraqi issues and utilize their expertise until after the invasions of Afghanistan and Iraq were underway. The time to do so is before the tinder begins to smoke. Outside experts can help dampen the embers, even to the point of extinction.

Utilizing them requires opening oneself to new ideas. Embracing innovative thinking involves breaking established routines, which most decision-makers resist because they are very busy, they are inclined to listen to their existing information suppliers, and those existing suppliers ferociously defend established procedures. Their reliance becomes institutionalized in bureaucratic processes (daily intelligence briefings, national estimates) in which organizational missions channel attention, affect the selection of information, and make it difficult to seek out and embrace new ideas. Harvard Business School professor Clayton M. Christensen has shown that innovations often begin as small-scale experiments, placing a small bet to test a big idea in a trial-and-error process. One such experiment might be to create an independent office operating under the protection of the President to explore and exploit innovative thinking. Being open to new ideas can make decision-makers more accurate in their predictions and more thoughtful in their judgments.

Another step is to adopt “a cognitive net” that puts in place a system to catch flaws of assumptions, reasoning, and thoroughness inherent in decision-making. Good decisions require looking at so many different factors in so many ways that even the smartest individual can make mistakes. Such a cognitive net would mandate communication across the board to deal with the unexpected and uncertain. This proved essential—and successful—in improving the safety of surgical procedures, as Atul Gawande of the Harvard Medical School detailed in his book, The Checklist Manifesto. A blizzard of things occurs whenever a patient is wheeled into an operating room: allergies are identified, medicines are given, anesthesia is administered, surgical instruments are laid out, equipment is prepared, specialists are summoned, among many other things. All of this can lead to overlooked errors: in one hospital, a third of appendectomy patients failed to receive the right antibiotic at the right time.

To remedy this problem, the hospital’s administrator created a verbal checklist for operating room staffs. The checklist was greeted with skepticism and resistance at first out of fear that it would consume precious time and increase an already heavy workload, but surgical teams quickly learned the benefits of orally confirming a series of steps before the first incision was made. After three months, 89 percent of appendectomy patients received the right antibiotic at the right time; after ten months, all such patients did. When a similar checklist was adopted by the World Health Organization and applied in eight hospitals in both developed and underdeveloped countries, the results were equally dramatic: major complications for surgical patients fell by 36 percent, deaths by 47 percent, and infections by almost half.

The same can be done by high-level decision-makers prone to cognitive error if they go through a checklist of steps including rigorously and ruthlessly questioning assumptions, candidly acknowledging unforeseen developments, and open-mindedly exploring the widest possible range of options. “While no one can anticipate all problems,” observes Gawande, adopting a cognitive net could allow decision-makers to “foresee where and when they might occur . . . If you got the right people together and had them take a moment to talk things over as a team rather than as individuals, serious problems could be identified and averted.” A checklist will not be foolproof and it may slow down the decision-making process up front, but unlike a haphazard process, it may encourage people to talk through hard and unexpected problems, see subtleties, flag potential traps, and thus yield wiser decisions in far less time overall. As Gawande concludes, “under conditions of complexity, not only are checklists a help, they are required for success. There must always be room for judgment, but judgment aided—and even enhanced—by procedure.”

Dealing with immensely complex problems like Vietnam demands a disciplined routine in which decision-makers acknowledge their fallibility, talk frankly with one another—most especially, share their apprehensions (which Johnson, McNamara, and the Chiefs never really did)—and adopt methodical teamwork to catch problems and increase the probability they have the critical information they need when they need it in order to craft solutions to the problems facing them. Doing so could improve decision outcomes with no increase in individual decision-makers' skills. All of this may seem obvious, but it pushes against two abiding facts of Washington life: powerful egos who believe they have the right stuff, consider themselves to be their own experts, and don’t need checklists, and the strong bureaucratic culture of turf-consciousness and turf-protection.

Perhaps the most difficult problem to overcome in decision-making is the problem of immediacy. Most policymakers unsurprisingly prioritize the short-term. They find it difficult to look beyond the moment—not the minute, but the span of a few days or weeks. Short-term thinking helps them deal with crises and rapid change, and cope with an uncertain future. But it has costs. In a large survey of corporate chief financial officers, researchers found that 80 percent of them turned down lucrative projects because accepting them would lower their companies’ quarterly earnings. A similar dynamic affects decision-makers when dealing with complex, fast-moving problems. President Johnson and his advisors fell into this trap. Preoccupied by Vietnam’s daily vexations, they paid scant attention to signals that their assumptions were dangerously obsolete or to the war’s long-term implications. When they did, it was only in fits and starts. The first of them to really do so in a systematic way, Robert McNamara, took two agonizingly long years from the fall of 1965 to the fall of 1967 to change his outlook on the war. Long-term thinking can be difficult, but it can also be transformational. What might look like weakness and failure now might later be seen as enlightened leadership.

BASF and Unilever are two cases in point. In the early 1990s, BASF decided to stop manufacturing highly profitable plastic products that contained a flame retardant suspected of causing cancer. This decision by BASF’s president, Carles Navarro, was highly unpopular with employees and shareholders and resulted in a sharp drop in revenue. After two years, however, BASF returned to the market with substitute products, using different chemicals, which eventually allowed BASF to recoup—and exceed—its prior sales. Navarro’s decision resulted in short-term pain but long-term gain.

In 2010, Unilever announced that it would henceforth release semi-annual, rather than the customary quarterly, earnings statements. The company’s share price plummeted in the wake of the announcement. But two years later, Unilever’s stock had risen 35 percent above its pre-announcement level. Through long-term action, the company had actually attracted more capital. Sometimes, looking far down the road is a necessity. As Amy Wilkinson observes, “Race-car drivers . . . go too fast to navigate by the lines on the pavement or the position of their fellow drivers. Instead, they focus on the horizon.” Decision-making at the highest level is not that different: policymakers move very fast on a shifting course and face the ever-present chance of a crash. Presidents and their advisors, like race-car drivers, must keep their eyes fixed on the horizon.


The challenges of strategic thinking are real, however. The revolution in communications technology means that problems are now identified—and on decision-makers’ plates—faster than ever, creating even greater pressure for immediate action than in the past. It’s hard to think strategically when trying to solve immediate problems. Taking the long view, moreover, is a tall order for elected leaders in a democracy, where polls and electoral accountability necessarily focus the mind and dull attention to longer-term considerations. The future casts no vote. In the day-to-day process of Washington decision-making, problems usually manifest themselves in the form of immediate pressures, and busy and harried decision-makers tend to look for correspondingly immediate solutions, quick-working remedies that tide them over a crisis. They tend to act in terms of the seen, with less attention to the unseen. They are discouraged from making politically risky decisions because the short-term pain is often obvious while the long-term gain is distant and uncertain. Johnson wrestled with this dilemma during the critical years 1964-1965, and it powerfully and fatally reinforced his short-term thinking.

But short-term thinking, as the tragic unspooling of Vietnam showed, can lead to immensely damaging and destructive consequences. If decision-makers consciously strive to weigh the effect of what they do on the more distant future, they are more likely to see their choices in a clearer light. This is true of what not to do as well. “Making ‘don’t do’ lists,” notes Wilkinson, helps “overcome hubris . . . that can hold people back.” It is well to remember that making history means understanding that history is sometimes “made” years after an action itself, and that leaders are ultimately judged not by their day-to-day choices but by the long-term consequences of their decisions.

Brian VanDeMark is a professor of history at the U.S. Naval Academy.

This excerpt from Road to Disaster: A New History of America’s Descent into Vietnam (Harper Collins, 2018) is used with permission.