In every age, some of the world’s leading thinkers have argued that the trajectory of humanity is a steady, even inevitable, advance toward ever-greater prosperity, peace, and moral enlightenment. In reality, the undeniable progress that humanity has made over the millennia has frequently been disrupted, even reversed, by catastrophe and collapse. In our competitive and anarchic world, the relationships between states and peoples have repeatedly been punctured by horrific breakdowns of peace and security. Societies are upended and even destroyed; human suffering unfolds on an epic scale; the world’s most advanced nations descend into depravity; the accumulated achievements of generations crumble amid shocking violence. From the Peloponnesian War in the fifth century B.C.E. to the world wars of the 20th century, the history of international affairs has often seemed a monument to tragedy.
If tragedy is a curse for those who endure it, it can be a blessing for those who draw strength and wisdom from it. The memory of tragedy has often impelled the building of international orders that have succeeded—if only for a time—in holding the forces of upheaval at bay. In the wake of great geopolitical crackups like the Thirty Years’ War and the wars of the French Revolution, leading statesmen have found the foresight to create new systems of rules to regulate the relationships between states, and—just as critically—to erect the stable balances of power that sustain them. Driven by painful experience, they have accepted the geopolitical hardships necessary to avoid the far greater costs of a return to upheaval. Many of the great diplomatic achievements of the modern era—the Peace of Westphalia, the Concert of Europe, and others—have rested on such an understanding. Ralph Waldo Emerson captured the basic ethos: “Great men, great nations, have not been boasters and buffoons, but perceivers of the terror of life, and have manned themselves to face it.”
There is, however, another kind of response to tragedy. If knowledge of tragedy can have an invigorating effect on those willing to fully profit from its lessons, it can also be enervating, even crippling to effective statecraft. After all, great efforts and prolonged exertions can ultimately lead to exhaustion and cause nations to flinch from the necessary application of power. Too much experience with a tragic world can tempt leaders and citizens to seek refuge in withdrawal, appeasement, or utopianism. Such human impulses are understandable enough after a period of trauma. Yet when they morph into an unwillingness to defend an existing order under assault, the results can themselves be tragic.
In this regard, the aftermath of World War I stands as the cautionary example. That conflict caused a greater spasm of violence than any previous upheaval, and inspired a near universal conviction that such carnage must never happen again. Yet the years thereafter did not see an effective order-building project in the mold of Westphalia or the Concert of Europe. Rather, they saw a well-meaning but quixotic attempt to escape the harsh constraints of power politics, followed by a catastrophic paralysis in the face of rising dangers.
The embodiment of the first tendency was Woodrow Wilson. Wilson was hardly the only person who believed that World War I must be “the war to end all wars”: The rapturous public reception he received in Europe and elsewhere after the war testifies to the widespread popularity of his ideas. But he was surely the most eloquent advocate of that conviction. Wilson had no lack of appreciation for tragedy, and his vision for the postwar world was deeply rooted in his revulsion at the great horror that had befallen humanity in this “most terrible and disastrous of all wars.” His solution, breathtaking in its ambition, was to create a fundamentally new world order that would allow humanity to break free of the depravities that, he believed, had ushered in such a cataclysm in the first place.
In his Fourteen Points speech in January 1918, Wilson promoted what we would now call a liberal international order—one that sought to address the perceived causes of instability and aggression by promoting national self-determination and disarmament, enshrining a liberal trading system and freedom of the seas, strengthening international law, and creating a global organization that would arbitrate grievances and thwart conquest. Most importantly, Wilson shunned the idea that statecraft should consist of the search for equilibrium and the pursuit of national self-interest, arguing instead that the world’s nations must stand on moral principle and practice collective security. “There must be, not a balance of power, but a community of power,” he told the Senate in 1917; “not organized rivalries, but an organized common peace.”
This “community of power” sounded, at least superficially, somewhat similar to what had emerged after Westphalia and Vienna. It also featured an unprecedented leadership role for the United States not just as the conscience of humanity, but as a coordinator and convener of collective action. Crucially, however, the primary currency of power in Wilson’s new order would shift from military force to reason and morality. “We are depending primarily and chiefly upon one great force, and that is the moral force of the public opinion of the world,” Wilson informed his fellow leaders at the Versailles peace conference. If coercion was required, it would be undertaken on behalf of humanity as a whole through the unanimous action of an international community. There could be no going back, no return to the old ways of secret diplomacy, shifting coalitions, and cold-eyed geopolitical competition. For Wilson, a world in which common rules could be identified and accepted, international moral opinion could restrain threats, and nations could cooperate on the basis of the global good was the prerequisite for escaping future tragedies. Once this true peace was achieved, he promised, “Men in khaki will not have to cross the seas again.”
At Versailles, however, Wilson’s desire for a transformative peace collided both with his own animus against German militarism and with the desires of America’s European allies—namely France—for a more punitive settlement. For French Prime Minister Georges Clemenceau, the cause of World War I was not the existence of the balance of power, but its breakdown under pressure from a rising Germany. The solution was to reduce German power and aggressively enforce that outcome over time. “If we have no means of imposing our will,” he warned, “everything will slip away bit by bit.”
The resulting settlement was an awkward hybrid. The Treaty of Versailles saddled Germany with the blame for World War I, while also seeking to contain future German militarism through restrictive measures. The treaty adjusted territorial boundaries in Europe in an attempt to create geopolitical buffers around Germany, authorized the Allied occupation of the Rhineland for up to 15 years, and stripped Germany of its overseas possessions. It called for strict curbs on Germany’s armed forces and required the German government to pay reparations to the Allies.
Yet the treaty was not as harsh as sometimes believed, because it neither permanently dismembered Germany nor permanently crushed its economic capacity. The treaty, moreover, aimed to do much more than just punish Germany, because it reflected Wilson’s spirit and many of his guiding ideas. Among other things, the treaty provided for an unprecedented degree of national self-determination within Europe; it essentially codified the destruction of four European empires by blessing the emergence of smaller independent states. Most notably, the treaty created the League of Nations, a body that built on earlier precedents and ideas yet nonetheless represented a revolutionary effort to forge an international community dedicated to confronting aggressors and preserving the peace. “The treaty constitutes nothing less than a world settlement,” Wilson declared upon his return to America in July 1919. It marked a visionary effort “to get away from the bad influences, the illegitimate purposes, the demoralizing ambitions, the international counsels and expedients out of which the sinister designs of Germany had sprung as a natural growth.” The trouble, however, was that the settlement Wilson did so much to shape contained the seeds of future upheavals, precisely because it—like the President himself—was not attentive enough to the tragic geopolitics he aimed to escape.
The settlement left Germany deeply embittered but mostly intact and therefore only temporarily constrained—a combination that practically ensured future revisionism. In fact, Germany’s geopolitical position had arguably been enhanced by the end of the war. Before 1914, Germany had been surrounded by great powers: the Russians, the Austro-Hungarians, and the French. By 1919, the Communist Revolution in Russia and the breakup of the Austro-Hungarian Empire had left an exhausted France as Germany’s only formidable neighbor. The triumph of self-determination, meanwhile, was simply encouraging German revanchism: first, by surrounding Germany with weak states in the east; and second, by giving its future leaders a pretext for seeking to assert control over foreign lands—in Austria, Czechoslovakia, and Poland—where ethnic Germans were numerous.
For its part, the League of Nations was an indisputably progressive effort to safeguard the peace, but it also suffered from critical flaws. In particular, it left the two most powerful European countries—Germany and the Soviet Union—on the outside of a settlement they had great incentive to disrupt. Moreover, its collective security role hinged on the assumption that its leading members could act unanimously in the face of aggression, a Wilsonian conceit that would prove impossible to realize. Two earlier postwar settlements—the Peace of Westphalia and the Concert of Europe—had proven comparatively durable because they rested on both a commitment to shared values and a stable geopolitical foundation. The post–World War I settlement, by contrast, possessed neither, and was thus biased toward revanchism and instability. “This is not a peace,” Marshal Ferdinand Foch, the Supreme Allied Commander during World War I, declared. “It is an armistice for 20 years.” When the U.S. Senate declined to ratify American participation in the League, in part because of Wilson’s obstinate refusal to accept any conditions on U.S. involvement, the postwar system became more precarious still.
That rejection was the product of another type of American escapism in the interwar era—the tendency to withdraw at a time when there appeared to be no immediate threats to U.S. security. Domestic opposition to the League and other parts of the Versailles settlement arose from a variety of concerns: that they would undermine U.S. sovereignty, usurp Congress’s constitutional prerogatives with respect to declaring war, and abrogate the tradition of strategic non-entanglement in Europe. Underlying all this, however, was a sense of strategic complacency brought on by the fact that, with Germany’s defeat, geopolitical dangers to America seemed to have retreated far over the horizon. Had Wilson been more of a political realist, he might nonetheless have salvaged a compromise with the treaty’s more moderate opponents and thereby preserved a strong, if modified, American leadership role in the order he sought to create. In the event, however, the combination of domestic reluctance and Wilsonian intransigence ensured that the Senate eventually rejected American participation in the League. The United States would stay deeply involved economically in Europe during the 1920s, but it never committed strategically either to the community of power Wilson envisioned or to a more traditional balance of power that might have better underwritten the peace.
These escapist tendencies persisted into the interwar era, with mostly pernicious results. Wilson’s League may have been defeated at home, but his core ideas remained influential both in the United States and abroad. Indeed, leading thinkers often found Wilson’s thesis more persuasive than Clemenceau’s—they argued that the problem was not that the balance of power had collapsed but that such a mechanism had ever been relied upon. They therefore determined to set aside the traditional instruments of statecraft in hopes that moral pressure and communal adherence to liberal principles would make war a thing of the past. This movement was exemplified by the myriad disarmament conferences that followed World War I, and by the signing of the Kellogg-Briand Pact of 1928, which outlawed war as an instrument of national policy. “This should be a day of rejoicing among the nations of the world,” the Washington Star opined after the conclusion of that agreement. War, it appeared, was being banished into illegality.
George Kennan would later describe this period of American statecraft as “utopian in its expectations, legalistic in its concept of methodology, moralistic in the demands it seemed to place on others, and self-righteous in the degree of high-mindedness and rectitude it imputed to ourselves.” War was no longer to be prevented through deterrence, alliances, and the willingness to use force, but through the willingness to abjure precisely these measures. Other Americans, disillusioned by the failure of the postwar settlement to live up to Wilson’s grand ambitions, or simply convinced that the geopolitical sky would remain cloudless for years to come, were happy to “return to normalcy” and steer clear of European security matters. All of these impulses—idealism, cynicism, and disengagement—were understandable responses to World War I. All, unfortunately, did more to weaken than fortify the constraints on future aggression.
The same could be said of another response to the tragedy of World War I—the democratic powers’ unwillingness to forcefully resist growing challenges to the settlement they had created. During the 1920s, memories of the last war were strong, but the dangers of the next one still seemed largely hypothetical. Over the course of the 1930s, the international landscape darkened. The world sank into depression; protectionism ran rampant as international cooperation collapsed and nations pursued beggar-thy-neighbor policies. More ominous still, aggressive authoritarianism returned in Europe and Asia alike.
Radical ideologies flourished in some of the most powerful states on earth; the fascist nations armed themselves and used violence and coercion to alter the status quo from Manchuria to Central Europe. One by one the advances accumulated; slowly but unmistakably the geopolitical balance shifted against the democratic powers. Despite all this, the democracies often seemed frozen, unable to stir themselves to multilateral action or an effective response. The United States remained mostly geopolitically absent as the situation in Europe progressively worsened; the other Western democracies mostly sought to avoid confrontation until 1939, after Hitler had built up great strength and momentum. As Joseph Goebbels, Hitler’s propaganda chief, later remarked, “They let us through the danger zone. . . . They left us alone and let us slip through the risky zone and we were able to sail around all dangerous reefs. And when we were done and well armed, better than they, then they started the war.”
Far from moving aggressively to thwart the revisionist powers, the democracies often handcuffed themselves strategically. The French adopted a military system that made it nearly impossible to use force absent general mobilization; that requirement, in turn, made even the limited use of force nearly inconceivable in the 1930s. The British slashed real defense expenditures to pay for the rising costs of social services. In absolute terms, the money spent on the army and navy hardly increased between 1913 and 1932, despite the vast diminution of purchasing power caused by two decades’ worth of inflation. Into the early 1930s, defense budgets reflected the assumption that no major conflict would occur for at least a decade—a rule that gave London tremendous incentive to avoid such a confrontation.
The interwar statesmen were not cowards or fools. There were many reasons, all seemingly plausible at the time, why the democracies adopted a posture that appears so disastrously naïve and misguided in retrospect. Collective action was hard to organize amid divergent national interests and the economic rivalries caused by depression and protectionism. Feelings of guilt that the postwar peace had been too harsh discouraged confrontation, while budgetary pressures and desires for normalcy inhibited rearmament. There persisted a strong Enlightenment belief in the power of dialogue and diplomacy to resolve disagreements. Even in the late 1930s, British Prime Minister Neville Chamberlain would say that “if we could only sit down at a table with the Germans and run through all their complaints and claims with a pencil, this would greatly relieve all tensions.” And, as is often the case in international politics, citizens and leaders found it difficult to understand how crises occurring in faraway places, or involving seemingly abstract principles such as non-aggression, really mattered to their own security.
Yet the most fundamental factor was simply that all of the democratic powers were deeply scarred by memories of what had come before and seized with fear that another great conflict might occur. Upon returning from Versailles in 1919, Walter Lippmann had concluded that “we seem to be the most frightened lot of victors that the world ever saw.” Throughout the interwar period, the haunting memory of World War I hung over the Western powers, menacing them with visions of new destruction should conflict return.
Central to these fears were the jaded interpretations of World War I that increasingly took hold in the 1920s and 1930s. In the United States, historical revisionism took the form of accusations that the “merchants of death”—the arms industry and the financial sector—had manipulated America into joining a costly war that did not serve its national interests. By 1937, a full 70 percent of Americans polled believed that entering the war had been a mistake. In Europe, a generation of disillusioned observers argued that the great nations of the world had stumbled into a catastrophic conflict that none of them had wanted or fully anticipated, and from which none of them had benefited. As David Lloyd George wrote in his Memoirs, “The nations slithered over the brink into the boiling cauldron of war without any trace of apprehension or dismay.” According to this interpretation, a willingness to act boldly in the face of crisis led not to stability and deterrence but to a deadly escalatory spiral. The implication was that the greatest risk of another awful conflagration lay in overreacting rather than under-reacting to threats.
Indeed, World War I had been so searing an experience—even for the victors—that it convinced many thinkers and statesmen that nothing could be worse than another major struggle. Stanley Baldwin, three times Prime Minister of Britain between 1923 and 1937, thought that the war had demonstrated “how thin is the crust of civilisation on which this generation is walking,” and he frequently declared that another conflict would plunge the world into an unrecoverable abyss. This attitude permeated Western society and politics in the years preceding World War II.
It was evident in the infamous resolution of the Oxford Union in 1933 that its members would fight for neither king nor country, and in the profusion of antiwar literature that emerged on both sides of the Atlantic in the 1920s and 1930s. “They wrote in the old days that it is sweet and fitting to die for one’s country,” Ernest Hemingway wrote in 1935. In modern war, however, “You will die like a dog for no good reason.” It was evident in the series of Neutrality Acts passed by the U.S. Congress out of conviction that the greatest danger to America was not passivity but entanglement in another European war. It was evident in France’s reluctance to use or even threaten force against Hitler when his troops reoccupied the Rhineland in 1936, despite the extreme weakness of Berlin’s position at that time.
Finally, it was evident in the crippling fear that the result of another war would be to lose another generation of soldiers in the fields of France and a great mass of civilians to indiscriminate terror attacks from the air. British Foreign Secretary Lord Halifax put the basic attitude bluntly in explaining the government’s reluctance to push Germany too hard, stating that “he could not feel we were justified in embarking on an action which would result in such untold suffering.” Or as Neville Chamberlain stated, more infamously, at the time of the Munich crisis, “How horrible, fantastic, incredible it is that we should be digging trenches and trying on gas masks here because of a quarrel in a faraway country between people of whom we know nothing.” Tragedy, for the interwar generation, was not a source of resolve in the face of danger. It was an inducement to an inaction that contributed, in its turn, to still greater horrors to come.
The great order-building achievements of the modern era have flowed from the fact that leading powers were able to turn an acquaintance with tragedy into the mixture of diplomatic creativity and strategic determination necessary to hold dangerous forces at bay. The great failure of the interwar period was that the democracies were too often paralyzed by the past. Donald Kagan concluded his sweeping book On the Origins of War and the Preservation of Peace with the declaration that “a persistent and repeated error through the ages has been the failure to understand that the preservation of peace requires active effort, planning, the expenditure of resources, and sacrifices, just as war does.” This is a lesson that too many in the interwar era forgot in their efforts to escape, rather than confront, the tragic patterns of global politics. In doing so, however, they helped ensure that their post–World War II successors would not make the same mistake.