Editor’s note: Read part one here.
Most Americans who engage in foreign and national security policy as participants, and those who analyze it professionally, never recognize the structural/cognitive assumptions that frame what they do—or even try to recognize them. They are “fish” engaged in first or second echelon forms of analysis in which the constructed character of the subject simply never arises. This works fine most of the time, just as the residents of Plato’s Cave never have reason to complain about the quality of the shadows before them. But when foundational parameters begin to shift or decay—when the problem is no longer about risk but is rather about uncertainty—standard analysis and the policies that flow from it will not work well, not for long anyway, and the policies can even become counterproductive.
So what are the framework or structural assumptions at play here? The answer is no secret: U.S. foreign and national security policies are anchored to assumptions defined by a denatured form of Enlightenment liberalism, which itself cannot be disentangled from the origins of modernity. To see why and how the wheels are indeed coming off, we need first to replace the loose and lazy common definition of the term “modern,” which refers vaguely to whatever is contemporary, with one possessing some useful meaning.
Various definitions of modernity are on offer, but the best one for our purposes is a variation of Daniel Bell’s, proposed in The Cultural Contradictions of Capitalism (1976) as a way to distinguish modernity from the European medieval/feudal era that came before. It has three elements: the ascent of individual over communal agency; the rise of secular space not just in politics, but also in the cultural sense that the arts are autonomous from the purview of religious institutions; and the emergence of a linear teleology of progress, based on human reason, out of the cyclical “chain of being” conceptions of pre-modern times. Modernity, by this definition, emerged in one place—Western Europe—at a particular time stretched out over about two centuries.
Liberal political ideas, and the Westphalian international system that developed amid them, are clearly connected to all three of these elements of modernity. In Lockean thinking, for example, the individual is the ultimate metric of rights and responsibilities. He is Donne’s phoenix, “None of that kind, of which he is, but he.” This is where social contract thinking finds its basis. The secular postulate dispenses, as it must, with political authority based on the divine right of kings. Hence Hobbes had to be summoned to translate the “new philosophy” into politics. As to a teleology of progress, moral and material progress walk hand in hand; things can and do get better because it is within the power of human communities to privilege reason as supreme over the passions. Doing so establishes both mankind’s essential freedom and the responsibilities that go with it, progressively so as science establishes its status and social utility.
Taken together, this definitional triad of modernity birthed many signs and consequences. One was that the concept of the soul, bound up as it was in European Christendom with communalist and determinist religious beliefs, had to give way to the concept of the autonomous self—the individual whose own choices, rather than some notion of fate, shape his future. No one accelerated the movement from one to the other in the English-speaking world more than William Shakespeare, that incomparable avatar of the modern. As to the downstream social and political outcomes of modernity, its unusual optimism in due course brought forth the English Whigs, and the articulation, by Adam Smith and others, of a basis for entrepreneurial capitalism and the conceptual birth of markets. This in time summoned Max Weber to explain how it all arose from novel if seemingly uncongenial religious sources, like Calvinism, in the Protestant Reformation. And explain he did, showing how a form of this-worldly progressive secularism arose out of an epic religious disputation.
These three elements of modernity did not pop suddenly into being in the 16th and 17th centuries out of nowhere; they all had precursors in Greco-Roman and Judaic antiquity, and they all took circuitous routes under various labels to their assembly point in Western modernity. The outcome was not entirely monadic either: The Enlightenment, and the political ideas and institutions that flowed from it, differed from England to Scotland to France, and thence to America. Nor did the development suddenly cease; it continued to evolve in both cultural and political realms such that, for example, the legitimacy of the imperial principle in relations among nations ultimately gave way, toward the end of the 19th century, to the normative ideal of the “self-determined” nation-state.
As it happened, America grew into its status as a continental-scale world power as this evolution went forward, and its ideology amounts to what has been until recently a more or less stable reification of the most prominent of these liberal ideas. With the ideas came certain supporting circumstances attending their institutionalization, and of these circumstances none was so important as the global rise of the post-patrimonial state as by far the most important locus of political authority within Western society.1
It wasn’t always that way; premodern states were, as a rule, ordered and legitimated by dynastic principles, and authority within society tended to be institutionally more diffuse. The dynastic principle is a way station of sorts historically between the tribal organization of politics and the modern Weberian form. To get from the tribal to the post-patrimonial took a long time and depended on a rising capacity for abstract thinking, a quality aided by growing literacy rates. As Michael Walzer once put it,
In a sense, the union of men can only be symbolized; it has no palpable shape or substance. The state is invisible; it must be personified before it can be seen, symbolized before it can be loved, imagined before it can be conceived.2
In other words, politics’ reality status—as well as the ability of people who have never laid eyes on each other to believe that they form part of the same political community or “nation”—owes its existence to the power of metaphor, which is to say to a socially constructed mythology shared by enough people to sustain it through some chunk of time. This understanding informs Hans Kohn’s famous definition of nationalism as what happens when enough people believe they share fates in common and manage to get away with it.
Most Americans, however, have never thought of themselves as having an ideology, and certainly not as having a mythology. They have thought of themselves rather as having been blessed with knowing some eternal truths about public life that just happen to align, they suppose, with the basic moral precepts of their religious beliefs (generally unaware that these “truths about public life” actually derive from their religious beliefs). They do not think of those truths as time-bound or contingent upon historical developments—and hence, like all human productions, in general flux and perishable. They do not think of the emergence of the post-patrimonial state, their own or anyone else’s, as the outcome of a long process. Most Americans—educated elites as well as ordinary citizens—have until recently thought of these beliefs and the institutional markers that correspond to them as eternally true and ever-in-being, and most probably still do. This amounts to a faith-based assumption of secular revelation.
This is not unique to Americans, of course. It is impossible to explain Hapsburg ideology without reference to the Catholic Church, and only slightly less difficult to explain British self-conceptions without reference to the Anglican faith and the embedding of the monarchy in it. But America is special—or perhaps we should say exceptional—in this regard for its capacity to be so religiously oriented that the mere absence of an established church is somehow taken as evidence of secularity. We have no end of foreign policy “doctrines.” Our military goes on “missions.” The most common stylistic form of critical political discourse is the jeremiad. One could go on, yet Americans’ self-image is hilariously that of inhabiting a secular realm. It all serves to corroborate G.K. Chesterton’s famous observation that “America is a nation with the soul of a church.”
And of course this is why Americans have tended not to think about the role of the United States in world affairs in realist terms, much to the consternation of pragmatic policymakers from George Kennan to Henry Kissinger and beyond, but rather as a kind of passion play which, though filled with pain, deceit, and challenge, is bound eventually to end in a certain benign way. Americans have tended to believe themselves, as Reinhold Niebuhr wryly put it in 1952, “tutors of mankind in its pilgrimage to perfection.”3
In short, the American way of thinking about politics, and by extension about international politics and foreign/national security policy, has been and remains inextricably bound up with the modern as it came to be defined in Western and American history. It depends on the “constructed” tenets of individual agency, avowed secularism in both politics and culture, and, above all, a highly optimistic teleology of progress aligned with reason and its institutionalizing protégé, science. That progress, projected onto global affairs, has clear referents: a world headed away from tyranny and oppression toward democracy and human rights as defined and developed in the West. It even has a dynamic theory of interaction between the here and now and the eschaton: democratic peace theory. It doesn’t matter whether democratic peace theory can be empirically validated; it’s not a matter of (social) science but of (barely secularized) faith. It must be true; and so, to most of those who need and want it to be true, it is.
So now consider: If we have come to the end of modernity as specifically defined here—for whatever reasons out in the world and inside American society—the fundamental beliefs upon which “normal science” American foreign policy thinking depends are going to become accident-prone.4 Consider, too, that if the world behaves on balance in a way that disconfirms our beliefs about how it operates, we are liable to get frustrated and confused. U.S. policymakers, who are a subset of “we” Americans, may then determine to reinvigorate or incrementally adjust the standard policy procedures to which we have become inured, believing that we’re just not doing things right or doing enough of them, only to find that key assets in what has been a successful risk-management strategy for more than half a century end up making things not better but worse. Two such assets, or risk-management tools, are in play here: maximal global free trade (to include the freeing of capital as well as goods and services), and the use of military power (and more recently militarized “nation-building” efforts) to prevent backsliding on the escalator of historical progress.
Alas, at least insofar as it informs the international order, if not in other ways as well, modernity is ending. This does not mean that modernity is ending everywhere or in the same way; in some parts of the world modernity, as specifically defined here, is waxing, and in a few others it has never quite come to exist in the first place.5 That is certainly true of secularism: The world as a whole seems to be growing more, not less, religious. This is too obvious to belabor in the Muslim world, but it is true in many other places as well: Political theology of several sorts is waxing and secularism is waning. Also waning in many places is the aura of science as the authoritative oracle of modernity.
Modernity is certainly ending when we consider the great powers now coming to define the international system, and that is why our standard procedures are producing counterproductive outcomes. And that, at base, is why the wheels are coming off U.S. foreign policy. It’s not George W. Bush’s fault or Barack Obama’s; it’s no one’s fault in particular. Both made mistakes, to be sure, in some ways in opposite directions. But their mistakes only illustrate the collapse of the system parameters they both implicitly assumed.
With his counterproductive “forward strategy of freedom” Bush failed to heed Eric Voegelin’s advice not to “immanentize the eschaton”; every time he tried to fix or improve a state using military power and militarized nation-building as instrumentalities, the state either fell apart (Iraq) or failed to cohere (Afghanistan) or both, releasing unanticipated demons hither and yon. Obama, seeing the result, determined just “not to do stupid shit.” He probably had a better, if still vague, sense than Bush did that something major is happening in the world, but he ended up through sins of commission (Libya) and omission (Syria) doing his share of similarly stupid shit anyway. It is, at any rate, hard to argue that the world is in better shape or the United States more secure today than was the case eight years ago.
Neither of the past two Presidents seems to have recognized fully, or maybe at all, that a structural or parameter shift is in progress; this is why their failures qualify as category errors. The main but not the only characteristic of this shift is the variable-speed deterioration of the state as the pre-eminent actor in global affairs. Part three will explain.
1See Francis Fukuyama’s two-volume masterwork on this process: The Origins of Political Order (2011) and Political Order and Political Decay (2014).
2Walzer, “On the Role of Symbolism in Political Thought,” Political Science Quarterly (June 1967), p. 195n.
3Niebuhr, The Irony of American History (1952), p. 71.
4“Normal science” is of course Thomas Kuhn’s famous term in The Structure of Scientific Revolutions (1962).
5I have addressed this question specifically with regard to the Arab world in “The Fall of Empires and the Formation of the Modern Middle East,” Orbis (Spring 2016).