The 20th century was famously called “the American century”, yet it acquired that name in an improbable way. The phrase was not used until Time publisher Henry Luce coined it in a special issue of Life magazine in 1941—by which time 40 percent of the 20th century had already passed. Moreover, 1941 was a year in which the superiority of America and of the American way of life appeared decidedly problematic. Only the year before had the United States finally exited, statistically speaking, the decade of the Great Depression. Nazi Germany’s armies occupied most of Europe, stretching from the Atlantic coast of France to the heartland of the Soviet Union. At the same time, Imperial Japan’s armies occupied most of East Asia, stretching from Manchuria through much of China to Indochina. No objective observer could have been blamed for entertaining a whiff of pessimism about America’s prospects.
Nevertheless, Luce was truly prescient. By the end of the 20th century, it was widely acknowledged that the century had, indeed, been an American one. Certainly, no other power and way of life could claim that title. Moreover, as the 20th century passed into the 21st, it seemed reasonable and even self-evident to say that the 21st century, too, would be an American century. In the first couple of years of this century there was a little boom in the publication of books and articles—some admiring, some disparaging—that even went so far as to proclaim an American empire. Then, in an amazingly short time, a relentless series of events—almost a staccato burst—punctured this centennial and imperial dream: the 9/11 attacks, the setbacks of the Iraq War and then of the Afghan War, and particularly the American-originated global economic crisis and Great Recession of 2008–09. The frustrations in Iraq and Afghanistan have largely discredited the American reputation for high moral character and judicious strategic judgment, and the global economic crisis has largely discredited the longstanding U.S. globalization project. More generally, these burdens have raised questions about the applicability abroad of such fundamental American values as liberal democracy and free markets, of “the American way” and “the Washington Consensus.”
At the same time that confident visions of a second American century and a new American empire (whether benign or not) have dissipated, another great power with its own distinctive culture and way of life has been steadily rising. In the past decade, China’s ascent has neatly paralleled America’s descent. And so, in the autumn of 2009—one year into the global economic crisis—no one is making a convincing case that the 21st century could still become an American one. Conversely, amid rather a lot of declinist muttering, there is already thoughtful commentary to the effect that this century is more likely to be seen in retrospect as having been a Chinese one.1
I doubt it. The United States can still be the most prominent—although not dominant—of the great powers, and it can still offer the most attractive way of life. But to do this, America will have to become more American than it has been in recent years. This means it will have to renovate or reinvent certain pillars that raised the United States to the heights of global power and prosperity in the second half of the 20th century. These pillars remain the only solid and enduring supports for a prominent American role in the 21st century, so we need to be clear about what they are.
Pillars of the First American Century
When discussing power, most international affairs analysts reasonably focus upon military power (“hard power”), in this case America’s large-scale and high-tech military forces. The United States first achieved supremacy in vast conventional forces (World War II), then in nuclear weapons (most of the Cold War), and most recently in information-age warfare (the “Revolution in Military Affairs” that began in the 1980s). And when gauging the attractiveness of the American way of life, many analysts focus upon particular American ideas and ideals, or ideological power (“soft power”)—in this case, liberal democracy, free markets and the open society. These ideas and ideals have been grouped together and advanced under a variety of slogans, some meant to encompass the globe, some of more limited scope, some emphasizing political and others economic aspects of ideology, to wit: “the Free World”, “the Alliance for Progress”, the idea of “universal human rights”, “the Washington Consensus”, “the Freedom Agenda.”
There is no doubt that both military power and ideological power were central pillars of the first American century. However, the essential base for these, and for all power in international affairs, remains economic power. (This may sound like economic reductionism or even Marxism, but it is not: I do not argue that economic power is sufficient to account for supremacy in world affairs, only that it is necessary). Economic power in turn entails strength in three component dimensions: industrial, financial and technological (in other words, manufacturing, banking and innovation). During the first American century, which spanned from the high industrial era to the early information era, the United States obviously led the world in each of these three dimensions.
Industrial superiority. Throughout the 20th century, the United States was the largest industrial or manufacturing economy in the world. Its industrial products were generally competitive in world markets, with the U.S. economy earning substantial foreign exchange from their export. Although the United States lost its competitive advantage successively in older industrial sectors like steel, automobiles and consumer electronics, it demonstrated an extraordinary capacity to innovate whole new industrial sectors like aerospace, computers and telecommunications, each of which then gave the United States a new competitive edge in world markets for several decades.
Of course, many economists argue that as an economy becomes more sophisticated, it can leave behind its manufacturing component altogether and simply move upward into a variety of service sectors (of which finance is one). This argument is partly correct. However, although an economy may cease to produce industrial goods, it will continue to consume them, just as it may earlier have ceased to produce agricultural goods while continuing to consume them. Indeed, as an economy becomes more developed and richer, it may consume even more industrial products than it did before, and these products have to come from somewhere. They have to be imported, and these imports have to be paid for with exports, which would now have to come in the form of services. But only some services are exportable (“internationally tradable”), of which finance is the most important. Others have turned out to be importable: Advanced service economies are now importing services as well as manufactures, as seen in the outsourcing of data processing and telephone call centers from the United States to India.
The real issue in economic development is not the simple move from manufacturing to services, but rather the more complex move from older, static sectors that are no longer capable of generating export earnings to newer, dynamic sectors that can. Moreover, these new sectors have to be of sufficient scale to cover the costs of all those industrial products now being imported. Some of them might have industrial features, such as the new products of the renewable-energy and biotechnology sectors; some might have service features, such as new processes in the medical field.
Financial superiority. During much of the 20th century, the United States was a creditor nation. It achieved this partly because of its vast foreign-exchange earnings, but also because of political stability (and therefore political predictability) that led to the U.S. dollar becoming the principal international reserve currency. With its own vast amounts of capital and with foreign investors having great confidence in the stability of both the U.S. dollar and U.S. banks, the United States was overwhelmingly the world’s leading financial power during most of the 20th century.
Technological superiority. The reason the United States could continually create new industrial sectors was that, for most of the 20th century, it was also the leader in developing new technologies. As late as the 1930s, scientists and engineers in other nations (especially in Britain and Germany) might introduce some new invention, but then Americans would take the lead in expanding this invention into a new innovation and expanding this innovation into a new industry. With World War II, Americans also assumed the lead in new inventions, a lead which has largely continued down to the present.
American technological superiority has been grounded in several unique or unusual features. Most obviously, the United States has long had the largest—and since World War II also the best—university system in the world.2 This has provided a vast pool of scientists and engineers to develop new inventions and innovations. Second, the American free market system has enabled entrepreneurs to harness these new inventions and innovations to build new industries. Indeed, the combination of advanced universities and energetic entrepreneurs (often headquartered together in metropoles like Boston and the San Francisco Bay Area, home of Silicon Valley) has birthed virtually all the new industrial sectors created in the United States since World War II. Third, the U.S. general population long led the world in average educational level. Although this advantage has disappeared in the past two decades, it held during most of the 20th century. This educated general population provided a plentiful supply of efficient and productive workers for the new industrial sectors.
From Economic to Military Superiority
The great strength of the American economy enabled the United States to possess great military power, as well. The immense U.S. industrial capacity that existed in 1941, even after a decade of Great Depression, soon overwhelmed Nazi Germany and Imperial Japan with hitherto unimaginable quantities of tanks, artillery, warships, transports, bombers and fighters. Military historians generally acknowledge that the German Army and the Japanese Army were both superb at the level of military operations or “operational art.” But the U.S. military trumped their advantage with its own in materiel and logistics. (The U.S. military was also often superior at the level of military strategy, but on this point there is more controversy among historians.)
Military historians have also often discussed what they see as a distinctive “American Way of War.” They agree that its two central features are overwhelming mass, in both men and materiel, and wide-ranging mobility—the projection and sustained support of that overwhelming mass across great distances. But after these features reached their apotheosis in World War II, the American military soon faced the fact that Soviet armies were even more massive than its own. The United States responded to this challenge by drawing upon a third military feature—advanced technology—in which it had recently acquired a substantial advantage. The United States first trumped the large Soviet armies with nuclear technology and weaponry and then, when the Soviets developed their own nuclear weapons, with the computer and telecommunications tools of the information age. These U.S. military innovations amounted to new versions of the American way of war. Just as the American economy kept re-creating itself, so did the American military. Thus, for most of the 20th century no other great power could match America’s military power, and the main reason was the dominance of American economic power as manifested in all three of the dimensions of industry, finance and technology.
Beginning with the Vietnam War, however, and again with the wars in Iraq and Afghanistan, the United States has confronted a new problem. Neither its advantages in massive industrial-age armies, nor in nuclear weapons, nor even in high-tech information-age weapons, have been effective in putting down a determined and sustained insurgency (a sort of pre-industrial adversary). And so now, in the first decade of the 21st century, the U.S. military is engaged in inventing yet another effective American way of war. Its success or failure in doing so will play a large role in determining whether the 21st century can become a second American one—as will, of course, the success of American elites in re-creating an effective formula for economic dynamism.
The Pillars Today: America versus China
This review of the pillars of the first American century may occasion some discouraging thoughts. Many of those pillars have been squandered or abandoned—as Daniel Bell foresaw in his Cultural Contradictions of Capitalism—by successive generations of Americans during the very decades that comprise much of the golden age of the American century. It is obvious today that two of the economic pillars, the industrial and the financial ones, are particularly diminished. A comparison with China makes this clear.
Although the United States remains the largest manufacturing economy in the world, China is projected to overtake it by 2015 or so. And China, of course, is the largest and often most competitive producer in such basic sectors as steel, shipbuilding and consumer goods. It is rapidly expanding and upgrading its automobile and chemical sectors as well. These have been the basic sectors of any robust industrial economy, and they usually have been the generators of large export earnings. (Along with aircraft production, these sectors enabled the United States to win World War II, and long served as the basis for the American way of war.)
China’s industrial superiority, and the export earnings it brings, has of course translated into financial strength. At $2 trillion, China’s reserves of foreign currencies—especially the U.S. dollar—now exceed those of any other country. In the past year, the Chinese government has used the leverage afforded by its $800 billion in Treasury securities to pressure the U.S. Treasury and the Federal Reserve with respect to their policies affecting the value of the dollar. Even more important, it has used its financial strength to implement the most successful economic stimulus program any government has yet deployed to address the global economic crisis. In 2009, the world’s most effective practitioners of Keynesian fiscal policy are the Chinese.
Indeed, the Chinese government’s response to the current global economic crisis is remarkably similar to President Franklin Roosevelt’s response to the Great Depression. Like FDR’s New Deal, the Chinese version centers on large-scale spending on big infrastructure projects like highways, railroads, bridges, dams, rural electrification and public buildings. These infrastructure projects not only provide steady markets and continuing employment for such basic industries as steel, cement, heavy machinery and construction; they also bring long-term productivity gains to the national economy. In contrast to both the Roosevelt Administration in the 1930s and the Chinese government today, the Obama Administration is spending little on new infrastructure. Most of its stimulus program is directed at simply maintaining existing assets and employment in selected service sectors (and big Democratic Party constituencies), particularly state and local governments and public education.
The similarities between the U.S. response to the Great Depression and the Chinese response to today’s global economic crisis are not accidental. Both the United States then and China today possessed a vast industrial structure that suddenly suffered underutilization and excess capacity. With so much of the economy devoted to industry, and with industry thus having so much political influence, it is natural for governments to emphasize the revival of industry and manufacturing. An industry-centered (and industry-influenced) economic recovery program will normally emphasize government spending and some kind of Keynesian fiscal policy.
However, in the United States of recent years, industry has been a much smaller part of the economy than it was in the 1930s and than it is in China now. Rather, finance became the largest single economic sector, along with becoming the most profitable and prestigious one; it is therefore not surprising that finance became the most politically influential economic sector as well. This has meant that the U.S. response to the current economic crisis—first that of the Bush Administration in 2008 and now that of the Obama Administration in 2009—has been finance-centered (and finance-influenced). That is why it has emphasized bailouts of “too big to fail” financial institutions and the manipulation of interest rates and monetary policy (a kind of Friedmanism).
The real, and ominous, historical analogue to the U.S. economy and economic policies of today is therefore not the United States of the 1930s, but rather the United Kingdom of the 1930s. By then, Britain’s decades as “the workshop of the world” were long past; the British economy centered on finance, and British governments devised economic policy accordingly. The City (and Lombard Street) was even more authoritative there than Wall Street has been here. The result was that, during the Great Depression, Britain never saw anything approaching New Deal deficit spending and fiscal policy (never anything like Keynesianism in Keynes’s own country). Instead, it experienced a “lost decade” of dreary stagnation, which led in turn to its inability to sustain its world power status thereafter.
In short, if China’s present trends and economic policies continue, it will likely make its exit from the current global economic crisis with its economy more developed and diverse than it was when the crisis began. Conversely, if America’s own present trends and economic policies continue, it will make its exit from the crisis with its economy more distorted and debilitated than it was before.
Technological Superiority
It is worth remembering that the economic policies of the Roosevelt Administration—both the New Deal and military spending, both civilian Keynesianism and military Keynesianism—resulted in a vast and varied industrial structure that was not just the workshop of the world, but also its wonder (as exemplified in the 1939 New York World’s Fair). This industrial structure was fully in place in 1941, and it proved to be the basic foundation of the American century. If we are to make use of the current crisis, we must produce a similar outcome, and we can do so by building on the one strong pillar that remains to us: our longstanding technological superiority.
China is clearly investing a great deal to achieve its own technological strength, rapidly expanding and upgrading universities and research institutes, as well as investing in the rigorous education of the general population. While these measures have been effective in steadily increasing its economic productivity, historically it has taken many years for an economy to translate industrial and financial superiority into technological superiority. (For example, the United States reached industrial superiority in the 1890s and financial superiority in the 1910s, but its universities did not clearly surpass the top British and German ones until World War II.) The central strategic question—which country will achieve technological superiority in the decades ahead—will turn upon which one leads in the new economic sectors of the future.
Today, the most obvious candidates for these sectors are new sustainable or “green” energy sources and uses, new biotechnology-based products and processes, and new medical and health treatments. (One might think that the latter two candidates are not really distinct, but this would be mistaken: The economic implications of biotechnology and uses of biomimicry far surpass medical applications, and not all new medical treatments need be based on biotechnology alone.) It is interesting that the Obama Administration has specified energy and medical-related advances as being at the center of its own vision for America’s economic future, and that they, along with the education sector, occupied a prominent place in the Administration’s public depiction of its own economic stimulus program and budget priorities.
The potential economic sectors of sustainable energy, biotechnology and medicine/health are clearly of vital importance to vast numbers of people around the world. Moreover, those countries with advanced or advancing economies would be able and willing to spend vast sums to import the new products and processes of these sectors. If the United States can achieve leadership in them, as it did during the 20th century in aerospace, computers and telecommunications, it will have secured a robust pillar for even broader American leadership in the world in the 21st century. The Chinese are not oblivious, however, to the promise of at least one of these new sectors, renewable energy, which they now call a strategic industry. In the past few years, as part of their own economic stimulus program, they have begun to construct large wind power farms and solar power plants and to develop promising battery-powered automobiles.
It should be a prime objective of the U.S. government to maintain and even enhance America’s technological superiority, particularly with respect to developing new economic sectors that will be leaders in global markets. This entails encouraging and enabling the traditional bases for U.S. technological superiority: the university system, with its numerous scientists and engineers; the free-market system, with its numerous innovators and entrepreneurs; and the education system for the general population (obviously in great need of improvement).
Some economists have argued that only the quality of top scientists and engineers is important for economic productivity and international competitiveness, and that the education level of the general population is not. However, the inventions of these scientists and engineers have to be transformed and expanded into entire economic sectors. That requires support from a large base of intelligent, skilled and diligent technical, clerical and industrial workers, a base that must continually be reproduced and upgraded by the education system. In any event, the United States is unlikely to continue to enjoy a productive and competitive economy if it must continue to support the large and growing number of its people who are so poorly educated as to be permanently un- and under-employed.
In order to improve general education, it is perhaps time to return to the traditional American value of competition. Numerous attempts to reform the monopolistic public schools (more accurately, government schools) have failed; the solution lies in enabling a large variety of private schools to compete freely with the government ones. All good schools could receive public assistance; none should receive a public monopoly. Unfortunately, since one of the Democratic Party’s main constituencies is the public-school teachers’ associations, the education policies of the Obama Administration will likely only make things worse.
The Military Corollary
Even if we succeed in revitalizing our economy by building on scientific-technological leadership, we will still need to re-create a successful American way of war for current circumstances. This raises the question of how we will prevail over insurgent movements and the other slings and arrows of hostile non-state actors.
On the one hand, the dreary (but still debated) U.S. experience with counterinsurgency in Vietnam—which came at the height of the first American century—convinced the U.S. military for more than a generation thereafter that counterinsurgency warfare was incompatible with any version of the American way of war. On the other hand, the recent success in Iraq of the new (actually re-newed) U.S. counterinsurgency doctrine offers some hope.
The clue to the conundrum posed by insurgent warfare lies in looking even more closely at the features of the American way of war as they have actually been demonstrated in U.S. military history. We have already mentioned the well-known features of overwhelming mass and wide-ranging mobility, along with the later addition of high technology. But when the United States fought wars in the 20th century, it added yet another largely unacknowledged feature: a heavy reliance upon the ground forces of allies. In World War I, these were the French and the British armies; in World War II, the British and Soviet armies; in the Korean War, the South Korean army; and in the Vietnam War, the South Vietnamese army. Even in the Gulf War of 1991, the U.S. military operated with substantial ground units provided by other members of its “coalition of the willing” (for example, those of Britain, France and Saudi Arabia). In short, the “overwhelming mass” of U.S. ground forces has always been something of an illusion; the ground forces of U.S. allies were often more numerous (although less efficient and effective) than the ground forces of the United States itself, and these allied forces often assumed many of the more labor-intensive military tasks. The dirty little secret of the American way of war is that America’s allies frequently did much of the dirty work.
It was this secret that the U.S. Army and Marine Corps rediscovered and applied in Iraq in 2006–07. They realized that the key to successful counterinsurgency was to ally with local forces—in this case the Sunni tribes of the “Anbar Awakening”—who had their own reasons for opposing al-Qaeda insurgents. The U.S. military is now trying to apply a similar strategy in Afghanistan by seeking to split various Pashtun tribes from the Taliban insurgents. However, one of the reasons the Sunni tribes allied with the U.S. military in Iraq was that they feared the majority Shi‘a government as well as the al-Qaeda insurgents. The Pashtun tribes in Afghanistan lack any comparable fear, and hence any comparable incentive, to push them into alliance with U.S. forces.
The general lesson to be learned about the potential for any American way of counterinsurgency warfare is that the United States will always have to rely upon local forces, whether local militaries or merely local militias, who have their own capabilities for effective counterinsurgency. The U.S. military may be able to add certain essential ingredients or necessary conditions (such as, for example, effective weapons, professional training, mobility and logistics, or simply ample pay), but it can never successfully do the grueling job and dirty work of counterinsurgency all by itself. This means that the United States should not undertake a counterinsurgency campaign until it has developed a thorough knowledge and clear view of local forces and potential allies in a given theater.
In practice, this means too that the United States should normally seek to solve its problems without using the regular U.S. military for any counterinsurgency operations at all. Rather, the primary focus of the U.S. military should be on deterring war and, if war comes, defeating the military forces of other great powers in all forms of 21st-century warfare. The reason we are now attacked only at sub-conventional levels is not that no motive exists for attacks against us at other levels; it is that no one dares. If we lose our superiority at these levels, however, someone might well dare.
Popular Culture and American Idealism
The reinvention and renovation of its economic and military pillars would put the United States once again in a position to exercise leadership in the world. However, having re-created its ability to be a world leader, the United States would also have to learn again how to act like one. For almost two decades, U.S. political leaders have often acted toward other nations, and particularly toward other great powers, in a way guaranteed to provoke their annoyance and disdain, and even their anger and contempt. This requires us to pay some attention to both the cultural style of American leadership and the power context in which it is exercised.
With all the talk among American political commentators about “soft power” and the attractiveness of American popular culture to the rest of the world, it is usually forgotten that this popular culture is chiefly popular with the young—particularly those young people who are still irresponsible, rebellious and feckless. It does not often attract the mature, particularly those mature enough to lead their families, communities or countries and to be responsible for their security and prosperity. In short, American popular culture is a culture for adolescents, not for adults, and adults around the world know and act upon this truth. If American leaders want to lead the leaders of other countries, they will have to act like mature adults, not like the attention-seeking celebrities of American popular culture.
Similarly, with all the talk among American political leaders and commentators about American “idealism”, and the attractiveness of American values to the rest of the world, it is usually forgotten that most of the political leaders in other countries are realistic men making sensible calculations about their nation’s interests (and their own). They expect the leaders of other countries, including the United States, to do the same. This is particularly true of the current leaders of China and Russia. Having learned all about the claims of ideology when they were growing up, and having put ideology aside when they became adults, they cannot really believe that U.S. political leaders in turn really believe that American ideals should be promoted for their own sake, for their “universal validity”, rather than as a legitimation or cover for U.S. interests. If American leaders want to lead such leaders of other countries, they will have to act in the style of realists, and not in the style of idealists.
That poses a key choice. Realism requires us to specify the new, 21st-century context of great powers in which the United States would exercise leadership. Although rebuilding its economic and military power pillars will make the United States the most prominent power in the world, it will no longer be a dominant one. There will be other great powers as well: some rising, like China and India; some declining, like the European Union and Japan; and some rising in some respects but declining or unstable in others, like Russia, Iran and Brazil. If the United States is to be an effective and constructive leader in world affairs, it must be able to lead at least some of these powers on issues of world importance. These include threats from transnational terrorist networks, nuclear proliferation, the global economy, global epidemics and global warming. In particular, it will have to deal in an effective and constructive way with China, India and Russia, powers that have risen or revived to the point that they seek to be the pre-eminent or even dominant power in a particular region—which is to say, to have something like a traditional sphere of influence. For China, this is Southeast Asia; for India (not quite yet, but likely within a decade), this will be South Asia and possibly the shores of the Arab Gulf; and for Russia, this is Central Asia and the Caucasus, but also the neighboring Slavic (and Orthodox) states of Belarus and Ukraine.
With respect to these great powers and these regions, the United States will have to make a choice. It can try to lead the small countries in a region in some kind of opposition or even alliance against the aspiring regional power, as the United States has done with Georgia and Ukraine against Russia. Or it can allow the regional power to exercise leadership in its region, while that power in turn allows the United States to exercise a broader leadership on issues of world importance.
Choosing this latter option would not signify anything particularly novel. Even when the United States was at its height in the role of a superpower, it reluctantly but realistically allowed the Soviet Union to dominate Eastern Europe. However, that kind of intrusive political and economic control went far beyond the traditional norms for a sphere of influence. For the most part, great powers dominant in their particular regions have been satisfied with having their security interests preserved, along with some economic presence, while allowing a large swath of political autonomy within the smaller states. In this regard, it was the Soviet relationship with Finland rather than its relationship with those neighbors upon whom it had imposed Communist regimes that fit the traditional norm. Indeed, the current Russian relationship with most of the former Soviet republics in Central Asia now largely fits this norm as well, suggesting that the traditional pattern (which the Bush and Obama Administrations have derided as so 19th century) can be reasonably updated to fit the conditions of the 21st.
The 19th century had its own distinctive features. Some historians have redefined it to be the “century” between 1815 and 1914—between the end of the Napoleonic Wars and the onset of the First World War. That 19th century then becomes an era distinguished by no general wars and by rapid economic growth, a rare era of peace and prosperity. And if any one nation was identified with that peace and prosperity, it was Britain. By the end of the 19th century, it was widely acknowledged that the century had been a British one. Certainly, no other power or way of life could claim that title.
But although Britain was the most prominent of the great powers, it was not a dominant one on the scale that the United States was dominant in the immediate post-World War II period. It certainly dominated the world’s oceans with its Royal Navy; it was the leader in the world economy, first in industry and then in finance; and it was the pre-eminent power on many issues of world importance, such as the repression of the slave trade and piracy and the development of international law. But Britain was not a dominant power on any particular continent (except Australia) or in any particular region (except in South Asia during the time of the Raj). Rather, it generally was satisfied with a division of the continents into competing spheres of influence, which then might result in continental-scale balances of power (in Europe, Africa, East Asia and even South America). Britain was the leading world power because it largely allowed other great powers to be the leaders in their own immediate regions. This allowed Britain to be the leader of the leaders without having to ask their explicit permission.
The United States will never again be a dominant power like it was during the American century, particularly in the period from the late 1940s into the early 1970s. Historically, that was an anomalous time in many respects. But a century can still be shaped and defined—and can still be guided toward greater peace and prosperity—by a nation that is only the most prominent of the great powers. And a grateful posterity can later look back upon that century and honor that nation by bestowing upon the century that nation’s very own name.
1China’s potential for rising to global power—with attention to both its strengths and its weaknesses—is debated by Aaron L. Friedberg versus Robert S. Ross, “Here Be Dragons: Is China a Military Threat?”, The National Interest (September/October 2009); and by Minxin Pei versus Jonathan Anderson, “Great Debate: The Color of China”, The National Interest (March/April 2009).
2See Itamar Rabinovich, “The American Advantage”, The American Interest (May/June 2009).