Forecasting the impact of technological innovation on economic, social, cultural, and political realities is a time-honored vocation. We even have a name for it, invented in 1943 by an exiled German Jew named Ossip Flechtheim: futurology.
Starting in the early 1960s and restarting recently after a lengthy respite, automation has occupied pride of place in policy-connected futurology debates. As Steve Lagerfeld has pointed out, the earlier debate among social scientists and associated scholars and intellectuals yielded mostly misjudgments and exaggerations, to the point that it is an open question whether a random assemblage of tarot card readers would have done much worse. The current automation debate seems on track to repeat the trick, possibly because, as before, the subject is just too complex to wrestle down to useful predictions. Specifically, excessive concentration on the impact of technology may muffle awareness of other relevant factors.
Perhaps it would help to flip the process: Instead of focusing on one narrow set of outcomes relevant to, say, automation and labor profiles, with many likely causes, what if we focus instead on one inherent characteristic of the key technology with many outcomes?
This “flipped” essay seeks to make only one point: that an inherent characteristic of information technology innovation is to aggregate human transactions that were formerly more dispersed. These transactions can concern information/data, money, perceptions, and audience size, to name perhaps the four most important categories. It suggests further that this characteristic generates both obvious and opaque, beneficial and problematic, streams as it spills into the wider river of social causality. I call this “the net effect.”1
My method is limited to a synoptic observation of eight basic data points. A more rigorous argument would require at least a book’s worth of empirical documentation. Such a book might one day be written, if not by me then by someone else. But today isn’t that day.
Of what use is making this one point in this limited and self-limiting fashion? We tend to see the benefits and problems of the net effect as unrelated one-offs. In fact, they demonstrate, as Kant would have put it, a unity in the manifold (Einheit in der Mannigfaltigkeit). We should ponder this unity because, unless the levels of analysis we use in studying the impact of technology on society are made to vary, whether concerning automation or anything else, we may overlook useful insights.
The One Big Business Cycle
Consider first that, in the still fairly recent past, national and regional business cycles were more or less segregated from one another. A few big investors could leap from region to region, but they rarely accounted for much of the variance in macroeconomic trends, and only a few banks were able to facilitate such region-leaping investment strategies. So slumps in Latin America or Europe could coexist with growth in East Asia or the Middle East, and even within regions different countries could be doing more or less well. That meant, typically, that at least one demand “engine” somewhere was hale enough to help other economies out, assuming that the transactional environment was porous enough, by design or otherwise, to allow it. This usually meant, in turn, that national governments and central banks even in smaller economies retained at least some sway over their economic fate.
Today, thanks to the technologies that have enabled what we generically call globalization, and the techniques that have adapted those technologies to particular purposes, we have for most practical purposes one huge and constant global business cycle. When it’s clicking, everything is sweet, notwithstanding the social downsides that inevitably attend periods of rapid growth. In theory at least, unfettered capital flows within this planetary-scale business cycle enable money to get into the hands of those who can make best productive use of it. It follows that global linkages arguably enable Benthamite utilitarian balm to descend on nearly all peoples, as recent reductions in global inequality seem to validate.
In reality, things are not so simple at a time when the technology has also enabled the exquisite, granular financialization of advanced economies. Many of the new financial “products” that have resulted are not really products at all, in the sense that no one except bankers and their hangers-on can use them. Such developments have frayed the connection between the movement and manipulation of money and its constructive uses, as the subprime mortgage crisis illustrated a dozen or so years ago.
The great single cycle has also made control of economic policy on lower governance levels elusive if not impossible. It’s no fun being a central banker in a small or mid-sized country these days—including member-states within the European Union, to take an interstitial example. You might as well trust an astrologer as trust your own planning staff for all the good it will do you.
Worse still, the volume of activity and invested value in a single global business cycle can create higher amplitudes of up-cycles, which means that when the single global business cycle takes a dive, as it invariably must from time to time—as in 2007-09 and now with the COVID-19 pandemic—the scale of the fall is proportionately greater, too. In other words, there are Minsky moments. There have long been some such moments, but expansive ones that jumped across national economies used to be so rare that economic historians remember them unusually well: the 1720 South Sea Bubble that burst nearly simultaneously in London, Paris, and Amsterdam; and the 1873 “global” stock market collapse, to take just two examples. Today there can be very large Minsky moments, covering essentially all national economies except the most isolated ones, and they come at a much faster rate than just one per century.
Everybody suffers from the panic contagion when the great single cycle noses downward, whether they deserve to or not. Just as disease spreads faster and more pervasively when population is denser, financial contagion spreads faster and more pervasively when connectivity is greater.
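The disease analogy can be made concrete with a toy simulation. Here is a minimal sketch—assuming Python, with an invented network size, connectivity, and transmission probability chosen only for illustration—of how the same initial shock saturates a system faster as average connectivity rises:

```python
import random

def rounds_to_saturate(n=500, avg_links=4, p_transmit=0.3, seed=1):
    """Toy SI contagion: rounds until >90% of nodes are 'infected' by one shock."""
    rng = random.Random(seed)
    neighbors = {i: set() for i in range(n)}
    for i in range(n):                          # wire each node to a few random partners
        for j in rng.sample(range(n), avg_links):
            if i != j:
                neighbors[i].add(j)
                neighbors[j].add(i)             # links are symmetric

    infected = {0}                              # a single initial shock
    rounds = 0
    while len(infected) < 0.9 * n:
        newly = set()
        for i in infected:
            for j in neighbors[i]:
                if j not in infected and rng.random() < p_transmit:
                    newly.add(j)
        if not newly:                           # the shock died out short of saturation
            return None
        infected |= newly
        rounds += 1
    return rounds

for k in (2, 4, 8, 16):                         # rising connectivity
    print(f"average links {k:2d}: rounds to saturate = {rounds_to_saturate(avg_links=k)}")
```

Denser wiring means fewer rounds to saturation—the point about contagion and connectivity in miniature.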
And yes, the COVID-19 pandemic obviously qualifies as a “net effect.” Again, with some notable exceptions—Justinian’s Plague, the Black Death, the Spanish Flu—disease outbreaks used to be segmented regionally, too. That is no longer so much the case, thanks to a vastly expanded international transactional metabolism involving the massive movement of people and things. More granularly, geography matters again: New Zealand has much more control over its public health situation than does, say, any country bordering Iran. The pandemic also gives new definition to the basket of implications of massive refugee flows, like a probable one out of Idlib province into Turkey and on into the European Union.
Why mention something so obvious? To draw attention to a slightly less obvious shadow effect from one agglomeration to others. In Singapore, for example, the premier rentier hub of global business operations for Southeast Asia, several major European companies—auto manufacturer Daimler, for example, according to the latest (as of mid-February 2020) rumor—are temporarily shuttering offices until the pandemic passes, the result being depressed economic activity that ramifies into airline companies and many dozens of other support businesses.
It doesn’t matter whether such decisions are prudent or hysterical. Either way, it is not only nasty viruses that jump out of China, or that could in future jump out of lots of other places; economic injury jumps too—and the larger the scale of the activity, the larger the perturbation and the larger the injury. It took several weeks before global markets reacted to the virus, but starting on February 24 they did react, thanks to news of outbreaks that “shouldn’t have happened” in South Korea and northern Italy—news that spread around the globe in mere minutes. The Dow Jones experienced its largest single-day point decline in history on February 27. Explanation? It fell from the aforementioned higher amplitude, indeed a record-high amplitude, in which many stocks were probably overvalued.
Suffering from net effect economic debilities is not proportionately equal either among countries or within them, however. Richer countries enjoy various buffers that enable them to ride out rough patches, and they tend to have public health capacities able to tamp down any propensity toward social panic. Poorer countries fall faster and must go hat in hand to the International Monetary Fund, accepting the conditions of a loan if they decide they have no better choice. This peripheralizes their sovereignty, in effect, to the core OECD countries. It’s a difference similar to that between a richer person, for whom the second $150,000 of earnings in a given year is invested or taxed as opposed to being spent on putative necessities, and a poorer person, for whom the second $15,000 of earnings in a year is indistinguishable in functionality and spending pattern from the first $15,000.
Within countries too, wealthy individuals can ride out what poorer individuals often cannot. Poor individuals and families within wealthier countries may sense inequality that can translate politically into grievances concerning basic fairness and social justice. Poor individuals and families within poorer countries generally have little time for the luxury of comparison and complaint. When they get truly desperate they do not protest, they riot. First- and third-world priorities on the ground are as different as first- and third-world problems in theory.
The gist here is that the homogenization of heretofore regional and national business cycles causes many poorer people in rich countries to have standards of living more like upwardly mobile people in emerging market countries, and causes many richer people in poorer countries to enjoy standards of living more like middle-class people in richer countries. Amalgamation is the name of the game, which until very recently saw only yellow lights, no red ones, at national borders. Class boundaries have been fuzzing in strange but on balance homogenizing ways; whether for better or worse depends on one’s grounding perspective.
The electronic knitting going on even has palpable symbols—automatic teller machines, to take a prominent example. In continental Southeast Asia, places like Laos, Cambodia, and Myanmar had very few ATMs just a decade ago, and those few only in big cities near or within tourist-oriented hotels. Where there were only four or five in Laos a decade ago and 24 five years ago, there are 1,000 or more now. At one level this depresses the hell out of adventurous travelers even as it constitutes a useful convenience, for there is a certain cloying aesthetic monotony in seeing the same bank logos wherever one happens to be. It’s no such problem for locals, however, who never really got the hang of the telegenic noble savage role in the first place.
Indeed, the wider politics of the economic amalgamation trend are noteworthy. The re-slicing and dicing of global economic/class trajectories that information technology is driving forward creates lots more déclassé families and individuals in richer countries and more upwardly mobile ones in poorer countries. The déclassé have been known to wax bitter—note the background of Joseph Arthur de Gobineau, for example—and large cohorts of déclassé families often turn out to be much more influential politically, usually in a reactionary mode, than much larger cohorts of ever-poor families. Upwardly mobile people tend to be more optimistic, and when cohorts of such people are large, it can shape national moods. Optimism is indeed a force-multiplier, as a former boss of mine likes to say. But so is pessimism, in the opposite direction. In an odd way, Kipling’s “twain” of East and West have met, but the accompanying mood music is not what anyone expected.
Rich and Richer
Consider, second, that very wealthy people nowadays are much wealthier, in both absolute and relative terms in many cases, than very wealthy people were a century ago, or even 30 years ago. Why? Because global markets for first-tier goods and services are much larger and move more quickly than their formerly national counterparts did. Both the speed and the volume of these markets have been enabled by the fact that information about first-tier goods and services has become virtually free, save for the modest and still-shrinking cost of time, thanks to the diffusion of ever more sophisticated cybernetic tools.
The result, as many have noted, is a first-past-the-post phenomenon wherein the number one supplier hauls in a significantly higher percentage of the relevant market than before. This can mean that a supplier who comes in at number three in order of preference might as well be number 13, or number 30 and possibly out of business. All else equal, therefore, the phenomenon stokes industry consolidation—and consolidation, left untended, can yield oligopolistic consequences, which in turn can double back to accentuate the first-past-the-post phenomenon. In this sense, bigness detracts at least marginally from maximally efficient markets, and possibly a lot more than marginally.
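One conventional way to put numbers on that sort of consolidation is the Herfindahl-Hirschman Index (HHI), the sum of squared market shares used in U.S. merger guidance as a concentration screen. A minimal sketch—the two share distributions below are invented purely for illustration, not drawn from any real industry:

```python
def hhi(shares_percent):
    """Herfindahl-Hirschman Index: sum of squared market shares (shares in percent)."""
    return sum(s ** 2 for s in shares_percent)

# Invented examples: a dispersed market versus a winner-take-most one.
dispersed        = [12, 11, 11, 10, 10, 10, 9, 9, 9, 9]   # ten similar suppliers
winner_take_most = [55, 20, 10, 5, 4, 3, 2, 1]            # number one hauls in most of it

for name, shares in (("dispersed", dispersed), ("winner-take-most", winner_take_most)):
    assert abs(sum(shares) - 100) < 1e-9                   # shares must total 100%
    print(f"{name:16s} HHI = {hhi(shares):.0f}")
```

The dispersed case lands near 1,000; the skewed one lands above 3,500, well past the roughly 2,500 threshold U.S. antitrust guidelines treat as highly concentrated.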
What applies to individual vendors and buyers also applies, in some respects, to entire economies. Trading patterns seek efficiency and stability through the creation of high-volume supply chains. Furthermore, certain governments whose policies can create comparative advantage—artificially through currency manipulation and subsidization, say—may dominate certain supply chain niches to the detriment of competitors. This capacity for overcentralizing market supply chains raises the prospect of oligopolistic behavior relative to the system as a whole. It means, again as many have noted, that supply-chain disruptions, for whatever reasons they may occur, tend to be much more destabilizing to the trading system than would be the case were supply chains more diverse. That in turn gives revisionists or spoilers a proportionately greater incentive to disrupt or manipulate supply chains as part of a political strategy.
Obviously, too, as many have pointed out, the technology enables big data collection, manipulation, and deployment, but concentrations of wealth and power can avail themselves of this advantage far more readily than average people or small businesses. Big data, all else equal, magnifies inequality, in this case, of both opportunity and outcomes. The same goes more broadly in the sense that the upper fifth of American earners, those of the well-educated, assortative marriage, symbol-manipulating kind (like me, and probably you who are reading this) will weather the COVID-19 pandemic period better and come out relatively further ahead than the bottom four-fifths—people less able to work from home and who have thinner buffers, if any, to protect against very lean times. Note lastly in this regard that market swoons always increase inequality, since only those with very deep pockets can afford to bargain shop at the bottom of market dives, whatever their causes.
Even further and more obviously, large successful companies owned by very rich people now have an overriding interest in remaining number one, or in becoming number one if possible. The stakes have never been higher. Since money generally purchases disproportionate political influence to one extent or another and always pushes on regulatory boundaries put in place in democracies to limit the influence of the very rich, it follows that these very rich companies and individuals will seek to protect or advance their positions in one of three generic ways.
First, rich individuals and corporations will try to maximize their influence within legal limits, and will focus on employing the best lawyers and lobbyists to interpret those limits favorably.
Second, if the wealthy and politically advantaged cannot do what they desire legally, they will often press to make it legal. In genuine Weberian rule-of-law democracies, like the United States, this can involve stretching anti-trust statutes right up to the breaking point. It can involve the strategic warping of the tax code, as with, say, the carried interest provision. In semi-modern polities with weak rule-of-law and contract rights enforcement, oligarchies simply seize control of the state, buy protection through the “official” security services, and give themselves titles that sound “Weberian” but really aren’t.
Third, if the very wealthy and politically influential in democratic states cannot make the prohibited permissible by dint of lobbying or outright corruption, they will often act illegally anyway, daring state authorities to do more than lightly slap their wrists if caught. Large globe-spanning banks have become particularly adept at this, having armed themselves with formidable accounting and legal resources that typically dwarf those even of the U.S. government. In case you missed the memo, banks also lobby governments and contribute significant amounts to political campaigns. If you think this has no effect on the propensity of those elected to prosecute white-collar crime, then I have a bridge in Brooklyn you might wish to consider for investment.
The overall result of the net effect as regards democratic polities is to stack the deck further in favor of plutocrats and against those seeking to limit their range and opportunities for plunder. In the United States, Supreme Court decisions like Citizens United and other, mostly lower court decisions that have defined corruption down in ways that, for example, have kept Senator Bob Menendez (D–NJ) out of jail, are examples of gratuitous piling on—unless you can prove that plutocrats have corrupted the Judiciary itself beyond just the Legislative and Executive Branches.
We also have more do-gooder very wealthy people than we used to, notably those attuned to politics. There were the Cornelius Vanderbilts and J.P. Morgans of the first Gilded Age, of course, and the William Randolph Hearsts. But these were few in number and smaller fry compared to the likes of Bezos, Soros, Gates, and Thiel running around today. Even the left-leaning ones who posture against standard-issue plutocrats and corporate swagger still swagger when they need to—as with Bezos’s Amazon filing a legal complaint over the recent award of a large Defense Department contract to Microsoft. However they posture, very rich people still act like very rich people most of the time, and it is not clear that philanthropy as a social institution benefits on balance—again from technology-driven amalgamation propensities—when very large donors skew its proceedings and judgments.
Shell Games and Other Externalities
Consider, third, that, since not all capital flows to points of constructive use, the size of the offshore shell company economy has mushroomed since the early 1990s. The fiction of the moral distinction between tax avoidance and tax evasion, and the ability of high-skilled and even more highly paid lawyers and accountants in service to the very rich to arbitrage different tax and legal-regime jurisdictions, have resulted in a massive increase in the offshore haven business that was already well known before the April 2016 Panama Papers revelations. Since those revelations, no one can honestly feign ignorance.
A huge surge of new money into the offshore system was Russian, as former Soviet officials of many shapes and sizes looted their country. That was not especially surprising, given the history of Russian officialdom stealing according to rank for ten generations or more, owing to the perpetually shallow institutionalization of the Russian state. What is surprising is that increasingly high-tech Western banks happily aided and abetted this process, thus importing far more kleptocracy into the West than the West exported liberal democracy into the rapidly deconstructing communist world.
After the Patriot Act inadvertently put a crimp into this flow, a real estate exception to the law ended up funneling most of the stolen money headed westward into high-end real estate (including, not incidentally, real estate owned and managed by the Trump Organization). That inflow helped create the real estate market bubble that popped to disastrous effect in 2007-08. Yet when supposedly competent analysts perform post-mortems on the financial crisis’s origins, they only occasionally include this factor in their analysis.
There is more. The Great Recession led to foreclosures on many single-family mortgaged homes. Banks gobbled them up, and what then did they do with them? They often pooled them and sold them as REITs (real estate investment trusts), frequently to foreign investors—individuals and companies set up in ways designed to mask true beneficial ownership—and to U.S. private equity firms. Some of the largest foreign investor companies were registered in the United Kingdom, but much of their equity came from individuals and small partnerships in the former Soviet Union and, secondarily, the Arab Gulf. This has led to what one investigator has aptly called “a $60 billion housing grab by Wall Street” involving at least 260,000 homes that are now cash-squeezing rentals for their owners. Note the sequence here: The banks invite dirty foreign money, then channel it into real estate, which tanks the market; then they buy the market’s detritus at rock-bottom prices; then they sell it to foreigners and domestic scorpions who rent the properties, sometimes to the original owners, at monthly rates often exceeding their original monthly payments. Everyone makes out great—except ordinary American families.
Last on this point, we must not lose sight of the origins of this problem in the character of the technology itself. The size and speed of offshore transactions, such that they have metastasized from an ancient form of marginal friction into a structural factor in the global economy, could not have happened without the transformation of international banking that the technology enabled and drove forward. It is not possible to have such concentrations of money without prior concentrations of usable information. It is not possible to generate such velocity and flexibility of movement, which is the sine qua non of the secrecy element, without the technology.
Speaking of illegality and obliviousness, another obvious net effect phenomenon concerns the need for defenses against the theft of electronic data: cybersecurity, in other words. Cybersecurity is a booming field that did not even exist 25 years ago and which, overall, has to be considered an almost pure externality. Today cybersecurity involves not only computer scientists and systems engineers but also a new plague of lawyers called upon to sort out liability issues. That is a very bad, because very pricey, sign for normal people.
But, it will be objected, the economic benefits of the information revolution far outweigh the costs of cybersecurity, which should be considered a mere and marginal insurance provision. That proposition would be easier to accept if our productivity indices actually were measuring increases due to information technology innovation, but so far they do not do so very well. The result is a desultory debate about how much of a productivity boost information science technology is actually providing. Many economists have argued that the figure is low, and that advanced economies have reached a dire and ominous technological plateau. This has become the consensus view, but I suspect it is wrong. Historical analysis suggests that technique adaptations to achieve productivity increases lag behind technological innovation as such, and the faster and more generatively novel the innovation, the greater the lag. Probably our industrial age metrics underestimate productivity increases from information technology adaptation. But because we’re still largely measuring a misaligned basket of things, we don’t really know.
More complicating, perhaps, the vast increases in cybersecurity costs are not all a function of actual need. The sociology of corporate behavior shows that factors other than rational need figure into determining hiring levels and other decisions to expend corporate assets. There is, for example, the phenomenon of what the anthropologist David Graeber has aptly called “bullshit jobs.” Probably more important, there is what the iconoclastic Vanderbilt University economist Nicholas Georgescu-Roegen identified long ago as “transactional costs” that expand exponentially as the size of enterprise organizations expands incrementally, leading often enough to a tipping point beyond which expanding scale is no longer cost-effective. Cybersecurity is by definition a transactional cost in this sense, and for practical purposes its overall effect on a company cannot be reckoned separate from the other transactional costs it bears.
Protecting data agglomerations from tampering or theft may therefore be subject to the same tipping point phenomenon, where cybersecurity costs may outweigh benefits from increases to organizational/corporate scale, depending in part on whether data is merely used internally or also sold to other companies at a profit. We don’t know where such tipping points might be, partly because the possibility of their existence rarely dawns on corporate managers caught up in the next latest thing that “everyone” is doing.
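The tipping-point claim has a simple toy form: if the benefits of scale grow roughly linearly while coordination and security overhead grows with the number of internal interfaces (roughly the square of size), net benefit peaks and then turns negative. A minimal sketch, with coefficients I have invented solely to make the shape of the argument visible:

```python
def net_benefit(n, value_per_unit=100.0, overhead_coeff=0.02):
    """Toy model: value scales linearly with size n; coordination and security
    overhead scales with the number of internal interfaces, roughly n squared."""
    return value_per_unit * n - overhead_coeff * n ** 2

for n in (100, 500, 1000, 2500, 5000, 7500):
    print(f"scale {n:5d}: net benefit {net_benefit(n):12,.0f}")
```

With these made-up coefficients the curve peaks at a scale of 2,500 and turns negative past 5,000—the point at which further agglomeration of data and people costs more to protect and coordinate than it returns.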
The lag time between technology and technique adaptation, in turn, produces a paradox: If we now have the fastest-ever information technology to process transactions involving data, then why do so many routine clerical tasks seem to take longer than ever, and seem to get screwed up pretty much as often as always? It’s because the people who enter into the transactional process hither and yon, one way or another, are no faster or more efficient than they ever were, and programmers have a hell of a time integrating their arational behavior into whole-system designs. Humans remain the process chain’s weak link.
It is a weakness usually dependent on some combination of management technique and scale. If management imposes rigid standard operating procedures on employees using new, quicker machines, those employees become more like ciphers, prohibited from using common sense to troubleshoot and solve problems. Machines are far less flexible and hence in a sense stupider than people as on-the-job decision-makers, so subordinating a person to a task-linked machine underutilizes human intelligence. This is often why fast and seemingly efficient machines do not necessarily produce fast and efficient “real world” outcomes.
We also know of generic cybersecurity cases that have wrought much more complex if also unhappy outcomes. The original 1996 HIPAA (Health Insurance Portability and Accountability Act) legislation in the United States was necessitated by the advent of electronic medical records. The law’s Title II privacy rule soon became the part of the statute most people mean when they invoke the acronym—often misrendered as “HIPPA.”
Now, stealing medical records had long been a concern, but a minor one, since it was never practical for thieves to cart away tons of paper and then cost-effectively analyze what had been stolen. Medical records rendered into electronic form made both mass theft and quick and easy analysis possible, yielding a novel and commercially viable form of crime. Such behavior had to be criminalized and electronic record keeping made safe from filching. So far, so alright. But now add lawyers and Federal bureaucrats to the mix, and you get a horror story of huge gratuitous costs from typical overlawyering and mountainous paperwork requirements devolving on medical offices and hospitals. Interference in the sharing of critical information among medical professionals has cost lives and stunted important research. We have been grappling with aspects of the dilemma ever since, not very successfully.
Needless to say too, as the names Manning, Snowden, and Equifax will magically bring to mind, the same problem plagues newly agglomerated proprietary information of all kinds. The most secure way to convey sensitive and critical information to those with a need to know in the U.S. government right now, with respect to national security concerns at least, is to print it on pieces of old-fashioned paper, or write it down by hand with a pen or pencil, and restrict access to those with the right to enter a SCIF (sensitive compartmented information facility) protected by a cipher lock. This is how anyone who had ever had an SCI clearance knew from the start that Manning did not reveal classified information above the secret level, because nothing more sensitive than that can be purloined from a State Department computer.
All these examples have in common the observation that law, and more particularly beyond conceptual challenges the implementation of law, is having a hard time keeping up with the pace of innovation and adaptation in information technology. One reason for this is that law is an inherent governmental function, and cutting-edge innovation is itself no longer dominated by government investment and management—at least not in Western countries.
It is important to understand this because of onrushing artificial intelligence, which is so complex that normal people cannot yet discern even the levels of definition that must apply to the phenomenon for it to make any sense. To keep it simple, AI involves information technology married to insights from neuroscience. This is what produces a neural net, the sine qua non of GAN (generative adversarial network) technology essential to creating deepfakes (see below) and other less nefarious capabilities. AI therefore pairs data aggregation with processing loosely modeled on the speed and architecture of the human brain. That makes it an aggregation capacity squared, so to speak.
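For readers who want mechanics rather than metaphor, the adversarial pairing can be shown in a few dozen lines. This is only a sketch, assuming the PyTorch library; the toy task—imitating samples from a one-dimensional Gaussian—stands in for the image- and audio-scale training that underlies deepfakes:

```python
import torch
import torch.nn as nn

# Toy GAN: the generator learns to imitate samples drawn from N(4, 1.25).
latent_dim, data_dim, batch = 8, 1, 64

G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(3000):
    real = 4.0 + 1.25 * torch.randn(batch, data_dim)      # "real" data
    fake = G(torch.randn(batch, latent_dim))              # generated data

    # Discriminator step: learn to label real samples 1 and generated samples 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(batch, 1)) + bce(D(fake.detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: learn to make the discriminator call fakes real.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()

print(G(torch.randn(1000, latent_dim)).mean().item())     # drifts toward 4.0 as training works
```

Two networks improving against each other is the whole trick; scaled up, the same loop is what makes fabricated faces and voices cheap to produce.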
It makes it, too, a generative technology, like steam and electricity, that can affect all human-extension machines, if allowed to do so. This is such a novel capability that it will almost certainly produce unanticipatable state changes in the interface between technology and society. And the last thing we should do, despite the obvious temptation, is use AI to predict the social and political impacts of AI, for the method is almost sure to bias the outcome. The thinking we need to do must remain at human scale, for it is at human scale that the values we care most about still exist and function.
Debts and Deficits
Consider, fourth, that, as large national economies like that of the United States became integrated over time, national banking systems emerged. And then at warp speed since the end of the Cold War an international banking system emerged, at least for use by wealthier individuals and corporations. One of many consequences is that locally generated savings and deposits no longer remain “local” for very long.
Starting already back in the 1920s, Americans’ savings began flowing to Wall Street, but this flow waxed greatly after World War II and grew by orders of magnitude with the coming of the cybernetic revolution that dovetailed temporally with the end of the Cold War. Capital flows to Wall Street back then mainly stayed in the United States, notwithstanding the rise of the U.S. financial industry to global status after World War I. Today money from, say, a typical U.S. pension entrusted to some Wall Street bank or investment firm can end up pretty much anywhere in the world in a remarkably short time.
One result is that local savings can no longer be as readily used for local investment and growth. That has several implications, all of them unfortunate. Those invested in the pension fund may gain an extra percentage or two on their money over time, but when their community loses control over the uses to which the money might have been put, social capital often hemorrhages as a result. If you get your mortgage from a local bank where the banker’s kids and your kids play in the same sports league, shop at the same grocery store, and maybe attend the same church or public school, is that bank more or less likely to treat you honestly and respectfully than if your mortgage is owned by some bank 2,500 miles away? What if that bank thinks nothing of bundling and selling your mortgage, and couldn’t care less if you’re good for the loan, because it makes decisions on the basis of statistical probabilities served up by powerful data-chewing machines, not on you as a human being and member of society? A stupid question, maybe, but an irrelevant one?
Moreover, at a time when rural/urban divides are defining the problematics of inequality, urban areas and urbanites are almost invariably the winners from the new financialized dispensation of capital flows and rural people and their communities are usually the losers. Capital flows now sharply exacerbate the segmentation of regional economies within given state economies. In rural places, where low wage levels match low price levels for staples, no one is bothered much by the coexistence of urban areas where both wage and price levels tend to be higher for several reasons, some social-psychological and not macroeconomic in nature. They are not bothered, that is, until higher urban, national, or international prices for certain goods and services seep or insinuate their way into rural areas.
This is precisely what has been happening at speed in recent years, thanks in part to the connectivity established by IT networks. Health care costs, transportation costs, post-secondary education costs, and prices for energy and communications services that are national and sometimes global in origin now envelop rural populations, who desire things and services heretofore unavailable and in some cases unimagined. But with local wage levels mostly stagnant, household budgets eventually burst into debt. Think Yellow Vests, at their origins at least, and the point comes clear not just for France, but for a wide range of cases usually thought of, again and erroneously, as one-offs.
New divisions and distinctions have arisen even among urban areas. The aforementioned first-past-the-post phenomenon and the rural/urban divide also combine to create supercities—cities where high-end job growth and real estate prices are highest and still pulling away from the rest of the urban pack. This is not another one-off; it is part of the unity in the manifold.
Consider, fifth, that the U.S. government has frequently run a deficit over the past half century or so and has piled up a huge cumulative national debt. Before the post-Cold War phase of globalization, the size of the debt relative to GNP was relatively small for the most part, except during wartime. (So was the capitalization of the stock market relative to GNP, but that’s another matter, sort of.) No longer. Today the U.S. national debt exceeds $23 trillion, and a much larger percentage of those holding the debt are not U.S. nationals. Of the roughly 30 percent held by foreigners, a good chunk is held not by individuals but by sovereign wealth funds. China’s roughly 5.6 percent share, totaling more than $1.18 trillion, is a key case in point.
Why does this matter? Simple: A large collection of individuals cannot act in concert, but a state can. This is why, to take an example reflected in media coverage, the state-centric nature of the Chinese economy will always draw more attention to Chinese project developments than to those of others. The Malaysian government negotiated with the Chinese government for the East Coast Rail Link project, but Putrajaya would not negotiate with the U.S. government over Dell or Microsoft investments in-country, even investments of larger economic import.
Now, an unfriendly state would cut off its nose to spite its face if it acted aggressively with regard to its financial leverage in ordinary times, so we assume such a state will not do self-injurious things. In a geopolitical/military crisis, however, no such certainties exist. Historically, governments have on many occasions made themselves vulnerable to others—nobles and bankers mainly, and mainly from within—by borrowing from them. But major-power governments making themselves vulnerable to potentially hostile state actors? This is historically uncommon. The fact that China’s holdings of U.S. debt could finance its estimated annual defense budget—something like $177.6 billion—several times over is also, almost needless to say, very uncommon, to the point of being unique.
Even more notable, perhaps, in the past the percentage of states in any given state system that decided to engage in deficit spending and financing was relatively small. It happened mainly in wartime, the common hope being, if we go back far enough in history, that loot would pay for ongoing operations. Today a large percentage of the 192 extant states engages in deficit spending and borrowing to finance it. Why? Because they can, given the vast liquidity that now resides within the aggregated pools of an expanded international banking setup, and the far advanced technical/administrative capacity to keep track of and service the loans.
This does not, however, necessarily make the practice wise, even at relatively low interest rates, and even when growth rates stimulated by borrowing exceed interest rates. That is because the distribution of benefits from growth may be skewed by the function of borrowing itself, which can generate rentier and patronage phenomena within political economies that may have long-term negative consequences for most people. Let me put this bluntly: Large borrowing creates new accumulations of power, and that power can and often will feed new patronage and rentier cores. All else equal, it makes “special interests” politically stronger.
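The growth-versus-interest mechanics mentioned above follow a textbook recursion: next year’s debt-to-GDP ratio equals this year’s ratio scaled by (1 + r)/(1 + g), plus the primary deficit as a share of GDP. A minimal sketch—the interest rate, growth rate, and deficit values below are illustrative, not a forecast:

```python
def debt_path(b0, r, g, primary_deficit, years):
    """Project debt-to-GDP: b[t+1] = b[t] * (1 + r) / (1 + g) + primary_deficit."""
    path = [b0]
    for _ in range(years):
        path.append(path[-1] * (1 + r) / (1 + g) + primary_deficit)
    return path

# Start at 100% of GDP with growth above the interest rate (g > r); a persistent
# primary deficit of 4% of GDP still pushes the ratio steadily upward.
path = debt_path(b0=1.00, r=0.02, g=0.03, primary_deficit=0.04, years=30)
print(f"after 30 years: {path[-1]:.0%} of GDP")
```

The point in the text is distributional rather than arithmetic, but the recursion shows why “growth exceeds the interest rate” does not by itself make persistent borrowing self-financing.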
We know that elites in authoritarian states often do not care much about ordinary people, at least compared to their own fortunes and those they choose to patronize for political purposes. But what about democratic states? Can special interests feeding off the net effect grow so much stronger that their sense of community with their own countrymen is fundamentally warped? You work that one out for yourself.
But why specifically has the U.S. national debt grown so large so fast? The answer is not analogous to the causes, say, of a toothache. The causes of a toothache can usually be identified as having to do with a decayed and infected tooth or its adjacent gum tissue. The causes of a national debt metastasizing are not so simple.
One cause is cultural. In the case of the United States, but not only the United States, most people have come to believe that they are entitled to maximum feasible and virtually unlimited material accumulation, environmental consequences be damned. And if they cannot pay cash on the barrelhead for it, little stigma remains against borrowing to acquire it. Banks now make more money from teasing un-toilet-trained acquisition habits into debt than from any other source; advertising chicanery abets the lurid material cravings by manufacturing ever more other-directed people determined to keep up with the Joneses; and the death of traditional religion abets all that and more, its absence enabling the substitution of values like socially unhinged expressive individualism and spectacle-driven conspicuous consumption for older virtues like patience, providence, humility, and intergenerational responsibility.
But cultural factors only account for the demand for improvident behavior; without a supply of funds to borrow to answer that demand, the toxic desires of American hyper-consumers would remain mostly unrequited. But there is now a supply: A huge percentage of the aforementioned massive reservoir of global cash liquidity migrates regularly to be invested in the United States for lack of safer or more lucrative destinations. The numbers have become so large that in many cases they have come to dwarf the legacy administrative techniques formerly evolved to manage them safely. It is impossible to take in so much money and still guard against consequences that in selected but hardly trivial cases derange market psychology—“irrational exuberance,” Alan Greenspan once called it, possibly because he would not be caught dead uttering Keynes’s perfectly adequate coinage, “animal spirits.”
Can the American nation really afford all the stuff it buys, much of it in the form of imports that create massive trade imbalances, if it must go unfathomably into debt to do it? Any sensible person would answer “no, not really, not for long.” Marx, wherever he is in the afterlife of philosophers and other troublemakers, must be smiling and saying, “Well, at least I got the materialist fetishism thing right!”
So why, then, does the American political class enable such individual and socially self-destructive behavior, as it has done for decades? Ask them, why don’t you? If they don’t answer you, ask their bankers and financial consultants at Goldman Sachs et al. instead. If they won’t answer you either, what’s left of the Sanders campaign will be happy to try.
Risk and Uncertainty
Consider, sixth, that the sum of items one through five, taken together, constitutes a higher-order phenomenon: the inversion of risk and uncertainty.
The current system of global financial interactions arguably reduces risk for individual corporate and high net-worth actors. Data and the means to analyze it have never been more abundant and easier to use. Business owners and managers can now be more precise than ever in personnel and resource decisions, as well as with calculating inventory, transportation, and insurance costs. These capabilities in effect discount the cost of time, which is never free, and magnify the efficacy of scale, at least to a point. Not only do leaders and managers typically know more about the environments in which they operate than ever before, they can also now manipulate and shape consumer markets to their advantage as never before.
But while risk has arguably declined, uncertainty has grown. To understand this, one needs to know the difference between risk and uncertainty. As Jessica Einhorn has explained, think of a fisherman in some rural valley far away and long ago. When he makes his way to the river in the morning with rod, tackle, and bait, he takes a risk that the fish won’t be biting, that his line will get snagged, that the weather will turn foul, and so on. But he can mitigate the risks, the major ones of which he understands from experience. He knows the relevant variables; he just doesn’t know their magnitudes on any given morning. Now imagine that one day the fisherman goes out to the river and finds it gone because an earthquake upstream, far beyond range of his hearing, diverted the river’s flow during the night, probably flooding out someone else’s crop patch over the next ridge. That was a variable he could not reasonably have been expected to anticipate. That is not risk but structural uncertainty.
The difference between risk and uncertainty in business life, whether for individuals, corporations, or even sometimes whole economies, resembles in some ways a mini-max problem in game theory. Incursions of risk into normal life are frequent-to-constant factors, but low-impact ones; incursions of uncertainty into normal life are rare but high-impact. Globalization, by amalgamating separate small business cycles into one large cycle, by raising the stakes of being first-past-the-post providers of high-end goods and services, by magnifying the incentives to employ offshore tax havens, by amalgamating capital pools and starving local economies of investment funds and control and exacerbating the rural/urban divide, and by enabling massive and dangerous levels of personal and national indebtedness, has increased uncertainty relative to risk. This shows that complexity and diversity are not the same, despite having certain base characteristics in common. To equate this situation to one prone to generating the occasional “black swan” is so ludicrously understated as to embarrass any self-respecting swan.
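The amalgamation point has a simple statistical core: averaging many independent regional cycles damps the swing of the whole, while one common cycle does not. A minimal sketch—shocks are modeled as a shared factor plus idiosyncratic noise, and the correlation values are invented for illustration:

```python
import random
import statistics

def aggregate_volatility(n_regions=20, rho=0.0, sigma=1.0, draws=20000, seed=7):
    """Standard deviation of the average of n regional shocks sharing correlation rho."""
    rng = random.Random(seed)
    averages = []
    for _ in range(draws):
        common = rng.gauss(0, sigma)                       # the shared 'global' shock
        shocks = [rho ** 0.5 * common + (1 - rho) ** 0.5 * rng.gauss(0, sigma)
                  for _ in range(n_regions)]
        averages.append(sum(shocks) / n_regions)
    return statistics.stdev(averages)

print("segregated cycles (rho = 0.0):", round(aggregate_volatility(rho=0.0), 3))
print("one global cycle  (rho = 0.8):", round(aggregate_volatility(rho=0.8), 3))
```

With independent cycles the aggregate swing shrinks toward sigma divided by the square root of the number of regions (about 0.22 here); with highly correlated cycles it stays near 0.9. Bigger aggregate swings are the higher amplitudes, and the bigger Minsky moments, described earlier.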
Since the incidence of structural uncertainty is very rare, it is possible—indeed inevitable given human nature—that the danger of low-probability but high-impact structural uncertainty incursions will be reckoned as zero. The same goes, for example, for a superpower nuclear war. But the probability of both is only low—not zero. In this sense, the broad financial aspects of globalization constitute not only a capacious set of interlocked processes, but also a portentous wager. Science fiction already offers futuristic scenarios of an age of “expansion” collapsing into an age of “contraction”—Paolo Bacigalupi’s novel The Windup Girl (2009), for example. The question is: Will this scenario remain fictive? Or are we bound to see a remake of the Daedalus and Icarus Show, not as myth but (with apologies to Giraudoux) as the privilege of the great to watch catastrophe from a distance?
Protest
So much for mere money and related minor annoyances. Now consider, seventh, the dramatic expansion of extra-parliamentary protest movements in recent years. More people in more different countries for more different (and some of the same) reasons, using more different kinds (and sometimes the same kinds) of tactics, have taken to the streets and stayed there longer than ever before in recorded history. How do we explain this?
Again, this is no mere toothache. At first glance it looks like a new form of globe-spanning St. Vitus’ Dance. Ah, but what most people think of as antique peer-induced lunacy is, or was, actually caused by an infection: Sydenham’s chorea, a complication of streptococcal rheumatic fever. So it has, or had, actual neurophysiological causes. Similarly, possible reasons for the proliferation of protest movements are many and their mutual relationships are hard to stipulate, especially since no two protest movements have been exactly alike, but they do have actual causes. One is that technology is aggregating human emotions and perceptions, and is further enabling an intersubjective sharing of selected aggregations.
It is obvious that smart phones, which in many countries are now far more common than landlines, have facilitated the organization and actual conduct of protests within countries. That’s clear from Hong Kong to Egypt to India to Bolivia. They have probably also magnified the demonstration effect of protest from country to country, especially within the same region. Television news saturation, which in advanced countries we take for granted, has also figured here and should not be overlooked in the many countries where it is relatively new (Sudan, say). In short, virtual electronic mobs can give rise to actual mobs.
But information technology proliferation doesn’t explain the entire phenomenon. Another factor is increased urbanization in many countries that have lately experienced protest movements. Another is gradually advancing literacy, especially female literacy. Another is the aging and loss of coercive reputation in a range of brittle authoritarian regimes. Yet another is the fact that media make the foibles of elite misbehavior harder to hide and, when revealed, far more vivid than they used to be. Smart-phone video capacity and proliferation is a big deal here. The technology simultaneously disintermediates authority structures, reduces their capacity for social control, and abets hyperconnectivity. The combination is a game-changer—a very big deal.
Another factor is the spread of the Western egalitarian ethos, in tandem with the idea of individual agency as opposed to hegemonic communal agency, to places where these concepts are still novel, such that people are on balance less tolerant of privilege, inherited or otherwise, and imposed conformity than they once were.
And yet another factor, interestingly, is economic growth. Very poor folk don’t typically get political; they’re too busy holding together what passes for normal life. And as already noted, when they get desperate enough to express political voice, they incline more to rioting than to protesting. To join a typical street protest, people must have the luxury of discretionary time and a financial buffer of one kind or another against the opportunity costs of lost income (unless, of course, someone is paying them to hit the pavement, which is hardly unknown in many Middle Eastern rent-a-mob cases). As the famous Davies J Curve theory of revolutions holds, the propensity for protest is especially large for populations whose members have been led to expect continuous improvement in their material situation, only to have the rug pulled out from under them by unanticipated developments. That applies to many, though not all, of the recent cases of protest. The idea, very popular among denatured American Calvinists, that rapid economic growth, which always encounters speed bumps sooner or later, conduces to political stability, has to rank as one of the most ahistorical and downright stupid ideas in circulation. This is not an argument against seeking growth, for oneself and for others; it’s just an argument against smug delusion about the consequences and, if it is to one’s tastes, a reminder of Isaiah Berlin’s point about values equally good not necessarily being commensurate with each other.
Robust and broadly distributed economic growth also conduces to other highly surprising conditions. If you happen to find yourself in Cambodia, say, you’ll see that, between the reasonably well-manicured small central precincts of the country’s few cities and the mainly pristine countryside, there is a wide and growing zone of hideous, litter-strewn filth. Not even Amy Wax could imagine litter, mainly of the plastic wrapper and bottle variety, on this scale. Five or six years ago, no such major problem existed. So what’s that about?
The Cambodian economy has been growing, supposedly, at an average of 6.7 percent per year for several years now (although in an only partially monetized economy that statistic is about as soft as they come). Whatever the actual figure, there is no doubt about the growth, which has created a new para-urban lumpenproletariat, highly concentrated as the country’s labor profile goes, in the garment and footwear assembly business, which uses fabric and materials imported mainly from China, India, and Bangladesh. That lumpenproletariat now has money enough to support imports of stuff packaged in plastic. The speed of this importation and consumption has far outrun public-services expansion capable of collecting it, whether for recycling or disposal. The typical para-urban, still pretty poor but upwardly mobile Cambodian consumer, in short, cannot identify the location of “away,” as in a place to “throw something away.”
Why belabor the litter issue, since Cambodia has not been a scene of recent mass protest despite having every right to be, and since it hardly poses the kind of danger that the synchronization of the world’s regional business cycles does? Because it illustrates the same basic phenomenon: an amalgamation trend that brings in its wake externalities of one kind or another, extant or potential.
Is that all? No. Some have suggested that the protests, in many countries at least, have transcended specific grievances against specific governmental leadership cadres, and really amount to a demand for no leadership at all—or for a new form of leadership that is non-hierarchical, completely flat—which is of course an oxymoron. In some cases, this amounts to emotional spasms of radical egalitarianism, people channeling Bakunin, Kropotkin, and Emma Goldman despite never having heard of them, let alone read them. Maybe they have all read Neal Stephenson’s The Diamond Age (1995) and liked the basic idea in it: the mystical, aura-driven, Avatar-like crowdsourcing of everything, so that no administration, let alone government, is any longer necessary.
Is this a serious possibility? Well, in some countries, not to exclude the United Kingdom and the United States, the attack against expertise of all kinds is well documented. In many cases, too, mostly younger people have grown so used to rapid and efficient service provision via their smart phones—online banking and payment software, for example—that the ponderous pace of government bureaucracies doing pretty much anything has become incitement to disparagement, mockery, ridicule, and even occasionally violence. It may well be that the technology speeds up its increasingly addicted users to the point that both cognitive patience and attention spans are shot to hell. Government thus becomes almost too easy a target for spoiled, impetuous people, even the parts of it that are not the Motor Vehicles Administration.
The recent Chilean protests seem the poster child for this sort of thing. Compared to, say, Venezuelans, Bolivians, Ecuadoreans, and Colombians, Chileans have been living in the lap of luxury with a government tuned well enough at least for folk music. Yet that did not stop them from trying to rip Santiago a new sewer line, so to speak, over a mere increase in public transportation fares.
This may seem quixotic, if not asinine behavior, but young people cannot be blamed entirely for thinking their elders have screwed up. Global warming fears are often the trigger for this, but other triggers exist as well—ones that fire up the transparency saints, for example. This sort of thing, too, can be the paradoxical consequence of sudden and significant affluence. On that point one may repair to Daniel Bell’s classic “cultural contradictions of capitalism,” or just recall Saul Bellow’s famous character Herzog, who observed:
. . . civilized individuals hate and resent the civilization that makes their lives possible. What they love is an imaginary human situation invented by their own genius and which they believe is the only true and the only human reality. How odd! But the best-treated, most favored, and intelligent part of any society is often the most ungrateful. Ingratitude, however, is its social function.
In this formulation, mass ingratitude, or in some cases mere entitled complacency layered onto a critical mass of smartphone addiction, is sufficient to bring people into the streets. If to this one adds the hubris characteristic of newly or half-educated people, who think they have a right to have political ideas when, most of the time, it’s somebody else’s political ideas that have them—and here our text is Jose Ortega y Gasset’s The Revolt of the Masses (1930)—you have a perfect storm for newly minted aspiring-bourgeoisie street demonstrations. The technology collects and helps share grievances, and so is a trigger for expressing frustrations whose sources usually have little to nothing to do with technology.
It is churlish, however, to generalize. The recent anti-Iranian protest movements in Lebanon and Iraq, for example, which involved Shi‘a as well as others, and which have already been paid for in many gallons of blood and several hundred corpses in the Iraqi case, represent on one level efforts to forge a modern nationalism to transcend the tribal-sectarian formulae that have been the ruination of both countries. Nationalism seems retrograde and dangerous to many Europeans, who unsurprisingly (but mistakenly) associate its excesses with inherent illiberalism, and not mistakenly with 20th-century continental self-immolation. But nationalism in the Levant is progressive by any reasonable measure compared to what it would displace. It just goes to show that the effects of the global forces of IT-driven amalgamation and contagion play differently in different historico-cultural contexts. This is not, perhaps, a marginal observation.
Weaponizing Everything
Now consider, eighth and finally, that a whole range of human activity that used to be diffuse, episodic, and even to some extent random can now, by dint of “technovelties” like Generative Adversarial Networks (GANs), be focused, regularized, and aimed. GANs have many positive uses—even, for example, in clinical medical/pharmacological research. But as the technology underlying the fabrication of deepfakes, they have the potential to transform disinformation campaigns from marginal irritants into first-order strategic tools within political campaigns.2
Less dramatically—maybe, for it is yet to be demonstrated—think of the algorithms that have transformed advertising from hit-or-miss circus-barker appeals to highly sophisticated and stealthy neurocognitive management stratagems. It has yet to dawn on most smartphone addicts that, while they technically own the products that are their phones, it is actually the designers, manufacturers, marketers, and advertising platform users of those devices that are harvesting their money and personal data.
Something similar applies to Facebook and other kindred platforms. Facebook is an amalgamizer of countless discrete communications among individuals and small groups that, summed together, constitute the product for what is otherwise a deceptively simple advertising venue. As far as business models go, it’s brilliant. As far as its macrosocial and political impact is concerned, not so much. Facebook was famously weaponized in Myanmar against the Rohingya, but its less telegenic impact elsewhere is not necessarily more benign.
Perhaps subtler but not necessarily less important, the information technology behind Facebook, Twitter, and similar platforms enables individuals to summon larger audiences, sometimes very large audiences, for themselves. It is obvious that Donald Trump has used the technology to disintermediate the American Presidency, cutting out as many as possible of the institutional layers designed to help him (he thinks, to constrain him) and to protect the rule of law, and with them a host of related if less formal norms. He has, in short, weaponized the technology for purposes of domestic political competition.
It may be, too, that the general transformation of institutions, originally created and evolved to define rules, norms, and conduct for important social functions, into platforms for performative acts depends at base on this characteristic of the technology. We note an arc of audience augmentation rising over many years from radio to television to the internet and its appurtenances, so what is being described is not entirely new. But the technology of audience augmentation is now available, in essence, to nearly everyone. It may be that this is turning the social Zeitgeist as a whole toward performative behaviors, not to exclude the spectaclization of politics.
If so, this might mean that only other-oriented personalities need apply for political authority in democratic political cultures, since quieter, contemplative types would be at a vast comparative disadvantage in political life. That some people thrive on crowd attention while others, more inner-oriented, are made uncomfortable by it might also explain why Facebook participation energizes some users but makes others unhappy and even desperately depressed, and why some find their inner bully while others seem especially vulnerable to bullying.
Last in this final category, the commercial electronic mass media deserve a quick drive-by. Media concentration comes in three concentric circles of ill-smelling sewage pools. We see it first in the entertainment industry: Corporate ownership, and hence marketing and distribution, are more concentrated than ever. Information technology makes all aspects of creating and running such amalgamations more cost-effective. Indie efforts persist, mercifully, but the lowest-common-denominator output—salacious, violent, dumbed-down, gimmicky, and crass—has over time trashed cultural norms and sleazed its way into American political discourse, to the point where the behavior of the American political class has become indistinguishable from any random episode of the Jerry Springer Show.
Certainly, absent the influence of the mainstream electronic mass media, it is impossible to imagine Donald Trump as President of the United States. His entire personality is a composite affectation designed to protect his ego; he is a man for whom acting is a way of life, a man who became a successful reality-TV personality because he required no additional training in eliciting and manipulating outrage, crassness, bullying, and deception.
As for the so-called news, nearly all non-print versions have become biographized infotainment, long on ad hominem-driven emotion and outrage incitement, short on facts and analysis. That it has become polar-ideologized into clickbait-stimulated self-referential wormholes is by now a commonplace, though it is no less accurate for so being. What the big-money advertisers want, they get, unless those in positions of political authority stop them. But those in positions of political authority show no inclination to stop them, because the same agglomerated pools of money increasingly finance their political careers.
Meanwhile, in the third and outermost pool, the tech giants’ business model has strip-mined investigative journalism and turned the local newspaper into an endangered species. The result is a mediocritized homogeneity or worse (note ONA’s near-monopoly in many parts of the United States), pockmarked with hollow “believe it or not” spectacle, which has the general effect of disorganizing the stock of knowledge a reader or viewer might have gained about pretty much every subject it touches.
What’s New?
Little of the foregoing is particularly new. Appropriate footnotes could probably be appended to every paragraph. The problem, as is often the case these days, is, again, that each of these data points is apprehended as a one-off. Yet every one of them has either a little or, more often, a lot to do with the amalgamizing impact of information technology on aspects of human behavior whose basic purposefulness has been with us for eons. For a very long time now people have cared about business trends, sought to get rich, saved and borrowed money, tried to avoid paying taxes, stolen state and other property according to rank and access level, understood risk but dismissed uncertainty, protested against ruling elites who in turn invariably tried to deceive them, tried to sell each other things, and occasionally displayed bad taste and worse manners. So what?
The “so what” is that all these functions, and many others left unmentioned, have been warped into top-heavy amalgamations spewing occasionally angry bottom-heavy reactions. Economies of scale are real, to a point. But excessive concentrations of power and money, which often amount to the same thing, are always problematic, whether public or private, individual or social, corporate or governmental—especially so when they expose the yawning hypocrisy of societies purporting to be egalitarian in aspiration and democratic in form. These conglomerations in every domain sacrifice balance, moderation, and healthy diversity in human—and hence imperfect, frail, and accident-prone—systems. The general impact of the net effect, illustrated in all eight manifestations described above, is to make human social subsystems simultaneously more opaque, because more complex, and more brittle, hence more prone to “surprise” catastrophic failure when they do fail.
What is becoming clearer now, to bring all the foregoing together, is that we face a perfect storm of concentrations of power simultaneously, involving data, money, perceptions, audience sizes, and media-enabled disinformation efforts. We have, in short, an unprecedented concentration of concentrations, an unprecedented bundling of amalgamations so complicated that, I dare say, no one really understands it completely.
Consider, to conclude, fish. We all now appreciate biodiversity. We revere the wisdom of evolution or divine design, depending on one’s tastes, in building interdependent balances in nature such that the whole displays remarkable resiliency. In the ocean’s food chains, for example, we recognize the existence of buffer mechanisms that enable adaptation to any one species’ diminishment or even disappearance, such that the entire system does not collapse. So how has it come to be that, in essence, fish are smarter than people, who have in recent times been frantically racing with each other to achieve precisely the opposite result?
We like to say that fish are the last to discover water. In this formulation, at least, they do eventually discover it. What about us?
1. Thomas Streeter’s deployment of the term in The Net Effect: Romanticism, Capitalism, and the Internet (2011) overlaps only partly with mine. Others have used the phrase to call attention to the way larger market scale makes it easier for the rich to become very rich. No one, to my knowledge, has used the term as I do here: to refer to an entire class of technology-driven effects.
2. See my “A Phenomenology of Disinformation,” Inference (Paris), Spring 2020.