Not long ago the New York Times ran a major, front-page story on the military-strategic implications of “fifth generation” information technology—the so-called 5G network. Specialists probably found little new in the article, but for a general reader it both provided a useful summary and carried a judicious tone of stolid concern short of alarm. We at TAI have been focused on this subject as well, employing a similar tone (note our essay coauthored by a former DNI making similar points, published about four and a half months before the NYT piece).
What caught my attention in this New York Times feature was less the main lines of the story than the matter-of-fact way the piece was set up. Inadvertency often signals submerged assumptions, rather like Freudian slips but without their supposed psycho-diagnostic value. Even before the page jump in the print edition, the reader is informed as follows:
It is the first network built to serve the sensors, robots, autonomous vehicles and other devices that will continuously feed each other vast amounts of data, allowing factories, construction sites and even whole cities to be run with less moment-to-moment human intervention. It will also enable greater use of virtual reality and artificial intelligence tools.
But what is good for consumers is also good for intelligence services and cyberattackers….
So, good for consumers as well as for cyberattackers, with the latter long since neologized as a single word—which tells us how deeply the focus of our language pioneers is sunk in the wonders of technovelty. But where does this bivalent choice leave me? I’m not a cyberattacker. Even if I wanted to be one, I wouldn’t know how. And I take umbrage at the notion that, here in the presumably post-Skinnerian 21st century, “consumer” is a synonym for human being—and not just on Eighth Avenue in New York City.
To be a consumer, preferably a discerning consumer, has become by default the apogee of what it means to be a human being, at least among those on the dull cutting edge of lapsing Western modernity. No one has ever formally argued the point as such, as far as I am aware. As Oren Cass has argued brilliantly, it has rather oozed over us slowly and silently thanks to the seemingly antiseptic “scientific” ministrations of garden-variety macroeconomists, whose accrued authority beyond their data and competence has become a plague of our times.
So it is that, as older, religiously infused ways of comprehending human purposes and character have eroded, we are left with forms of materialism alone—in our case one that exalts consumption and satiety over production and work, the corpulent and demobilized over the sleek and creative. It’s ironic when you think about it, given the ubiquitous marquee presentation of Cold War ideologies not that long ago: The bad guys were the rank materialists, remember? (Some would, possibly, be reminded of the Civil War, Tilden-Hayes, and all that: The North won the war but in many non-trivial ways the South won the peace; mutatis mutandis, the West won the Cold War but the Soviet Union had to die to make Marxoid caricatures of social realities safe for Western academics, other intellectual poseurs, and Wall Street mavens.)
And so we have come to worship at the altar of GNP, as if most Americans knew where that metric came from or what it actually measures. We minister to the god of aggregate demand, whose proverbial apostle is Henry Ford—as in, pay the workers more so they can buy more stuff, and around we all go, and how much made-to-break garbage this routine dumps into the environment, nobody knows. We counterfeit the world by means of numbers, obsessing over employment, labor force participation, trade balance, and interest rate statistics, none of which existed in a ponderable format before around 1913—and yet somehow the nation persisted and even thrived.
What lies beneath this proclivity to enthrone the consumer as the paragon of human virtue? As usual, it’s not any one thing alone. The power of the stripped-down, secularized Calvinist premise—that material affluence is a sign of a moral and worthy people—should not be underestimated, and anyone wanting to understand its improbable origins should consult Max Weber’s still-standing 1905 stab at an explanation. But if I had to place my bet on one major driving cause, it would be our abiding technological optimism.
Technological optimism is hardwired into the American psyche, a psyche born in the womb of the Enlightenment and suckled by the astonishments of the Industrial Revolution. As a variant of developmentalist ideology, it is so strongly present in our presumptions about the connections between material and spiritual progress—presumptions that display the twinned Calvinist and Whiggish aspects of our inheritance—that no amount of disconfirming evidence can budge it.
That optimism walks hand in hand with visions of an endless parade of Horatio Algers going from rags to riches by dint of having been in on the commercialization of some new technological marvel, whether real or just snake oil made of metal. The American Dream may be unfettered equality of opportunity, but the zenith of the theologically denatured Calvinist business plan is to do well by doing good. Make it, market it, consume it, smile broadly at it, and don’t look back—at least until something or other bites us in the ass.
The only major technological breakthrough in American history that our leaders did not bid be taken to market immediately was nuclear energy. For that they created the Atomic Energy Commission in 1946 (and good thing they did). But the apparent motto for everything else—from electrification to mechanized agriculture, from television to the automobile and the highway system birthed in its wake, from the birth control pill to the iPhone—has been let ’er rip. And why not? A lot of people made a whole lot of money, and the almost unimaginable affluence we now experience as a society is unthinkable save in the context of general technological advancement.
So regular exposure to artificial lighting causes precocious puberty, especially in females. So what? So mechanized agriculture has tended toward corporate ownership of land and the food supply, as well as the rise of monoculture and all the environmental implications thereof. So what? So television has contributed mightily to the decline of deep literacy and the loss of both touch skills and integral community. So what? So cars and the highway system enabled the suburbs, which have eroded social capital and exacted major environmental costs. So what? The birth control pill, in tandem with other factors, has had dramatic effects on social structure, the labor profile, and marriage and family norms, including child raising—not all of them unalloyed joys. So what? Indeed, our unprecedented and broadly shared level of affluence itself has turned out to be a mixed blessing, contributing both to burgeoning social isolation and an array of “lifestyle” diseases. That’s what.
Don’t get me wrong: I’m no Luddite. If I had a time machine I wouldn’t go back into history and negate these developments, or thousands of lesser others. I wouldn’t relish our standard of living being cut in half. I’m not sorry the genome has been decoded, despite the many ethical dilemmas that achievement already poses, to say nothing of the dilemmas to come. It is dangerous when man plays God, yes. But to be created in the image of God means to be endowed with creativity. We are, I believe, meant to use that creativity, but also to struggle with its ethical implications. That is what being human is really about, and somehow just being a “consumer” doesn’t quite match up to the challenge.
The struggle of which I speak is not, it seems to me, best executed solely by dint of post hoc regrets—DDT, thalidomide, and Love Canal come to mind, but there are many others. Of course it is impossible to anticipate all the ramifications of major technological advancements, and knee-jerk application of the so-called precautionary principle has always struck me as evidence of psychiatric need. But the degree of difficulty we have in anticipating the effects of technological change is to some extent a function of the quality of attention we devote to trying. We don’t try very hard. Case in point: We once had an arm of Congress called the Office of Technology Assessment. It did some good work, so we abolished it.
Which brings me back to the point: The 5G network, effuses the NYT, “is likely to be more revolutionary than evolutionary. What consumers will notice first is that the network is faster—data should download almost instantly, even over cellphone networks.” Ah, our insatiable need for speed. Would it really be too much of a nuisance to just briefly review the known effects, so far, of third- and fourth-generation mobile communications technology on human cognitive functioning, and the various social implications flowing therefrom, before we run pell-mell through door number five?
Of course it’s too much. We as a society won’t do it. But here is just a short, raw list of what some of the downside implications may be, notwithstanding the upsides that comingle with them. We haven’t room here to argue and explain, merely to assert—and hope the perspicacious reader feels the thrill of verisimilitude.
First, the way we read has already changed profoundly. As a society we are progressively losing our capacity for deep literacy. We skim and jump around as we rush through materials, exchanging a fine sense of sequence, time, and logic for the getting of many gists, albeit poorly. Our attention spans have been shot to hell, and that already affects not only what we read but what we write, how we write it, and how publishers publish it. A collective dumbing down is unmistakable.
The growing inability to dwell with a text, whether fiction or non-fiction, is truncating our capacity for empathy, diminishing our facility for conceptual and syncretic thought, and discounting our orientation to genuinely difficult subjects. If it takes a longer time than we’re willing to invest to scope out a problem, then suddenly it’s not a problem to grapple with but a mystery to be warehoused like the Ark of the Covenant in the famous closing scene of Raiders of the Lost Ark.
Second are the physiological effects of marinating ourselves in mediated images (sight and sound, mainly) at the expense of direct ones. And ironically, this at a time when many of us increasingly credit the need to reconnect with nature—even as we retreat further from it despite ourselves.
Mediated images are counterfeit images. Mediated images are dis-embedded from tactile and social realities, lack full dimensionality, and so are more vulnerable to manipulation than direct images. Mediated images juice the cognitive demand to feed our novelty bias, tripping off micro-endorphin rushes and thus literally exhausting us. Hence the mass migration of media to shock-bar business calculations and the dominance of clickbait, and so the ease with which entrepreneurs of social division market political polarization via “fake” news and cacophonous accusations of fake news. Result: the more information we can access, the dumber we get.
Then there is the social isolation and sheer loneliness abetted by the technology, as well as the artificially acquired social autism afflicting so many young people. Our youth are less emotionally mature than they used to be at comparable ages, and that partly accounts for the proliferation of campus safe spaces and trigger warnings to spare our precious snowflakes from any untoward encounters with reality. It’s probably also part of what is behind the enormity of student-on-student gun violence.
So yes, indeed, let’s keep doing all this, and let’s do it even faster! Great idea that 5G…and so much money involved.
One almost hopes the Chinese win the race.