Here’s an intriguing finding in comparative psychology from about a decade ago: When French people are jealous, they tend to get angry; when their Dutch counterparts are jealous, they tend to feel morose; but when Americans are jealous, they tend to check with acquaintances to make sure they haven’t let on. This finding comports with other studies suggesting that Americans are unusually prone to conceal emotions they don’t value, and particularly concerned about peer reactions. It also tracks the evolution of standards for jealousy in the United States over the past several decades. In the pantheon of emotions, jealousy was previously not given much weight one way or the other; more recently it has acquired the status of an undesirable emotion.
Not only do different societies manifest distinct patterns of emotional expression, but the same society can also change its signals over time. Americans in the 19th century were urged to play up grief, which was seen as an appropriate response to death and a bittersweet fulfillment of a family’s love. The result was elaborate cemetery markers, mourning symbols and rituals, and detailed etiquette for dealing with grieving survivors. Parents in the 1870s and 1880s could even order grief kits for their girls to use with dolls, complete with black clothing and appropriately sized caskets: It was never too early to learn this vital emotion. Obviously, these standards have almost entirely vanished. Dearly Departed Barbie or Grieving Ken are not among the otherwise innumerable options available to today’s youngsters in a society that tends to think that young people should be kept fairly distant from exposure to grief. In a remarkably short period of time, American standard-setters did an about-face on this complex emotion.
As these sketches suggest, different societies, groups and time periods evaluate emotions according to distinctive standards. A core emotional experience may be invariable in terms of innate human physiological responses, but it will be shaped strongly by cultural norms. Thus jealousy may be indulged French-style, or repressed à l’américaine. Grief may be taught or shunned, depending on the historical period. Of course, different individuals react to social norms according to their own personalities, so the subject has clear complexities. But there can be no doubt that emotions respond in part to standards societies explicitly or implicitly construct.
Sometimes these standards, though formed for apparently good reasons, turn out to have undesirable consequences. It was convenient in terms of family harmony in the 19th century, for example, to argue that “good” women should not be angry. But the injunction not only promoted inequality; it caused psychosomatic illnesses, as well. Thanks to a considerable overhaul of emotional culture in the United States beginning in the 1920s, American society has accumulated more than its share of counterproductive cultural baggage. Further, precisely because psychologists and popularizers have promoted new standards so vigorously, it has been difficult for most Americans to recognize the extent of damage.
To most of us, moreover, emotional reactions seem simply inevitable: They are what they are, and ordinary souls remain quite oblivious to their social construction. It’s high time to take a look at recent patterns. The results can be more than merely amusing, as with anecdotes of hidden jealousy or doll caskets. Some of these patterns, which we are not helpless to change, can be positively harmful.
Several factors have propelled the emotional redefinitions of the past several decades. One is the growing role of consumerism in encouraging emotions that help sell goods and discouraging those that don’t. One result: Americans make far clearer distinctions between “good” or happy emotions and “bad” or complex ones than they once did. It was in the 1920s and 1930s, for example, that anger, regarded as potentially useful for men in the 19th century, began to be labeled unequivocally bad by expert industrial psychologists who were eager to create a trouble-free workplace. There were more direct consequences from consumerism as well. Aside from advances in medical science and consequent lower mortality rates among the non-elderly, formal mourning declined from the 1920s onward because it did not sell many goods. Injunctions to get over grief and be happy might help a mourner put away the black dress and venture outdoors for a shopping spree.
A managerial and service economy also prompted a reassessment of standards suited to an earlier industrial age. Dale Carnegie and hosts of others taught salesmen to look cheerful all the time and never respond to customers with anger. The same lessons could apply to managerial hierarchies as well, where getting along is vital to a functioning enterprise. Industrial bosses increasingly told middle managers to keep their emotions in check, with industrial relations handbooks by the 1950s using revealing mottos like “impersonal, but friendly” for guidance.
Above all, a relatively new group of professionals—psychologists and their popularizers—translated changes in the social organization of work into the emotional field. In so doing, they became leading gurus, replacing religious authorities as prime definers of emotional standards. In the 1920s, people like Elton Mayo in the industrial-relations field, D.A. Thom, a key expert for the new Federal Children’s Bureau, and behaviorists like John B. Watson headed what would become a prolonged parade. Not surprisingly, psychologists such as Watson often found it necessary to argue that the norms of past generations must be reconsidered in light of modern knowledge. A host of now-forgotten but once widely circulated magazines like Current Opinion or Lippincott’s, as well as more familiar new staples like Reader’s Digest, began in the 1920s to label extensive grief as outdated as well as wasteful.
The role of media in the social construction of emotion began to increase, as well. This was not brand new: Powerful novels of earlier eras, such as Goethe’s Sorrows of Young Werther (1774), could move people to tears and even promote sympathetic suicides. Certainly the impact of Hollywood beginning in the 1920s showed the power of modeling presumably appropriate expressions of love as the basis for marriage. Later in the 20th century and into the 21st, however, the media became even more influential in shaping and defining emotions, suggesting, for example, what kinds of deaths were so tragic as to warrant massive public outpourings of grief—Princess Diana, the victims of 9/11. But by omission the media also suggested what kinds of tragedies could be accepted more passively—like deaths of foreigners in Rwanda or Baghdad.
The direct use of fear as a motivator increased, as well. A Lyndon Johnson ad in 1964—the one of the girl with a daisy under a mushroom cloud, used to symbolize Barry Goldwater’s dangerous bellicosity—was an early entrant in what turned into a media torrent. It became difficult for many people to know where their own emotions began and where those created by media representations ended. Thus public grief expands when media swarm to the scene of a tragedy, but not where the cameras are still. Scenes of anguished parents in a California child-abduction case prompt fears that make parents double-bolt their doors in New Jersey.
Together, these three factors—economic change, the transformation of expertise, and the “media-zation” of emotion—caused a substantial reassessment of what emotions were all about and how they should be handled. Three key principles seemed to flow from them.
First was a belief that children were emotionally fragile, requiring careful management by responsible parents. This belief stemmed particularly from what psychologists thought they were learning about children, and also from the rapid reduction of the birth and infant-mortality rates, which gave adults more time to worry about the emotional vagaries of individual children. A whole spate of new childrearing literature poured out from the 1920s onward, with people like Sidonie Gruenberg becoming household names, even before the triumphant advent of Dr. Spock in the 1940s—all bent on teaching strategies to protect the emotionally vulnerable child. The increasing sense that parents were accountable for their children’s emotional maturation was a huge change in and of itself, adding greatly to what many parents worried about. By the 1930s, letters to outlets like Parents’ Magazine abundantly testified to this extension of parental responsibility. New attacks on using guilt to discipline children because of the damage to fragile self-esteem were another sign that times were changing on the childhood front. Experts like Gruenberg intoned that “traditions of guilt” must be overthrown in areas like toilet training. Several years of diapers were well worth protection from emotional trauma.
Second, shading off from new concerns about kids was a growing sense that emotional intensity, with few exceptions, was neither socially useful nor personally healthy even for adults. New attention to problems like high blood pressure contributed here, but so did the hope that people could get along smoothly in a corporate, sales-oriented work environment. This is where the idea of impersonal friendliness struck a chord. This was also the context in which the kind of intense mourning recommended in the later 19th century was now revisited. By modern standards, new authorities like Amy Vanderbilt argued in her Complete Book of Etiquette (1952), intense mourning would be a sign of emotional instability and an inappropriate burden on friends and acquaintances. Here, too, was a big shift from earlier times.
Third, as part of raising modern children and bypassing intense emotion, new popularizers from Gruenberg to television’s Mr. Rogers often told Americans that talking about emotion was a good path to mental health. Ventilating emotion was far better than actually experiencing it. So experts like marital counselors in the 1960s urged spouses to say, with complete calm, “Do you know that makes me mad?” (a tactic briefly and questionably known as “Rage Release”). That gave many Americans a misleading sense that they were freer and more relaxed than their stiffer, stuffier ancestors. It was all right, now, to tell one’s friends about how angry one had felt—often as a substitute for displaying anger directly. Indeed, the social standards for controlling anger remained quite demanding—the Rage Release folks quickly added, “You don’t want to worry or irritate anyone”—but the experts did not encourage Americans to think about this aspect of the matter.
Grief, Anger, Fear
These key changes in goals and tactics—talk about emotion rather than express it in behavior, monitor children’s feelings and avoid using guilt to socialize them, and so on—were applied to a range of emotions, but with some specific additional twists in each instance. This was the framework in which jealousy, for example, received new attention, with adults scurrying around to make sure they had not come on too strong. Popularizers like Gruenberg also raised new concerns about guilt, speculating that this, too, might be a counterproductive emotion. Sure enough, progressive school authorities, beginning in California in the 1950s, increasingly emphasized self-esteem over shame and guilt, with results that can certainly be debated.
But the three emotions for which redefinition mattered most were grief, anger and fear. Each received its own rendering, but experts and popularizers reviewed all three in light of the children’s supposed fragility and the new uneasiness with emotional intensity. The process began in the 1920s but rose steadily for decades, with a growing popular response and every sign of considerable internalization. It was from this emotional trinity, finally, that the most questionable public consequences emerged.
Grief clearly suffered under the new implicit rating scheme. Campaigns in the popular magazines began early in this instance, even before the 1920s. The denunciations of tradition were, ironically, surprisingly passionate. Grief was now cast as an outdated indulgence that caused people to waste time and money better devoted to more constructive goals. Etiquette manuals such as Amy Vanderbilt’s 1952 entry shifted focus from suggesting polite treatment of mourners to advising those grieving to be careful not to encumber others. According to the new standards, grief was best handled promptly and privately. People who could not manage this should see a therapist, because they were a nuisance to themselves and others. Sure enough, “grief work” in therapy, as social scientists such as Helena Lopata argued in counseling older widows in the 1970s, rested on the assumption that excessive grief was a problem to be overcome. Many trappings of grief, including mourning symbols, declined rapidly, as did time granted off from work to deal with sorrow—the new standards had teeth. Often, as well, children were simply kept away from grief scenes altogether.
Anger, never an emotion in which any society can indulge too freely, lost standing as a useful motivator. Not surprisingly, personnel manuals picked up the new signals first: Anger and modern work environments did not mix well. From Elton Mayo in the late 1920s onward, industrial psychologists urged managers and foremen to keep their own emotions under wraps and to manipulate employees so that the workers’ anger could be dismissed as immature. A host of training programs—Total Quality Management is an example from the last decade that was widely deployed in corporations like Xerox—aimed at demonstrating that anger was an “unproductive” response. Childrearing manuals, including Dr. Spock’s, picked up the same signals: Revealingly, in their indexes, “anger” was now relabeled as “aggression.” Responsible parents, these authorities insisted, should teach children to keep their anger under wraps, because it could so readily veer out of control and might create dysfunctional adult personalities. In the normal course of daily life, at least as painted by modern experts, anger had no legitimate place at all.
Fear was another obvious target. Here was an emotion rarely welcomed in any event, now easily seen as an intrusion into the fragile psyches of the young. As early as the 1930s, childrearing manuals such as those by Gruenberg, who became director of the Child Study Association of America, began warning parents not to urge their kids to be brave. Their fears would be too overwhelming for this approach. Even the American military gradually picked up some of the new signals by World War II and even more in later conflicts, progressively allowing more open ventilation of fear, rather than assuming that only malingerers would not be able to keep it bottled up. Thus 1991 Gulf War pilots talked freely of their fears with the press—something that had been anathema in the stiff-upper-lip traditions of the world wars.
Fear, however, could not be handled as anger was now supposed to be. The idea that traditional injunctions to overcome fear through courageous self-control were dangerous in themselves was a key innovation. Childrearing manuals urged this explicitly: Fear must be avoided as much as possible. Revealingly, by the late 1930s a new breed of superhero—Superman was the pioneer—was noteworthy for having no fears to face down. As Parents’ Magazine put it directly in the early 1950s, “Unless some grown-up helps them, each frightening experience leaves [children] weakened for the next assault.” Strategies focused on keeping children away from fright and danger and providing massive reassurance when the emotion intruded anyway. Avoidance was the goal.
These new standards for grief, anger, fear and other emotions, which gradually made inroads in the American emotional landscape from the 1930s onward through the vigorous proselytizing of family experts and human relations authorities alike, did not take full command. Various individuals and regional, ethnic and religious subgroups accommodated the new standards in different ways. The spread of funeral homes, for example, facilitated some expressions of grief in controlled settings, modifying the fiercest hostilities of the modern keep-it-under-wraps camp. Individuals might still get angry, even in settings where the emotion was disallowed in principle. A chief executive might urge anger management on his subordinates, only to indulge his own rage in bossing them around—a not infrequent manipulation of the new norms. More broadly, a few outlets for anger did remain, though they were exceptions that to a large extent proved the rule in normal life: Sporting events, most obviously, gave many people a chance to watch anger in action and to express it in the stands, a popular safety valve. A number of Americans still preached and exemplified courage, insisting that challenges be met head on and fear overcome.
As new emotional standards made headway in American life, furthermore, they brought some obvious benefits. The new distaste for grief helped create the emotional environment in which medical responses to severe illness could gain ground: Death fighting took precedence over death indulgence, and the approach often worked. Doctors notoriously hated to deal with grief, but they were also eager to prevent its occurrence by pulling patients through. More broadly still, though this is partly a matter of taste, a society urged to be cheerful may be preferable to one easily moved to melancholy in any event.
New strictures against anger also arguably improved relationships in a number of settings. Partly, of course, because we are influenced by the new standards ourselves, most of us probably think that a limit on the use of anger by teachers against students is a good thing in modern classrooms. Having clerks and flight attendants trained to smile through annoyances is arguably preferable to the surly (though perhaps emotionally freer) shopkeepers of Soviet Russia. On a more basic level, street fighting has declined in part because most Americans have accepted at least a portion of the anger-control standards. Violence, to be sure, is widely represented, though not always with anger. (The subjects are actually different: It is a failing of contemporary American culture that anger and violence are routinely linked.) Political mudslinging appeals to anger, but candidates publicly treat each other politely, far more than was the case a century and more ago. The issue is not uncomplicated, but a host of standards and disciplines inhibit direct expression of anger against children, customers or co-workers—the settings in which most Americans live—and that is not such a bad thing.
Protecting children from fear obviously had a host of good results, too, particularly as they extended into new safety devices designed to make life more secure and less scary. To be sure, the nation became almost manic at some points: Witness the national embrace of plastic gloves, surgical masks and antibacterial soaps against fear of the wayward germ. It’s a price that might be worth paying. Attacks on venerable fear-testing institutions, like extreme fraternity hazing or unregulated Halloweens, probably seem progressive to most Americans. Surely letting soldiers admit they’re scared, rather than labeling them cowards, is an advance that arguably does not detract from actual military performance.
The list could go on, but two points are abundantly clear. First, the new standards have interacted with the variety and common sense of many Americans, so they were not always carried too far. Second, they brought some advantages in their wake, in emotional life and the kinds of behaviors and policies that result. But they have also brought disadvantages, and some of these are too rarely discussed, indeed too rarely perceived.
One of the drawbacks of any emotional culture is that it tends to hide its dirty linen amid repeated insistence on its standards and their apparent validity. To take the most obvious example, it’s hard to talk about the downsides of American insistence on anger control when advocates of such control—both institutions and individuals like Carol Tavris or George Will—keep telling us that we’re still too angry. In this case in particular, well-intentioned editorialists not only assume that anger is wrong; they also exaggerate lapses in the interest of promoting fuller adherence—obscuring the fact that it’s the standards, not the existence of anger, that are new. Road rage is an interesting case in point, seeming to highlight a growing emotional problem, when in fact behaviors have not measurably changed. It’s difficult to step back from the standards themselves and from their exaggerated claims, quite apart from the presumptuousness of putting a nation on an analytical couch.
That said, there are real costs to the emotional culture that surrounds us, and some of them are severe. They are personal, but also political. They even color foreign policy.
Grief provides a welcome entry point because the critique of the drawbacks of modern grief standards is already reasonably well established. The modern approach to grief, predicated on a low death rate before old age, is almost inherently vulnerable when a child or young adult dies, generating a level of emotion that has no ready outlet. Many marriages break up amid confusions over grief and guilt if a child does die. Many adults turn to strangers in groups like the Thanatos or Compassionate Friends organizations, whose only links are a shared tragedy and exposure to the insensitivity of well-meaning friends who lack their experience. It’s easy to feel emotionally isolated in an uncomprehending environment where grief is so widely shunned, and no amount of therapy readily compensates.
For an even larger number of Americans, distaste for grief contributes to the constant pressure to add on more and more stages of heroic medicine, often well beyond what the patient actually wants. Better to keep fighting than to deal with the grief that would accompany an earlier acceptance of the inevitable. Even living wills are frequently ignored by relatives (and malpractice-wary hospitals) who press to keep terminal patients alive, lest they face an unacceptable mix of grief and guilt. Small wonder that many recent experts, from Elisabeth Kübler-Ross in the 1960s to social scientists like Margaret Stroebe currently, have called for a reassessment of the modern emotions surrounding death—without, to date, more than modest effect.
Efforts to evade fear have generated the ironic effect of contributing to radical miscalculations of risk. Hoping to prevent the need for fear, and sometimes lacking much socialization in dealing with the emotion, many Americans magnify a sense of danger: Since fear is unacceptable at any level, gradations may seem to be inconsequential. Barry Glassner’s 1999 book The Culture of Fear accurately pinpoints a variety of serious misperceptions of risk. Thus parents routinely exaggerate the dangers of child abduction and sexual predation by factors of several thousand percent (as against the less than a hundred average annual cases of stranger abduction), and some hovering contemporary parents keep their kids under almost constant watch to protect them from threats that are in fact quite remote. A recent Miami case involved parents who would not let their kids walk down the driveway to collect mail, lest they be spirited away. In the mid-1990s, 62 percent of polled Americans said they were “truly desperate” about crime, at a time when rates were in fact falling rapidly.
The personal results of these kinds of fear-soaked miscalculations are hard to calculate precisely, but many Americans are more anxious about certain issues than they objectively need to be. Too many worry about their children far more than is desirable, and overprotect and overorganize them in consequence. A few recent observers like Jackie Orr even argue for more severe outcomes of poorly guided fear, such as an increase in the incidence of panic attacks, though the jury is still out on this one.
Does the modern campaign against anger have personal results? Certainly many people feel inchoate frustration that an emotion they experience has so few legitimate outlets. The popularity of scream therapy, developed by adepts of the Rage Release school, where people two decades ago were urged to go into their closets to shout at the top of their lungs, was revealing: Since it was not acceptable to show anger to customers or colleagues but it built up anyway, find a way to express it—but carefully, privately, where it would have no impact. Others worry that they don’t measure up to the modern standards, as pundits like George Will keep claiming that the emotion is out of control; the popularity of anger management programs suggests a perceived personal problem. Anger, in other words, can create personal uneasiness of several sorts.
The larger downside of the anger standards is more political than personal, however. The widespread attempt to brand public anger as a sign of immaturity obviously inhibits protest—which is exactly what the industrial psychologists who sought to limit emotion in the workplace intended. Loren Baritz’s survey of the field, The Servants of Power, makes this point conclusively. After all, employers were paying the freight. Foremen were taught to make an aggrieved worker repeat a complaint three times, on the assumption that, by the end, embarrassment would override the anger and the tension would drain away. In personal and collective agitation alike, it is easy to brand anger as immature and its goals therefore less creditable.
The nation has certainly developed elaborate political symbolism around anger constraint. Robert Reich, the former Secretary of Labor, tells how colleagues insisted when he got to Washington that avoidance of anger was a top priority. Any public display would be a sign of weakness. Or think of Bill Clinton just a few months ago when he became visibly (and some might think, understandably) angry at accusations that his Administration’s laxity had led to 9/11. His annoyance, not the substance of the matter, is what hit the newscasts. More broadly, the media regularly subject leading political candidates to anger tests, with news “moderators” like Tim Russert tossing debate questions at them that might irritate any reasonable person, just to make sure that they will in fact show robot-like restraint.
Routines of this sort may unwisely constrain the political process, forcing an unhealthy level of dissimulation (as with jealousy, polled Americans are unusually interested in masking the anger they feel). A study by Shulamit Angel showed Americans tops on the list of concealers, compared to Chinese (who saw useful purposes in some anger), Greeks and Jamaicans. But the main point is the way the pressures to downplay public emotion—a few conservative causes like right-to-life excepted—represent wider efforts to portray passion of the sort that was once labeled righteous indignation as childish at best and dangerous at worst. This is a needless, artificial constraint on the legitimate expression of grievance, limiting a desirable control mechanism in an American society heavily weighted toward protection of the well-to-do, indeed limiting a spur to social progress. It’s no accident that American labor protest declined after the 1960s just as the new anger standards were gaining ground, though other factors were clearly involved as well.
This is not a plea for anarchic rage, but for a recognition that we go too far in depriving social-justice efforts of the emotions that make them a vibrant force. A society that admits the validity of managed anger is arguably healthier than one that insists on maximum restraint, punctuated only by personal mudslinging as the admission fee for politics.
Modern grief standards also have policy implications. American efforts to constrain grief coexisted with the nation’s emergence as a global military power, compelled to risk military casualties in defense of its new position. During the world wars, the tension was handled simply: Patriotism called for control of grief, lest reactions to battlefield deaths give aid and comfort to the enemy. With subsequent conflicts, where goals and successes were less clear and media coverage more graphic, public response became more uncertain, and the resentment at being exposed to grief more acute. The change first emerged in the Korean War, when media shifted from stiff-upper-lip accounts of brave sacrifice to highly personalized stories and pictures of individual soldiers who died. Public emotional involvement has only increased since then.
The most obvious result of all this has been accelerated military efforts to limit American combat deaths, even at the cost of higher casualties on the other side, through high-altitude air strikes as in the Kosovo war, for example. More manipulative tactics emerged most blatantly with the current Iraq war, where military coffins are brought back at night, away from media attention. Death counts and moving personal stories have not proved avoidable, but at least grief-provoking symbolism may be minimized. That distaste for grief constrains and distorts military policy is both undeniable and intriguing, though some of the results may be defended, particularly in steadily growing efforts to save the lives of wounded soldiers.
The impact of the new uneasiness with fear is more unambiguously unhealthy. Americans have become increasingly manipulable through evocation of fear and through miscalculation of risk. Politicians have milked unrealistic fears of crime unconscionably, encouraging in the process a massive increase in prison populations. It was in the 1980s, after a period of hesitation about the backfire potential of using fear, that the emotion began to climb into the media big-time—as even weather reports became more dire, provoking needless panic buying. Manipulation and the larger unfamiliarity with controlling fear clearly have affected responses to terrorism, creating a sense of families in danger that has been distressingly widespread and durable.
Believing they should not have to face fear, Americans have found it difficult to draw boundaries when the emotion is unavoidable. Personal stories about 9/11, gathered in a George Mason University archive, detail an extraordinary sense of fright, even on the part of people far from the actual attacks, in contrast to dominant reactions a half century earlier to Pearl Harbor. Above all there was an immediate sense of one’s own family being in danger, combining arguably exaggerated fear with a striking degree of personalization. Not surprisingly in this context, the group that reacted with greatest panic to the subway bombs in London in 2005 was the American military, which initially banned travel by U.S. troops in Britain until the British reaction forced them to climb down. Arguably at least, fear bordering on panic has contributed to a narrowness of focus, retaliatory overreactions and heedless expenditure in national responses to terrorism. To take a single but revealing example, when a catalogue for a Worcester, Massachusetts, exhibition of medieval art can casually compare 9/11 to Europe’s Black Death (which wiped out more than a quarter of the population), there is something out of whack.
Policy responses are admittedly partisan issues, but clearly the new visa barriers thrown up against foreigners have been counterproductive, treatment of suspected terrorists has been occasionally unconstitutional, and even measures like shoe inspections at airports have been designed more for emotional show than for any security benefit. In an admittedly difficult circumstance, fear easily outstrips rational response. When political candidates in 2004, 2006 and again this year in anticipation of the next presidential sweepstakes try in essence to portray respectable opponents as weak on fear protection—“If my opponents win, terrorists will attack”—they demonstrate a dangerous emotional variable in the contemporary political process as well as a sad disregard for ethical campaigning.
Outright manipulations and over-reliance on media for emotional signals add to the burdens of dubious emotional standards. Many Americans have bought into beliefs about children’s emotional fragility that discourage sensible socialization in emotion management—what used to be called character building. Even beyond children, too many adults reflect excessive distaste for emotional intensity, a sense that if emotion is not avoided it will become overwhelming. The intriguing popularity of the idea of being “cool”, which has come to carry many meanings but stems from the new importance of veiling emotions (behind stylish dark glasses, if necessary), signals a nervousness about deep emotion that Americans should revisit. What we need, as against the ascendant emotional culture of recent decades, is greater emphasis on distinguishing between socially useful encounters with grief, anger or fear and counterproductive excess.
We haven’t managed to create the world that modern American emotional standards imply, a world where fear and grief can be avoided and anger is unnecessary. It’s time to scrap the project, born of excessive optimism about the potency of psychological advice and the emotional joys of consumer society. The task will be challenging, but standards constructed only within the past half-century plus can be reviewed and redone.
First, of course, we can and should take a new look at the manipulators: those pop liberal psychologists and well-intentioned counselors, those advertisers and officials who play on American anxieties about emotion for causes ranging from safe driving to “wars” on terror. They mislead us about the consequences of productive anger and play on our uneasiness with fear. We also need to probe our own assumptions about emotions themselves, to realize that we have options that current standards conceal—to take more open pleasure in courage rather than ventilating fear, to see uses in personal grief or collective anger rather than belittling their manifestations. We can and should rewire some of the connections that generate emotional response in contemporary American society. They are ours to construct.