Looking for another book not long ago, I stumbled upon Allan Bloom’s The Closing of the American Mind. In 1987, it was a national sensation, a trigger-point for debate over the legacy of the sixties and its “counter-culture.”
Subtitled “How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students,” Bloom’s salvo attacked from the right. It was less a polemic than a closely reasoned argument, fortified with lofty philosophic learning and grounded classroom experience. A New York Times reviewer wrote that “it commands one’s attention and concentrates one’s mind more effectively than any other book I can think of in the past five years.” The Chicago Tribune said “it may be the most important work of its kind by an American since World War II.” Saul Bellow, in a gripping introduction, summarized: “It makes an important statement and deserves careful study. What it provides, whether or not one agrees with its conclusions, is an indispensable guide for discussion . . . a completely articulated, historically accurate summary, a trustworthy résumé of the development of the higher mental life in the democratic U.S.A.”
My copy of The Closing of the American Mind is a paperback showing scant evidence of close scrutiny: some three dozen pages are heavily marked with dismissive marginalia. Bloom took aim at my own generation (I was born in 1948), and its political complexion was anathema.
But times have changed and so have I. Re-opening The Closing of the American Mind, I discovered that Allan Bloom was prophetic. Even Bellow’s introduction reads as if it were written yesterday: “The heat of dispute between Left and Right has grown so fierce in the last decade that the habits of civilized discourse have suffered a scorching. Antagonists seem no longer to listen to one another.”
Taking aim at “cultural relativism,” Bloom attacked what we now call identity politics and a linked discourse stigmatizing “cultural appropriation”—a discourse that, to many my age, seems more impoverishing than nourishing for the “souls of today’s students.” For Bloom, a mounting failure to appreciate Western traditions of culture and thought was eviscerating the academy. He deplored a tendency to ecumenically equalize all cultural endeavors, old and new, East and West. In effect, he foresaw today’s all-purpose denunciations of the “misappropriation” of victimized cultures. As for “identity politics,” the term isn’t there, but the concept is, extrapolated from an exaggerated regard for the “other” and otherness—for Bloom, a force fracturing democratic community.
Bloom’s ultimate claim was that a generation out of touch with great music, great literature, and great traditions of philosophic thought—all unabashedly Western—is a generation diminished personally and emotionally. He linked this estrangement to diminished character and moral force, to a shallower sense of self and shallower personal relationships. Whatever one makes of his notoriously presumptuous disparagement of rock music (“it artificially induces the exaltation naturally attached to the completion of the greatest endeavors”) and of students addicted to drugs (“their energy has been sapped and they do not expect their life’s activity to produce anything but a living”), the “closed minds” and “impoverished souls” Bloom reported may in fact have become a double American malaise.
Re-reading Bloom, I am thunderstruck, because my inclination is to blame it all on social media and attendant technologies favoring vicarious experience. But Bloom’s 1987 narrative establishes an earlier start. He distinguishes my sixties generation from his eighties students, in whom tendencies that we initiated had reached a dead end. His book may in effect be read as a tale of unwanted, unanticipated consequences.
What happened first? Thinking back to my own collegiate education, I discover one answer of sorts. Whether my answer has national relevance I cannot say. But I know that Swarthmore College, as I encountered it in 1966, was—notwithstanding its reputation as the nation’s pre-eminent liberal arts institution—languishing in a state of advanced obsolescence. And at Swarthmore, at least, that obsolescence triggered the seismic upheaval that Bloom decried.
I graduated in 1970 Phi Beta Kappa with Highest Honors. I also graduated vowing that I would never again submit to learning in a classroom environment. My Swarthmore class of 1970 set some kind of record for the lowest percentage of graduates moving on to graduate school. We felt we had been schooled quite enough.
In four years, I did not have a single teacher who was not a white male. Though I majored in American History, there was no mention of Frederick Douglass or W. E. B. Du Bois or Crazy Horse. Though my interests were broad, no interdisciplinary majors were permitted. Though I minored in Music, played the piano, and sang in the chorus, no academic credit was allowed for creative pursuits. In fact, the campus did not have a concert hall or theater of consequence.
At Swarthmore, in 1966, neither the Political Science nor the Philosophy Department offered any courses in Hegel or Marx, and the Frankfurt School was unheard of. The Department of Sociology and Anthropology was brand new, staffed by fresh hires certain not to rock the boat. Physical Education was mandatory for freshmen and sophomores.
So far as I could ascertain, the college’s major asset was its student body, culled by an Admissions Director who favored assertive Jewish types from New York City and its environs. The big personalities on campus were not the professors. When in 1970 Swarthmore students went on strike—an act of revulsion toward Nixon and Vietnam—the faculty response exacerbated the fracture. At a mass meeting in Clothier Hall, our sociologist-in-chief urged everyone to return to class and resume learning. He did not notice that we were in the midst of an institutional revolution crammed with pedagogical content. The senior member of the Economics Department told students they were “transient parasites” peripheral to the institution’s ongoing identity. And yet for many of us our profoundest, most charismatic teachers were our peers. I was myself delegated to inquire whether the Political Science Department would consider adding a course in Marx. I was informed by a sneering Associate Professor that a mini-course for one-quarter credit might be considered—and expanded if there was anything left over to teach.
All of this occurred a year after the Swarthmore African-American Students Society (SASS) occupied the admissions office and demanded that the college enroll more black students (there were 47 out of a student body numbering 1,150), black teachers (there was one), and black administrators (there weren’t any). Days later Swarthmore’s President, Courtney Smith, died of a heart attack.
After I graduated, I felt impelled to investigate what had happened over the course of two years of institutional chaos. I wrote a 9,000-word account based on personal experience and follow-up interviews: “When Laos Was Invaded, Nobody Budged.” My topic was the chill that had descended upon the campus, such that the Nixon/Kissinger incursion into Laos, in 1971, was a tragedy unnoticed a mere year after Vietnam had torn the place apart. My findings were published in Change Magazine (Summer 1971)—a journal, funded by the Ford Foundation, mounting “a national voice for campus reform.”
After re-encountering The Closing of the American Mind, I re-read my own counter-account of “how higher education has failed democracy.” I was unsurprised to discover that it utterly lacked Bloom’s gravitas and learning. But it proved exceptionally informative nevertheless, both for its detailed reportage and for what it recorded of my state of mind post-Swarthmore.
I was reminded that the college had in fact shown an incipient awareness of its obsolescence. In 1966, President Smith convened a Commission on Educational Policy (C.E.P.) with a mandate to recommend specific proposals for change. It swiftly proved too little too late. I remember my own brief involvement, being questioned by a distinguished literary historian, a pillar of the humanities faculty (at a time when the humanities defined the public face of Swarthmore and kindred top-tier colleges), about the “intellectual content” of playing a musical instrument. My answer was a fumbling attempt to articulate precisely that. In retrospect, I should have pointed out that this was the wrong question, that—as Bloom would write—the arts contribute invaluably to character and personality, to emotional and psychological well-being.
But Swarthmore’s criterion was inflexibly cerebral. The C.E.P. report wound up devoting 16 pages to “The Creative Arts.” It was determined that “artistic activity is intelligent activity” and that “creative work in the arts should be given a place in the college curriculum.” As I reported in Change:
But the stress was at least as much on “improving and expanding” the arts program for “amateurs” as for granting course credit for those students who “will have the desire and the talent to pursue their artistic work more deeply . . . than will be possible in spare time alone.” And it was proposed that work in the creative arts be limited to a maximum of a mere four credits (out of a four-year total of 32). This meant that no autonomous creative arts departments in any field would be set up, which meant that there would be no major in any field of creative art. Furthermore, only some of the creative arts were deemed sufficiently intellectual as to warrant credit; specifically, writing, theater, “visual arts” and music were okayed for credit, and dance, pottery, and film weren’t.
The C.E.P. proposals have since been adopted. Swarthmore’s fledgling community of creative artists has greeted these innovations with expressions of ingratitude ranging from fatalistic shrugs to bitterly sarcastic sermons. A group of students who formed a committee to work towards more credit for the arts has given up. . .
Superseding the C.E.P. was a radical faculty/student initiative. Two new hires in philosophy—one a Marxist, the other a Socratic Hegelian—proved intent on transforming the learning environment. They fundamentally rejected the Anglo-American empiricist tradition, including behaviorism in the social sciences. Their orientation, wholly new to the curriculum, was Germanic and holistic. Their acolytes read Hegel, not Marx. A new philosophy course, “Methods of Inquiry,” became a magnet for a small group of dissident teachers. Its overt purpose was to change Swarthmore College, if not the world.
The backlash—a virtual Thermidor—was piloted by the Political Science Department. The faculty dissidents disappeared. Both the Director of Admissions and the Provost were Swarthmore political scientists; the latter, Charles Gilbert, had headed the C.E.P. Re-reading my article for Change, I am reminded that he regarded the college’s rigid departmental structure as a safeguard against “letting intellectual standards slide.” Rejecting American Studies as a proposed major, he said “there’s not really any kind of intellectual discipline there.” Swarthmore engaged a Columbia University Professor of Higher Education, Max Wise, to examine “college governance.” The Wise Report recommended open faculty meetings and governance responsibilities for students. It was tabled.
Robert Cross, who succeeded Courtney Smith as President in 1969, was a historian with a long view that proved paralyzing. In 1971 he was replaced with the aptly named Theodore Friend. I was one of the many recent Swarthmore graduates who mobbed the living room of Clark Kerr (Swarthmore ’32) when President Friend visited Berkeley to introduce himself to West Coast alumni. I was surprised to discover, from his smiling remarks, that the college had suffered a kind of head injury inflicted by hooligans—from which it would now speedily recover as from a bad memory. It seemed not to have occurred to President Friend that in Berkeley, of all places, the hooligans would be in the room.
That was half a century ago. Swarthmore today has an African-American president and an African-American provost, both of whom are women. The campus has long enjoyed superior performing arts facilities. A 1986 Informal History of the college, by Richard Walton, painstakingly revisits the 1969 crisis, casting the SASS students as agents of necessary change. Walton writes: “It is generally agreed that Swarthmore had not conducted a vigorous campaign to obtain more black applicants, had not done enough to raise scholarship funds for them, and had not been sufficiently willing to accept ‘risk’ students.”
Swarthmore’s current Program of Study, on its website, invites students to “Design Your Own Major.” Dance, Theater, and Film & Media Studies are all new since the crisis years. Allan Bloom, I am sure, would not have approved of “Gender and Sexuality Studies” or “Peace and Conflict Studies,” social justice majors that in his view would “confuse learning with doing.” Re-encountering my 1971 self in Change Magazine, I find that I, too, was all about tearing down the Ivory Tower, impatient with disinterested inquiry, upset by Vietnam and the college’s failure to “take a stand.” In retrospect, our contempt for Nixon was justified (it wasn’t about the draft). Though some senior faculty members denounced us as naïve and intolerant (I remember being compared to the adherents of Adolf Hitler), the college’s intellectual stasis was itself naïve.
The resulting dynamics of campus change, nation-wide, were dialectical—Hegelian. And today’s culture of political rectitude is a fated over-reaction: a fulfillment of Allan Bloom’s prophecies. Bloom may have been aloof to the sources of the campus discontent whose outcomes he decried. But I greatly fear that he got the outcomes right.
While I am long out of touch with the affairs of my alma mater, I have for four decades devoted my professional life to studying and writing about the history of classical music in the United States. As a concert producer, I frequently have occasion to partner with colleges, universities, and conservatories. I also teach as a visiting professor. I have discovered that it has become impossible to pursue historical inquiry without encountering new and confounding obstacles.
American classical music is today a scholarly minefield. The question “What is America?” is central. So is the topic of race. The American music that most matters, nationally and internationally, is black. But classical music in the United States has mainly rejected this influence, which is one reason it has remained impossibly Eurocentric. As the visiting Czech composer Antonin Dvorak emphasized in 1893, two obvious sources for an “American” concert idiom are the sorrow songs of the slave, and the songs and rituals of Native America. Issues of appropriation are front and center. It is a perfect storm.
Dvorak directed New York City’s National Conservatory of Music from 1892 to 1895, a period of peak promise and high achievement for American classical music. It speaks volumes that he chose as his personal assistant a young African-American baritone who had eloquently acquired the sorrow songs from his grandfather, a former slave. This was Harry Burleigh, who after Dvorak died turned spirituals into concert songs with electrifying success. (If you’ve ever heard Marian Anderson or Paul Robeson sing “Deep River,” that’s Burleigh.) During the Harlem Renaissance, Burleigh’s arrangements were reconsidered by Zora Neale Hurston and Langston Hughes, both of whom detected a “flight from blackness” to the white concert stage. Today, Burleigh’s “appropriation” of the black vernacular is newly controversial. That he was inspired by a white composer of genius becomes an uncomfortable fact. An alternative reading, based not on fact but on theory, is that racist Americans impelled him to “whiten” black roots. Burleigh emerges a victim, his agency diminished.
Compounding this confusion is another prophet: W. E. B. Du Bois, who like Dvorak foresaw a genre of black American classical music to come. The pertinent lineage from Dvorak to Burleigh includes the ragtime king Scott Joplin (who considered himself a concert composer) and the once famous black British composer Samuel Coleridge-Taylor, urged by Du Bois, Burleigh, and Paul Laurence Dunbar to take up Dvorak’s prophecy. After Coleridge-Taylor came notable black symphonists of the 1930s and 1940s: William Grant Still, William Dawson, and Florence Price, all of them today being belatedly and deservedly rediscovered. But the same lineage leads to George Gershwin and Porgy and Bess: a further source of discomfort. I have even been advised, at an American university, to omit Gershwin’s name from a two-day Coleridge-Taylor celebration. But Coleridge-Taylor’s failure to fulfill Dvorak’s prophecy—he was too decorous, too Victorian—cannot be contextualized without exploring the ways and reasons that Gershwin did it better. As for Gershwin’s opera: Even though Porgy is a hero, a moral paragon, it today seems virtually impossible to deflect accusations of derogatory “stereotyping.” The mere fact that he is a physical cripple, ambulating on a goat-cart, frightens producers and directors into minimizing Porgy’s physical debility. But a Porgy who can stand is paradoxically diminished: the trajectory of his triumphant odyssey—of a “cripple made whole”—is truncated.
Gershwin discomfort is mild compared to the consternation Arthur Farwell (1872-1952) invites. He, too, embraced Dvorak’s prophecy. As the leading composer in an “Indianist” movement lasting into the 1930s, Farwell believed it was a democratic obligation of Americans of European descent to try to understand the indigenous Americans they displaced and oppressed—to preserve something of their civilization; to find a path toward reconciliation. His Indianist compositions attempt to mediate between Native American ritual and the Western concert tradition. Like Bela Bartok in Transylvania, like Igor Stravinsky in rural Russia, he endeavored to fashion a concert idiom that would paradoxically project the integrity of unvarnished vernacular dance and song. He aspired to capture specific musical characteristics, but also something ineffable and elemental, “religious and legendary.” He called it—a phrase anachronistic today—“race spirit.”
As a young man, Farwell visited with Indians on Lake Superior. He hunted with Indian guides. He had out-of-body experiences. Later, in the Southwest, he collaborated with the charismatic Charles Lummis, a pioneer ethnographer. For Lummis, Farwell transcribed hundreds of Indian and Hispanic melodies, using either a phonograph or local singers. If he was subject to criticism during his lifetime, it was for being naïve and irrelevant, not disrespectful or false. The music historian Beth Levy—a rare contemporary student of the Indianist movement in music—pithily summarizes that Farwell embodies a state of tension intermingling “a scientific emphasis on anthropological fact” with “a subjective identification bordering on rapture.” Considered purely as music, his best Indianist compositions are memorably original—and so, to my ears, is their ecstasy.
These days, one of the challenges of presenting Farwell in concert is enlisting Native American participants. For a recent festival in Washington, DC—“Native American Inspirations,” surveying 125 years of music—I unsuccessfully attempted to engage Native American scholars and musicians from as far away as Texas, New Mexico, and California. My greatest disappointment was the Smithsonian’s National Museum of the American Indian, which declined to partner. A staff member explained that Farwell lacked “authenticity.” But Farwell’s most ambitious Indianist composition—the Hako String Quartet (1922), a centerpiece of our festival—claims no authenticity. Though its inspiration is a Great Plains ritual celebrating a symbolic union of Father and Son, though it incorporates passages evoking a processional, or an owl, or a lightning storm, it does not chart a programmatic narrative. Rather, it is a 20-minute sonata-form movement that documents the composer’s enthralled subjective response to a gripping Native American ceremony.
A hostile newspaper review of “Native American Inspirations” ignited a torrent of tweets condemning Farwell for cultural appropriation. This crusade, mounted by cultural arbiters who have never heard a note of Farwell’s music, was moral, not aesthetic. It projected a chilling war cry. If Farwell is today off limits, it is partly because of fear—of castigation by a neighbor. I know because I have seen it.
Arthur Farwell is an essential component of the American musical odyssey. So is Harry Burleigh. So are the blackface minstrel shows Burleigh abhorred—they were a seedbed for ragtime and what came after. Even alongside the fullest possible acknowledgement of odious minstrel caricatures, a more nuanced reading of this most popular American entertainment genre is generally unwelcome. It is, for instance, not widely known that antebellum minstrelsy was an instrument of political dissent from below. Blackface minstrelsy was not invariably racist.
Charles Ives’s Second Symphony is one of the supreme American achievements in symphonic music. Its Civil War finale quotes Stephen Foster’s “Old Black Joe” by way of expressing sympathy for the slave. When there are students in the classroom who cannot get past that, the outcome is Bloomian: closed minds.
Bloom wrote in The Closing of the American Mind:
Classical music is now a special taste, like Greek language or pre-Columbian archeology, not a common culture of reciprocal communication and psychological shorthand. Thirty years ago . . . university students usually had some early emotive association with Beethoven, Chopin and Brahms, which was a permanent part of their makeup and to which they were likely to respond throughout their lives. . . . [But] music was not all that important for the generation of students preceding the current one.
Well, no and yes. At Swarthmore, in 1970, classical music was not yet a “special taste.” But my guess is that it must be by now. My two children acquired an “emotive association with Beethoven, Chopin and Brahms” through early exposure and parental enthusiasm, but their peers show no such affinity.
Maggie, now 23, was home-schooled after grade eight because she was training to become a ballerina. Then she changed course and decided to go to college. She had not set foot in an academic classroom for some five years. Touring prospective campuses with her was an informative experience. Whatever else it did or did not impart, ballet taught discipline and concentration.
At a college with an eminent arts program, Maggie met with the head of the Dance Department—and emerged ready to leave. She had been assured that “anyone can dance.” The next day we visited an Ivy League university and were greeted by a phalanx of tour guides who competed with one another, comparing the range and number of their extracurricular activities. Our guide was a member of six clubs. She had recently left the Ballet Club, but was thinking of rejoining. At Swarthmore in 1970, there were no clubs.
Maggie spent a semester in Budapest with a cheerful cohort of 40 American college students, who frequently travelled on weekends. When Maggie announced that they would be flying to Munich for Oktoberfest, I suggested that she attend Verdi’s Otello at Munich’s Bavarian State Opera—Kirill Petrenko was conducting, with Jonas Kaufmann in the title role. None of her friends would want to do that, she protested. And besides, the remaining tickets were too expensive: 210 euros. Hours later, she texted from the opera house that she had been moved to tears.
When Maggie had a ten-day break in October, she agreed to meet me in Greece. I brought along a favorite book: H. D. F. Kitto’s The Greeks (1951), once a ubiquitous guide but unread today because Kitto was no more a relativist than Allan Bloom. But he was a master of passionate, precise approbation. We spent the last day at Delphi, awed by the magnitude of the Greek achievement and setting aside for another day how the Greeks regarded women and slaves.
On the way back to Athens, I asked Maggie what her friends might have made of Otello had they joined her. They wouldn’t have liked it at all, she said. But what could be easier to grasp? A tale of love and jealousy. The warmth and immediacy of the human voice. You just don’t get it, she said. The opera barrier was insuperable.
I invited Maggie to ponder how such experiences as Otello might impact her character, her emotional vocabulary, her prospects for intense human intimacy. Five decades after Swarthmore College fractured, retreated, and regrouped, I had turned into Allan Bloom.