Contemporary culture (and by no means only in America) appears to be in the grip of two seemingly contradictory forces. One pushes the culture toward relativism, the view that there are no absolutes whatever, that moral or philosophical truth is inaccessible if not illusory. The other pushes toward a militant and uncompromising affirmation of this or that (alleged) absolute truth. There are idiomatic formulas for both relativism and what is commonly called fundamentalism: “Let us agree to disagree” as against “You just don’t get it.”
Beware of concluding too quickly that both can be legitimate components of civil discourse: Imagine the first as the response to an interlocutor who favors pedophile rape, the second uttered by someone who favors the mass murder of infidels. In truth, both formulas make civil discourse impossible, because both (albeit for opposite reasons) preclude a common and reasoned quest for moral or philosophical agreement. Relativism is bad for civility because it precludes the moral condemnation of virtually anything at all. Fundamentalism is bad for civility because it produces irresolvable conflict with those who do not share its beliefs. And both are bad for any hope of arriving at valid normative conclusions by means of rational discourse: with relativism there is no will to such a discourse, and with fundamentalism there is no way to it.
For reasons that may not be immediately obvious, relativism and fundamentalism as cultural forces are closely interlinked. This is not only because one can morph and, more often than may be appreciated, does morph into the other: In every relativist there is a fundamentalist about to be born, and in every fundamentalist there is a relativist waiting to be liberated. More basically, it is because both relativism and fundamentalism are products of the same process of modernization; indeed, both are intrinsically modern ways of going to extremes. What follows is an attempt, by means of a sociological analysis, to show how the two phenomena are related.
This is not to suggest that, having gained a better understanding of the situation, one will know just what to do about it. History is not made by committees of sociologists or philosophers. Truth may set many free, but reason has a mottled track record. Many relativists cannot be dissuaded from their position until they confront what they are quite certain is a moral outrage—the point at which they are unlikely to say, “Well, there is the victim’s narrative, and there is the rapist’s narrative, and both are equally valid.” At precisely this point they may become converts to this or that version of fundamentalism. Often the only way to interact with a fundamentalist is by the instruments of violence—at which point the fundamentalist, looking at the shambles of an intended utopia, may lapse into relativism. However, it is probably safe to say that most human beings, busy living their lives, gravitate toward a more reasonable middle ground, though usually without being able to justify or even articulate why they believe and act as they do. They constitute the potential constituency for a much-needed declaration of civil moderation.
The Relativizing Process
In the 1950s many social scientists viewed modernization as a uniform and irreversible process. They believed that every society had to pass through a series of predictable stages that, while amenable to some local modifications, would essentially resemble the development of the Western world. It is in this theoretical context that the über-sociologist of that era, Talcott Parsons, called America the “vanguard society.”
This theory of modernization provided many valuable insights, and was much too cavalierly dismissed by the neo-Marxist and “postmodernist” theories that have more recently replaced it. Still, it is fair to say that the theory is much less plausible today. The “local modifications” have been far too many and too basic to be interpreted as minor variations on a dominant theme. It turns out that modernity is not a seamless robe. Social scientists today are more likely to agree with Shmuel Eisenstadt, who suggested the advent of “multiple modernities” (or, if one prefers, “alternate modernities”). Thus, to take one of the most important cases, Japan is a thoroughly modern society by any indicator, but it is certainly “alternate” compared with the West.
An important component of the earlier view of modernization was so-called secularization theory. Simply put, it proposed that the more modern a society became, the less religious it would be. This view, of course, was congenial to an Enlightenment philosophy of progress within which the decline of religion was welcomed as a liberation from superstition and clerical tyranny. However, many of those who upheld secularization theory did not welcome it at all—indeed, many were Christian theologians. They just thought that the evidence, unfortunately, pointed in this direction. Again, it is fair to say that since the 1970s this theory has been massively falsified: Far from being increasingly secularized, the contemporary world is the scene of enormous explosions of religious passion.1 Modernity is not only quite variegated but, in most places, comfortably compatible with religion of one kind or another.
At least in the social sciences, it is prudent not to throw out the baby with the bathwater as one theoretical paradigm follows another. Secularization theory had one thing right: Modernization undermines taken-for-granted beliefs and values. But the theory was mistaken in assuming that this process of relativization would necessarily lead to a decline of religion or, for that matter, of other historical claims to truth. In retrospect—and here, frankly, I am being confessional about my own career as a sociologist—the mistake was grounded in a simple confusion between secularization and pluralization. Modernity does not necessarily secularize; it does, however, almost necessarily pluralize.
What does this mean? Through most of human history, most people have lived in communities with a very high degree of consensus on basic cognitive and normative assumptions. Such consensus depended on strong barriers of separation, geographical or social, between the members of the community and outsiders. Given such barriers, worldview and morality tended to take on a self-evident quality. While this was the usual situation in pre-modern times, one must not exaggerate the observation. The cognitive and normative consensus was challenged in various places and at various times by such collective experiences as intercultural migration, foreign invasion or natural catastrophe. And one may suppose that there have always been individuals, such as Socrates or Spinoza or Einstein, who questioned prevailing orthodoxies and managed to be heard. But such individuals were rare, and very often they were prevented from having their say. Thus, while pluralism is not a uniquely modern phenomenon, modernity has enormously increased its scope and accelerated its impact. Today it is a global phenomenon. Even the most zealous promoters of ethno-tourism have a hard time finding pristine villages with cultures untouched by the turbulent pluralism of the contemporary world. (And if they do find any, they and their clients will almost instantaneously destroy the pristine quality! Twixt tribal village and tribal theme park is a short step, indeed.)
“Pluralism” is a less than fortunate term. The “ism” suggests an ideological position, as was intended by the American philosopher Horace Kallen, who coined it in the 1920s to celebrate ethnic and religious diversity. I use the term here as it is now commonly used, namely to describe not an ideology but an empirical fact. “Pluralization” is more factual-sounding, but it is also more awkward. It usually makes little sense to fight common usage, so let “pluralism” stand. However, here is a more precise definition: Pluralism is a situation in which different ethnic or religious groups co-exist under conditions of civic peace and interact with each other socially. The latter phrase is important. There are situations in which groups live side by side peacefully but have nothing to do with one another—the traditional Indian caste system being a good example. Such barriers to interaction prevent “cognitive contamination” (a phrase I invented in an earlier fit of terminological enthusiasm), which happens when the beliefs and values of others undermine the taken-for-granted status of one’s own.
There is no great mystery as to why modernity generates plurality. Modernity has led to massive urbanization, with highly diverse groups thrown into intense contact with each other. Unprecedented rates of international migration and travel have had similar consequences. Mass literacy has brought knowledge of other cultures and ways of life to numerous people. And of course, such knowledge has been greatly magnified by newer information technologies: telephone, radio, movies, television and now, exponentially, the computer revolution. Everyone now talks about globalization, and the phenomenon is real enough. But it only represents a vast amplification of the modernizing process that began with the great voyages of discovery and the printing press. The information technology of the globalization era has brought the dynamics of pluralizing modernity to all but the most remote corners of the world.
Pluralism relativizes. It does so both institutionally and in the consciousness of individuals. This relativization is obviously enhanced when the state does not try to impose uniformity of beliefs and values by means of coercion. However, as the fate of modern totalitarian regimes illustrates, even when the state makes this attempt, it is very difficult to block out every form of cognitive contamination. There is now a veritable market of worldviews and moralities. There are limits, of course: Every functioning society requires a certain degree of normative consensus, lest it fall apart. No society can tolerate a pluralism of norms concerning intracommunity violence—say, “I believe in my right to shoot anyone who takes my parking space.” But within these limits a wide diversity is possible. The American idiom contains the revealing phrase “religious preference”—a market term if ever there was one. But there are also moral, lifestyle, ethnic and even sexual preferences (and an accompanying cottage industry of counselors and therapists assisting consumers in selecting the preferences that are presumably right for them).
The institutional consequences of pluralism are most clearly evident in the case of religion. Whether they like it or not, and whether or not this accords with their theological self-understanding, all churches become voluntary associations in post-traditional societies. Their lay members become consumers of the services provided by the clergy and, in the process, become more assertive. American Catholic writers have described this process as “Protestantization.” The term is misleading if it suggests a drift toward Protestant doctrine, but it accurately describes how the social organization of Catholicism has come to resemble the voluntary character of Protestant denominations in America.
But the same move from taken-for-granted allegiance to freely chosen participation creates voluntary “denominations” in areas other than religion. People voluntarily adhere to this or that moral belief system (that is what the American culture war is about), this or that lifestyle (the cult of “wellness” has all the markings of a church), ethnic self-identification (Michael Novak shrewdly proposed years ago that ethnicity has become a matter of choice in America), and even sexual identity (thus many feminists have embraced the notion that gender—tellingly a term derived from the arbitrary realm of grammar—is a “social construction”). In this sense (and in this sense only)—to paraphrase Richard Nixon on Keynes—we are all Protestants now!
But pluralism also has profound consequences for individual life. As ever-wider areas of life lose their taken-for-granted norms, the individual must reflect upon and make choices among the alternatives that have become available. Indeed, modernization can be described as a gigantic shift in the human condition from one of fate to one of choice. This shift has been elegantly described by Arnold Gehlen in his two key categories of “de-institutionalization” and “subjectivization.” De-institutionalization refers to the process wherein traditional institutional programs for individual behavior are fragmented—where previously there was one taken-for-granted program for, say, raising children, there now are competing schools of childhood education. Subjectivization refers to the process wherein institutions lose their alleged objective status so that the individual is thrown back upon himself in constructing his own “patchwork” of meanings and norms.
The net effect of this transformation can be summed up as follows: Certainty becomes much harder to achieve. This means that even if the same traditional beliefs and values continue to be affirmed, the manner of affirmation has changed. Put simply, the what of belief may not change, but the how does. For many people, at least at an early stage of the process, this change is experienced as a great liberation—as indeed it is. But after a while it may come to be experienced as a burden from which one wants to be freed. There ensues an often desperate quest for certainty, and where there is a demand, someone will proffer a supply. This is where the fundamentalists come in.
The Fundamentalist Response
Like “pluralism,” the term “fundamentalism” is not a fortunate one, though for different reasons. First, it has acquired a pejorative quality, and that is never a good thing if one wants to understand an empirical phenomenon. (After one has understood it, of course, one can be as pejorative as one wishes.) Second, it comes from an episode in the history of early 20th-century American Protestantism, where it had very specific meanings, meanings that are misleading when applied to movements unrelated to that history. One may as well go with common usage, but again with a more precise definition: Fundamentalism is the attempt to restore or create anew a taken-for-granted body of beliefs and values. In other words, fundamentalism is always reactive, and what it reacts against is precisely the aforementioned relativization process.
It follows that, however traditional its rhetoric may be, fundamentalism is intrinsically a modern phenomenon; it is not tradition. At most it can be called neotraditional, but that prefix denotes an abyss of difference. The difference is precisely between what is taken for granted and what is deliberately chosen.
What is taken for granted is by definition never objectified into a genuine question. But every choice can in principle be revoked, and that is what makes every fundamentalist project inevitably fragile—and, for that very reason, inclined toward intolerance. In a truly traditional community, those who do not share the prevailing worldview are not necessarily a threat—they are an interesting oddity, perhaps even amusing. In the fundamentalist worldview the unbeliever is a threat; he or she must be converted (the most satisfying option), shunned or eliminated, be it by expulsion or physical liquidation. This is not to say that there was no fanaticism or intolerance in pre-modern times. These are most likely to be found, however, in the early stages of a movement—before it has settled down into a taken-for-granted community. When the latter development has occurred, a greater measure of tolerance becomes psychologically feasible.
There is a wonderful 19th-century story that nicely illustrates this difference between tradition and fundamentalism, albeit in a context that has nothing to do with religion. The Empress Eugenie, the wife of Napoleon III, was on a state visit to London. Now it so happens that Eugenie’s background was rather unsavory and, though empress, she was very much an upstart. Not so, of course, Queen Victoria, her hostess. The two attended the opera at Covent Garden together. Eugenie entered the royal box, magnificently regal, graciously acknowledged the applause, looked behind her and slowly sat down. Then Victoria entered, just as regal. She too graciously acknowledged the applause and sat down slowly. But she did not look behind her. She knew that the chair would be there.
The definition of fundamentalism suggested here, in addition to freeing the concept from pejorative associations and from its particular American context, has the advantage of making clear that it can refer to secular as well as religious movements. All sorts of secular worldviews and value systems can give rise to fundamentalist movements—radical nationalism, political ideologies, “scientism,” even “secularism” as the belief that religion should be rigorously excluded from public life. In several countries today, secular and religious fundamentalists are pitted against each other politically—for example in Turkey and France, and indeed in the United States. Though their political agendas are diametrically opposed, their social and psychological profiles are remarkably similar. Both have a black-or-white perception of society, both tend to demonize those who oppose them, and both would deliver the same underlying message to potential converts: “Come join us, and we will give you certainty as to what to believe, how to live, and who you are.” There is a very large market out there for this message.
Fundamentalists of whatever stripe must suppress doubt (in psychologists’ parlance, they must avoid cognitive dissonance). I will allow myself a personal anecdote here. Shortly after I came to America as a very young man I had a few dates with an attractive and intelligent young woman. I soon discovered that she was an ardent member of the American Communist Party, which somewhat dampened our relationship. Of course we argued about this. She was unwilling to accept any negative information about the Soviet Union. When I spoke about atrocities in Soviet-occupied Eastern Europe, she asked me whether I had personally witnessed these atrocities. When I said no, she said, “Well, I really would like to meet someone who has.” I quickly said, “This could be arranged.” Well, arrange it I did, and it was a revealing event.
I was friendly with a young couple recently arrived from Latvia. They invited me and my communist not-quite-girlfriend to supper. After some awkward chitchat, I asked them to talk about the Soviet occupation of Latvia. They told one horror story after another. My date sat quietly at first, then became increasingly agitated. After almost an hour she put her hands to her ears and said, “I don’t want to hear any more of this.” As we walked away from my friends’ apartment, I asked her if she thought that they were lying. No, she replied, these people did not impress her as liars. But then she added, “You know, I think that there is something, if we could only find it, that would completely change what they were saying.” Evidently she had found a magical pill against cognitive dissonance. The Party was well-equipped to provide such medicine. She never agreed to see me again.
To restate the argument: The fundamentalist project is the restoration, or the creation de novo, of a taken-for-granted definition of reality in the wake of relativization. This project can be realized in two ways, one more ambitious than the other.
The more ambitious version is to make an entire society the basis (in sociological terminology, the plausibility structure) of a newly taken-for-granted cognitive and normative order. This is what modern totalitarianism sought to achieve. It requires an enormous exercise of violence, not only in establishing the new order, but in maintaining it against the ever-present threat of cognitive contamination. Very importantly, the management of the project (the totalitarian regime) must control all communications with the outside world and all dissident sources of communication within the society.
The history of totalitarianism in the 20th century demonstrates how difficult such a fundamentalist project is. If realizable at all, it carries with it not only huge costs in human degradation and oppression, but also the cost of economic stagnation and decline. North Korea is the best current example of this. When a regime is unwilling or unable to bear these costs—and especially when it wishes to modernize its economy and is therefore obliged to have extensive communications with the outside world—cognitive contamination undermines the fundamentalist order. The internal disintegration of the Soviet Union is the best example of this; the recent history of China could be described (so far) as a better-managed version of a similar process.
The religious history of Europe is full of attempts to restore a challenged taken-for-granted order. Sooner or later, they all failed. The last significant Christian version of the totalitarian project was the Nationalist movement in the Spanish Civil War and the Franco regime that resulted from its victory. Blessed by the Catholic Church at the time (such a blessing would be unthinkable today), the movement intended the reconquest (reconquista) of Spain from what it perceived as the forces of atheism and immorality. Despite a savage apparatus of repression, the Franco victory turned out to be ephemeral. As soon as Spain opened itself to contacts with the outside world, even though this opening was initially meant to be limited to economic relations, the ideological unity sought by the regime disintegrated rapidly. No similar project has since been attempted in a Christian vein. (There have been some noises of this sort in Russian Orthodoxy, but they are unlikely to go very far unless conditions in Russia greatly deteriorate.)
Religious totalitarianism, of course, characterizes radical Islam. With the exception of the short-lived Taliban regime in Afghanistan, it has had no successes. The Iranian ayatollahs may entertain totalitarian ideals, but they also want Iran to be economically successful, and the two goals are sharply contradictory. Under modern conditions, any project of full-scope territorial reconquista is either unrealizable or forbiddingly costly.
There is a less ambitious and somewhat more realizable version of the fundamentalist project. That is to realize it not in an entire society, but in an enclave within that society. This could be called the sectarian or subcultural version of fundamentalism. Within the enclave, a taken-for-granted worldview is established; the rest of society is, as it were, abandoned to its path to perdition. The recipe for the maintenance of a fundamentalist subculture is simple enough: Control all communications between your members and the outside world, and especially control all social relations with outsiders. The early Christian movement was just such a subculture or sect, and the Apostle Paul was practicing good social psychology when he warned Christians not to be “yoked together with unbelievers.” In anthropological parlance: No commensality and no connubium with outsiders—don’t have them for dinner, and certainly don’t go to bed with them! This kind of control is easiest to achieve if the subcultural community is physically segregated from the larger society—often in remote rural villages or, less effectively, in compact urban neighborhoods. If physical segregation is not possible, controls over interaction and information have to be particularly stringent.
Now, the history of religion is full of such self-isolationist projects, and some of them have been successful over considerable periods of time. But this kind of subcultural fundamentalism becomes ever more difficult under modern conditions, because the walls of separation from the outside world have to be kept very strong and in good repair. Allow one little breach, and the turbulent forces of relativizing pluralism will come surging in. What has to be maintained, if you will, is a sort of mini-totalitarianism—not easy to achieve in a modern society. In sum, fundamentalists nowadays have an inherently hard time achieving their objectives. This is the good news. The bad news is that in the meantime they can cause an enormous amount of damage.
A New Normative Agenda
If one agrees with an agenda of articulating a middle ground between relativism and fundamentalism, several discrete issues need to be addressed. There is cognitive relativism, most eloquently expressed by so-called postmodernist theories, which denies the very possibility of objective criteria of truth or even validity. In the last resort, this is a philosophical problem that cannot be discussed at length here. What can be said, however, is that in the human sciences, not least in sociology, this type of relativism has done immense harm. It makes science itself an impossible project, since it leaves no reliable way of distinguishing between acceptable and unacceptable propositions about the empirical world. In practice, what still goes under the name of science becomes an unfalsifiable exercise in propaganda or perhaps, if one is in a generous mood, poetry.
It seems to me that recent debates about the possibility of scientific objectivity repeat in a far less sophisticated way what was discussed in great detail in the social sciences about a hundred years ago, and the methodological writings of Max Weber still provide the best guide through this theoretical labyrinth. Without going into this vast body of materials, let me merely repeat what I often say to students who come braying to me with postmodernist ideas: Suppose that I return a term paper to you with a failing grade and a note that reads, “I should give you a much better grade, but I hate your guts, so there.” You will scream bloody murder, go to the dean, possibly sue me. What is the assumption behind your outrage? Obviously, you expect me to grade your paper fairly—that is, objectively—regardless of my feelings about you. But why do you demand this of me as a teacher while denying my ability to do it as a researcher?
Then there is the issue of moral relativism, again a philosophical problem that cannot be developed here. It can be formulated with great erudition, but in the end it comes down to the philosopher saying to the cannibal: “You believe it is right to cook people and eat them. I don’t. Let us agree to disagree.” It seems to me that there is an important difference between moral relativism and cognitive relativism. Science can never give us certainty; it provides only probabilities, and it must always be open to the possibility that its hypotheses may be falsified. But there are moral judgments which, even if one understands that they are contingent on one’s position in time and space, attain a high degree of certainty. Slavery and torture provide good examples. I am not prepared to say that my moral condemnation of torture is a matter of taste or that it is a mere hypothesis. I am certain that torture is a totally unacceptable moral evil. And any argument to the effect that I would hold a different view if, say, I lived as a magistrate in Tudor England will not move me from this conviction. Moral judgments come out of specific perceptions of the human condition formed in the course of specific historical developments, but this genesis does not explain away their validity. Einstein would not have come upon the theory of relativity if he had lived as a peasant in ancient Egypt, but this obvious observation does not invalidate the theory. Einstein’s scientific insights are not the same as his moral beliefs, but neither can be validated or invalidated by pointing out their social and historical context.
There are moral certainties that withstand relativization. Consider the episode concerning General Napier, who conquered the region of Sind for the British in India. Upon establishing control over this area, he did what the British usually did in their empire—he left local customs pretty much as they were, except for a very few he deemed totally unacceptable. Among these was suttee—the burning alive of widows. A delegation of Brahmin priests came to see him and said, “You cannot ban suttee. It is an ancient tradition of our people.” Napier replied, “We British also have our ancient traditions. When men burn a woman alive, we hang them. Let us each follow our traditions.” It seems that Napier was not plagued by moral relativism.
My own interest has been mainly in the religious aspect of the relativist-fundamentalist dichotomy. My presupposition, again, is that both extremes are unacceptable: the relativist view that, in the end, all religions are equally true (quite apart from theology, a philosophically untenable view); and the aggressive and intolerant fundamentalist claim to absolute truth (which even a modest acquaintance with historical scholarship about religion makes very hard to maintain). It is possible and desirable to stake out middle positions that use the resources available from within the major religious traditions. The traditions coming out of southern and eastern Asia—notably Hinduism, Buddhism and Confucianism—have never had much difficulty doing this theoretically (though this did not stop them from being savagely intolerant in practice from time to time). The Abrahamic traditions emerging out of western Asia—Judaism, Christianity and Islam—have had greater difficulties. Monotheism does not easily develop an ethos of tolerance, especially when it is institutionalized and literally armed within the context of a state. Yet resources for such an ethos can be found in each tradition. What is more, modern ideas of human rights, including religious liberty, are historically rooted in the anthropological ideas of these traditions and, in recent times, have been explicitly legitimated by them (as, for example, in the unfolding of Catholic social doctrine since the Second Vatican Council).
Why is such a religiously founded middle ground important? First, of course, for the obvious reason that so much contemporary fundamentalism has religious content (and not only among Muslims): One cannot oppose it without confronting its religious claims. The middle ground is thus politically important as a defense against the highly destructive potential of religious fanaticism. But this middle ground is also important for intellectual and spiritual reasons. It can be the location of those who want to be religious believers without emigrating from modernity.
Protestantism, as Max Weber and Ernst Troeltsch showed in the early 20th century, has had a special relationship with modernity. Its long struggle with the spirit of modernity cannot be replicated in other traditions, but it nevertheless holds lessons for them. One is the readiness to have faith without laying claim to certainty—from the sola fide of the Lutheran Reformation to Paul Tillich’s “Protestant principle.” Another lesson is the importance of coming to terms with modern historical scholarship. Protestantism was the first religious tradition to turn the critical instrument of this scholarship on its own scriptures—a historically unprecedented event, most of it carried out in 19th-century theological institutions by individuals who did not want to undermine faith but, on the contrary, wanted to strengthen it by showing its historical development.
There is a challenging agenda here, one of great interest both to religious believers and to others concerned with preserving a society in which diverse people can live together in civic peace. It is an agenda we must advance.
1. There are two exceptions to this statement: one geographical—western and central Europe; the other sociological—a thin but influential international intelligentsia that is indeed heavily secularized, even in strongly religious societies such as the United States. Analyzing these exceptions, though they are key to any sociology of contemporary religion, is beyond the scope of this essay.