I just finished reading an interesting book: Ephraim Meir, Interreligious Theology: Its Value and Mooring in Modern Jewish Philosophy (De Gruyter, 2015). The author is Professor of Jewish Thought at Bar-Ilan University, Ramat Gan, Israel. He is also a guest professor at the Academy of World Religions, University of Hamburg, a very productive center directed by Professor Wolfram Weisse. Meir’s starting point is that in our increasingly pluralistic world, dialogue between religious traditions, quite apart from its intellectual interest, is a politically urgent task for the maintenance of peace between and within nations. Meir differentiates between plurality, which is a fact, and pluralism, which is an effort to cope with the fact intellectually. As far as I know, the term “pluralism” was coined by the philosopher Horace Kallen (1882-1974) in both these senses—as a description of the fact of American cultural and religious diversity, and as a celebration of it. This is a bit confusing. It seems to call for what Confucians call a “rectification of terms.” In imperial China there was a government department that made merchants use honest weights and officials use clear concepts in their language.
I will only note here that, ambiguous or not, “pluralism” is now generally used to describe a fact. For a while, as the fact became more and more central in my thinking about contemporary religion, I kept using “plurality” to refer to the fact—only to have people ask “you mean pluralism?” So there: By pluralism I mean the co-existence of different worldviews and value systems in the same society. (I have registered this usage with the Department of Rectifications in the Forbidden City of Peking.)
Meir builds on the writings of four prominent modern Jewish thinkers: Franz Rosenzweig (1886-1929), Martin Buber (1878-1965), Abraham Heschel (1907-1972), and Emmanuel Levinas (1906-1995). In different ways, each of these thinkers was concerned with two dialogues—between Judaism and other religions, especially Christianity, and between Judaism and modern secular philosophy. But Meir also refers to the roots of Judaism further back in the past, in the Hebrew Bible and the Talmud. The discipline that Meir entitles “interreligious theology” (also known as “theology of religions”) goes beyond simple dialogue—whose aim is respect, understanding, and possibly joint actions in “repair of the world” (the venerable Jewish idea of tikkun olam). It moves on to a project of doing one’s own theology by incorporating ideas and experiences from other traditions—such as doing Jewish theology by incorporating elements of Muslim mysticism, or doing what Raymond Panikkar did as a Catholic theologian by looking for “the hidden Christ of Hinduism,” and so on. This is an exciting and daring project. (Meir describes it as “interreligious hospitality”.) There are risks to sharing meals and open-ended conversation with “the other.” However unlikely this may be empirically (especially after a certain age), the ultimate risk is conversion to the other’s worldview. This is what anthropologists call “going native.” Very much short of this, out of “otherness” emerges a new “we.”
I cannot unreservedly follow Meir’s “mooring” in a long Jewish tradition of experience and thought. But I’m quite in agreement with his basic approach to an “interreligious theology.” There is a genre of American interreligious humor, in jokes that begin “A Catholic priest, a Protestant minister and a rabbi walk into a bar.” Then somebody objects: “Look, I’m tired of these jokes about a priest, a minister, and a rabbi. Don’t you have another joke?” Answer: “Okay—A priest, a rabbi, and a Buddhist monk walk into a bar….” I will not go on with a detailed commentary on Meir’s argument. What I will do is briefly relate the interreligious theology project to my own theory of pluralism (The Many Altars of Modernity, 2014). (I will once again cite my favorite Zulu proverb: If I don’t beat my drum, who will?)
My own starting point is the insight (which took me many years to reach) that it is not secularity or secularism that provides the main challenge to religion today, but rather pluralism, as I have defined it here. This insight rejects so-called secularization theory, which dominated the study of contemporary religion until recently. Its core proposition was that more modernity inexorably leads to less religion. That proposition has become empirically untenable. It did have a kernel of validity: Modernity has indeed created a secular space, without which modernity itself could not be sustained—a space within which religious discourses do not apply. Secularism is the ideology that celebrates this secularity and wants to expand it to dominate all of society. This secular space was originally occupied by science and technology, but subsequently expanded into other areas—notably law, the state, and the economy. To acknowledge this fact, I now like to speak of two pluralisms—one between the different religious discourses, the other between religion and the secular discourse.
You can have pluralism without religious freedom. The combination of the two has explosive consequences. In this connection a comparison between America and Israel is instructive. Ever since colonial times, America has witnessed this combination of pluralism and religious freedom, so that the church historian Richard Niebuhr (not to be confused with Reinhold Niebuhr, his more famous brother) argued that an innovative form of religious institution was invented here—the “denomination,” a church that, whether it likes it or not, is in competition with other churches. This “denominationalism” appears elsewhere, especially in modern democracies, often without direct American influence, because pluralism has become globalized. In the case of Israel, the focus here is on Jews within the borders of the state—relations between Israel and the Palestinians, and the existential threat to Israel from both Sunni and Shi‘a Islamism, fall outside the problematic of pluralism. But the pluralism between Judaism and secularity within the Jewish population of Israel has a peculiarly American flavor. Different from America is the privileged status of the Orthodox rabbinate, established on the eve of independence through a pragmatic compromise between Ben-Gurion and the then much smaller Orthodox community. In addition, the complexity of the Israeli electoral system has given considerable power to the religious political parties. However, direct American influence has been significant. The originally Protestant DNA of denominationalism has found fertile ground in American Judaism. It has re-appeared in Israel, though so far the Orthodox rabbinate has prevented its full flowering. Witness the American Jewish feminists, dressed in the religious vestments of their grandfathers, advancing toward the Western Wall carrying Torah scrolls.
Inevitably, I think, pluralism shakes the taken-for-granted status of parental traditions. Destiny becomes choice. This creates fragility—the memory survives that, after all, a different choice might have been made, and perhaps still may be. Sometimes the sheer presence of “an other” starts a process of relativization: Here is somebody—evidently not stupid or crazy, perhaps even simpatico—who does not share my previous certainties. I recall an incident many years ago—soon after the historic moment when the Surgeon General of the United States, dressed in what looked like an admiral’s uniform in the Ruritanian navy, launched the war on tobacco, which has changed the lives of everybody from Vienna to Vladivostok (maybe not Vladivostok—cigarettes still go with vodka…). I was then teaching in Connecticut, at the Hartford Theological Seminary. My old friend and colleague Thomas Luckmann was visiting. Also visiting were another faculty member, his wife, and their five-year-old daughter. The couple were liberal Protestants, who had shed their old orthodoxy and were looking for substitutes. Luckmann, as was his habit, took out his pipe and started to smoke. The little girl, who had happily chatted with him, went silent, stared at him with wide-open eyes and a slightly open mouth, and said: “You are smoking! Don’t you know that this is bad for you?” He went on smoking and said: “No. I don’t know this.” After a slight pause, the little girl got up and ran out, calling out to her parents, “Look, this man is smoking!”
When religion can no longer be taken for granted, it is still possible to have faith, but that faith will also be accompanied by a penumbra of doubt. One may decide to have faith, but every decision is in principle reversible. Religious certainty becomes a scarce commodity. Religion touches on the deepest hopes and fears of the human condition. There is a yearning for such certainty. Fundamentalism can be described as a project to renew a past certainty or to embrace a new one. Such a project may be religious or secular, but its psychology, I think, can best be explained by the sociology of religion. The basic aim must be to suppress or contain doubt. If the fundamentalist project is to be imposed on an entire society, it requires a totalitarian state that will effectively prevent any cognitive contamination from the outside. Modern information technology makes this very difficult, as do the communications requirements of a modern economy. Slightly less difficult is the tactic of giving up on the larger society and confining the fundamentalist project to a sectarian or subcultural community. It helps if the community can be located in a physically isolated space. Brigham Young understood this very well as he led the Mormons across an entire continent until they got to Utah, where he could say, “This is the place!” If the location is urban, the physical isolation must be replaced by social and psychological isolation. An instructive comparison is between the ultra-Orthodox Jewish communities in Brooklyn and in Jerusalem. Both have constructed quasi-Disneyland replications of a traditional East European shtetl—rigorously segregated not only from the wider pluralistic society but from the wider Jewish community—separate schools, separate media, relations with the outside world limited as far as possible to economic ones. In both cases the residents know that they could leave—the police could not stop them—they would just have to leave behind every vestige of Hasidic garb, put on a baseball cap, and (as the case may be) take the subway from Brooklyn to Manhattan, or a bus from Jerusalem to Tel Aviv. I find it particularly instructive that in both cases there is even a linguistic barrier against the outside—Yiddish used as a vernacular, fending off the contamination that could come in through English or modern Israeli Hebrew! It is obviously difficult for an individual raised in this environment to “jump over the wall.” Difficult, but not impossible: See a fascinating account of just this transition by Lynn Davidman (a sociologist at the University of Kansas), Becoming Un-Orthodox (2015).
Existence in a world where nothing is taken for granted, living with doubt and uncertainty, having to make choices—one recalls here the famous statement by Jean-Paul Sartre, the father of post-World War II French existentialism, that man is “condemned to be free.” As a universally applicable description of the human predicament, this is not a plausible proposition. It is quite plausible, however, as a description of the predicament of human beings who have been thrown into a world shaped by modern pluralism. It is important to keep in mind that there are still large numbers of people who are indeed “condemned” to be what they were born as. “Existentialists” are not entitled to belittle the humanity of those whose world is still firmly anchored in tradition, kinship, tribe, or village. Also, I’m fully aware of the many people who, even today, claim to have had experiences of ultimate or supernatural reality that have left them with enduring certainties. (Not for nothing have I spent years studying Pentecostalism, which is the most rapidly growing religious movement of our time!) As one not so blessed (or “condemned”), I’m not entitled to diagnose such people as suffering from illusion or false consciousness. I can only say, “Sorry, but I have not had your experience—and, to be frank, I’m not sorry.”
These considerations give a different perspective on faith in our time: The opposite of faith is not unbelief, but knowledge. I don’t need faith to affirm what I know—for example, that the skyline I see from the window of my study is that of Boston, not of New York. But when I believe that the concierge downstairs is not planning to kill me, I don’t really know this; I have faith in this belief. It is not irrational—he has been around for several years, and he has always been friendly and helpful. I would say that any religious affirmation I might be able to make (hesitantly) is closer to my faith in the benevolent concierge than to my knowledge of my geographical location. If we spiritual cousins of Sartre are honest, we should describe ourselves as agnostics/“not-knowers.” There are more and more of us in advanced capitalist democracies. Sociologists like to call us “nones”—people who say “none” when asked about religious affiliation. Thanks to the Pew Research Center, the Washington outfit that does religion surveys all over the world, we now have a lot of data about the “nones.” They are certainly not a “league of the godless”—many say that they believe in God and that they regularly pray. But they have not found a church or temple with which they are comfortable. It is an interesting and important demographic. But I find another group more interesting—I call them the “buts.” These are people who are affiliated, but with reservations—like “I am Catholic, but…”—followed by a list of Catholic teachings and practices that they don’t agree with (often including the authority of the church in matters of personal morals).
There is much more to be discussed about what Meir calls a “dialogic theology.” I want to touch on one question on which I’m mildly (very mildly) critical of Meir’s project: In an honest dialogue does one sometimes have to say “no”? John Hick (1922-2012), the British Protestant theologian who wrote influential books about interreligious dialogue, created a very telling metaphor: We need a “Copernican revolution” in theology—instead of looking at the earth/our own faith as the center round which everything revolves, we should see our faith as one of several planets revolving around the sun of ultimate reality. Each planet provides an instructive perspective on that reality. It is a very attractive picture, but it leaves out one possibility (which, I suspect, Meir also leaves out)—that some planets may not face the sun at all, but may be turned away from it. If all perspectives are equally true, there is no truth at all. I think that such sharp alternatives appear in what I call the dialogue between Benares and Jerusalem, between the perceptions of reality emerging from the religious experience of the Indian subcontinent, and the perceptions of the monotheistic faiths which originated in the Middle East. I want to emphasize that this dialogue too could occur in the amicable stance of “listening.”
But there could be a rather less amicable reason for saying “no” to a dialogue—a moral reason. This could be either because one wants to have nothing to do with the putative interlocutor: I don’t think I would want to enter into dialogue with whatever degenerate imams legitimate the hell on earth being instituted by ISIS in the areas it controls in Iraq and Syria. Or suppose there still survived the cult of human sacrifice which existed in Mesoamerica in pre-Columbian times. Imagine, say, that a delegation of Aztec theologians were welcomed to an interreligious conference at the World Council of Churches in Geneva: “Thank you very much for coming to this conference. We are greatly looking forward to hearing your paper explaining why the gods have to be fed by the blood of sacrificial victims….”
Or one may say “no” to dialogue because the divine being affirmed by the putative interlocutor is morally loathsome. I feel confident that the Calvinist doctrine of double predestination merits this designation—God has decided before the beginning of time who will be saved and who damned, and it is not up to us to question his sovereign will. Jonathan Edwards (1703-1758), who began his career as a Puritan minister in Northampton, Massachusetts, and ended up as president of the college that became Princeton University, was fiercely committed to Calvinist orthodoxy (rather in tension with his role in the First Great Awakening, which sought to bring as many as possible to faith in Jesus Christ). He was the author of the famous sermon “Sinners in the Hands of an Angry God,” which describes the saved in heaven (there through no merit of their own) looking down on those tormented in hell (there through no fault of their own), and praising God for his justice. I would award Edwards’ sermon the prize for the most repulsive document in the history of Christian thought. The only candidate who might capture that prize from Edwards would be Gregory of Rimini (1300-1358), who wrote that unbaptized children, innocent of sin, would not only be permanently confined to limbo, where they suffer from being deprived of the presence of God (as Thomas Aquinas taught), but would suffer positive pain. There has been a curious revival of Calvinism in the Southern Baptist Convention. Roger Olson (an Evangelical theologian at Baylor University) authored Against Calvinism (2011), in which he described the Calvinist divinity as “loathsome,” a God whom he could not worship.
Personal disclosure: I call myself a “Lutheran, but” (actually with several buts). I do believe in a positively painful version of hell—in which Edwards and Rimini are room-mates.