Thursday, December 24, 2015

Church and State

I’ve never fully understood how exactly it can be constitutional for Christmas to be a federal holiday in a nation that endlessly prides itself on how carefully it guards the boundary between church and state. I come to the issue, therefore, from precisely the opposite direction from all those outraged types who write letters to the editor at this time of year to express their indignation at having received a “Happy Holidays” card from their newspaper deliverer or local school board instead of a bona fide Christmas card, or their irritation over their end-of-the-year office party being thrown to celebrate not Christmas but something fully non-specific and only vaguely festive like “the holidays” or, even more bizarrely, winter itself. I didn’t get a “holiday card” this year from the Obamas—perhaps (but even I myself don’t really think this) a subtle response to all those e-mails about the Iran deal—but I did get one from the Bidens and the Andrew Cuomos…and, true to P.C. form, neither mentions any actual holiday. (The Cuomos’ card wishes us well during the “holiday season.” The Bidens’ wishes us “many blessings in 2016.”)  It’s an unsubtle ruse. I know what they mean. They know that I know what they mean. And I know that they know it too. (The holly wreaths with red ribbons adorning the windows of what I suppose must be the Biden home—their “real” home, I think, not Number One Observatory Circle—on the cover of their card are the giveaway.) Still, I’ve calmed down over the years. I no longer find it annoying to be wished a merry Christmas by salespeople trying to be friendly and pleasant, or not too annoying. I cleverly but probably over-subtly register my pique with the whole thing by avoiding malls and post offices, even banks, in December as best I can. I suppose I can live with the White House having a Christmas tree. But I still don’t fully understand how it can be legal for the government formally and purposefully to foster the public celebration of a religion-specific festival in a nation of self-proclaimed disestablishmentarians.

Nor is the point that I simply disagree. It's also that I've never been able quite to understand why Christians who take their faith seriously would even want people outside the church to glom onto their best holiday, one possessed of the kind of deep spiritual significance that can only be diluted by bringing into the mix people for whom the holiday has no religious meaning at all. Isn't it just a bit insulting to people who take their Christian faith seriously to suggest that even non-belief in the most basic articles of that faith does not constitute sufficient reason not to celebrate its festivals? I can't see how it could not be! And so, when I see those bumper stickers encouraging Christians to put the Christ back into Christmas, I'm in complete agreement because I too would like nothing more than for Christmas to turn back into a Christian holiday possessed of deep meaning for the faithful, something that it would be absurd, even mildly offensive, for non-Christians to embrace at all, let alone enthusiastically. Is it really all about selling toys? I suppose that it probably is!

Nor do I feel this way only about other people's religions: I am an equal-opportunity Grinch. When I hear that the White House is having yet another Pesach seder and that the President and First Lady are both planning to attend, I feel a sense of dismay tinged with guilt: the latter because I realize I'm supposed to be thrilled that the leader of the free world is willing to make such a public display of the warmth he feels towards his Jewish co-citizens, but the former because I don't really want non-Jews to co-opt Jewish rituals to make some sort of dramatic statement about their own liberality without actually embracing any of the ideas or concepts that undergird the rituals in question. When I read a few weeks ago about the President hosting a festive menorah-lighting ceremony at the White House, I felt the same mix of pride and unease. I get it—I'm supposed to be thrilled that Jewish Americans are welcome to perform Jewish rituals in the White House. But shouldn't the most public of our nation's buildings specifically not be the backdrop for religion-specific rituals that all Americans neither can nor should embrace? Nor do I fix my gaze in this regard only on the government: I find the endless efforts of Chabad to set up those giant, weirdly angular menorahs in the public square equally unsettling. Surely, they're acting out of conviction. But I can't help thinking that every step we take towards weakening the separation of church and state—an expression, by the way, that most seem to suppose comes from the Constitution, but which was actually coined by Thomas Jefferson years later—is a step towards weakening our right to pursue our spiritual path without interference from outside parties, most definitely including the federal government. Or any government.

This year, though, my feelings about the separation of church and state are different than in the past because it seems impossible to consider these issues any longer without bringing Muslim America into the mix. Our 2.7 million Muslim co-citizens are clearly having a rough time. Article after article in the newspapers I read and at the on-line news sites I frequent details almost daily how complicated a time this is for Muslims who must grapple with the fact that there are lots of people out there who are selling a version of Islam radically (to use precisely the right word) different from their own. And it seems slowly to be dawning on American Muslims that, particularly after San Bernardino, it will no longer be enough merely to insist that the jihadist version of their faith is just a perversion of Islam and thus not something “regular” Muslims need to think or worry about. (That, of course, is precisely what the Islamicist radicals behind all these terrorist strikes say about non-radical Islam! For the most recent of these articles, this one by Laurie Goodstein and published in the New York Times earlier this week, click here.) But precisely when it feels like the right thing to do would be to encourage American Muslims to break formally and absolutely with the extremists in their midst by getting the President to welcome American Muslims to the White House for another Eid al-Fitr banquet like the one he hosted last June (in other words, by creating the sense that American Muslims can be part of our national fabric in the same way that Christians and Jews can be and are), that is precisely when I think we should redouble our efforts to re-erect the once unscalable wall between church and state that has slowly been eroded over the last decades.

American Muslims have a huge problem on their hands. They themselves are not such a unified group. They are slowly awakening to the fact that there are among them jihadists like the San Bernardino killers…and that the responsibility for tolerating the kind of extremism that leads to violence cannot solely be set on the shoulders of overseas clerics. My sense is that we would do well to make it clear that our secular government does not instruct its citizens what to believe or what spiritual path to follow, that the whole concept of religious freedom only works if the sole role the government plays in the internal workings of American faith communities is to play no role at all. If Muslims wish to renounce jihadism and terror, then they are going to have to stand up and be counted…on their own and in their own communities and mosques.

Just recently, I read about something called the Muslim Reform Movement, a tiny organization headed by just fifteen Muslim leaders from the U.S., Canada, the U.K., and Denmark that has begun to take matters into its own hands to foster a version of Islam that is liberal, tolerant, and broad-minded. (To see more about the organization, click here. To read a very interesting editorial that appeared two weeks ago in the Boston Globe about the group, click here.) I know that many of us view efforts like this with extreme skepticism. I feel that way myself. And, given the fact that there are something like 1.6 billion Muslims in the world, the influence of these fifteen brave souls will be, at least at first, severely limited. Still, the solution cannot be imposed from without: what Islam will be like in 2070, or thereabouts, when the number of Muslims in the world surpasses the number of Christians, is in the hands of today’s Muslims. Merely paying lip service to pluralism and tolerance will not be anywhere near enough. And, yes, to raise the issue that (at least for most) dare not speak its name, leaving Israel out of the mix would also constitute a grave error: if Muslims are going to foster an American version of Islam that is truly pluralistic and progressive, then they are going to have to find a way to embrace the reality of Israel and the presence of the Jewish state among the nations of the world. Absent that, the whole undertaking will be, at least as far as I myself am concerned, doomed to irrelevance. If I can live with an Islamic Iran, then America’s Muslims can live with a Jewish Israel.


American Muslims do not need to be patronized by the government with special White House photo ops; they need to be left alone to chart a course forward that will affect the history of the world in a positive way by renouncing violence and terror…and embracing the core values that rest at the center of American culture, with the separation of church and state foremost among them. Many of you—both congregants and readers—have responded negatively when I’ve written or preached about this possibility in the past, expressing the notion that I am living in a fool’s paradise if I think that Islam could possibly embrace the liberal values that are the beating heart of the Western democratic enterprise. I suppose I could be. (I’m a rabbi, not a prophet!) But the Pew Research Center projects that there will be 2.8 billion Muslims in the world by mid-century…and that number makes it crucial for us in this country to support the moderates and liberals who would reform Islam. Could these people succeed? It is hard to say. Certainly, the odds are against them. But it is precisely in our country, where the wall between religion and government was meant by our founders to be iron-clad, that the kind of protestant Islam that the world so desperately needs could possibly take root and flourish. The chances of success are not good at all. But not good is better than non-existent…and so, as a new year dawns on our troubled land, I suggest we take “not good” as the best option available and see how far we can get.

Thursday, December 17, 2015

Compassion

I have generally been an admirer of Dennis Prager’s writing, and particularly of the books he jointly authored with Joseph Telushkin. Nonetheless, I found myself aghast at a piece he published the other week on the JewishJournal.com website, the on-line presence of the Los Angeles-based Jewish Journal, in which he writes acidulously about people who wish to find a dignified place in the world for transgender people. He admits readily that it must be “awful” to go through life possessed of the conviction that you are a prisoner in your own body, that your gender and sex are so out of sync that you can’t find a place for yourself in the world, that neither of the doors at the end of the hall leading to the restrooms (the one labelled “Men” and the other, “Women”) feels as though it describes you in quite the same way it appears to describe everybody else in the world, or almost everybody. But when he turns his withering gaze to people (like myself) who feel for such people and wish to find a way for them to function in society other than as outcasts and freaks (and other than by telling them they simply can’t go to school, can’t join a gym, can’t use a public restroom, can’t frequent a public swimming pool, etc.), he seems to have forgotten the pain that he himself acknowledges surely must result from feeling trapped in your own skin and writes as though gender dysphoria were just another thing someone somewhere made up to justify special treatment for some tiny group of whiners who don’t want to play by the same rules as the rest of the world.

Then, to add fuel to his fire, he turns to his readers and attempts to explain how, given the Torah’s prohibition of crossdressing, any Jewish person could possibly fall for the whole transgender scam in the first place. (The fact that transvestism and gender dysphoria are not at all the same thing appears unknown to the author.) First, he suggests shamelessly that those who don’t share his view about transgender people must obviously also believe, and I quote, that “the Torah is essentially useless as a guide to living,” and that, whenever their own opinion differs from that put forth in Scripture, they must be the kind of spiritual egotists who simply assume that the Torah, not they themselves, must be wrong. And then, as if that line of thinking weren’t insulting enough, he offers an alternate explanation: that any who feel for transgender people must clearly have been tricked by their own sense of compassion into betraying the values of their faith and their God.

I don’t want to write here about the issues of transgendered people per se. It’s a big topic that I hope to address in more detail at a later time and it’s also true that my own thinking is evolving slowly as I learn more and read more. Instead, I’d like to discuss the concept of compassion…and particularly in light of the suggestion in Prager’s article that allowing one’s sense of compassion to justify the effort to bring one’s allegiance to Scripture into sync with one’s sense of right and wrong is a sign of spiritual depravity, or at least of one’s arrogant assumption that one is more able than God to find the boundary line between right and wrong.

Compassion, for most people, is an apple-pie value, a moral attribute so unambiguously virtuous as to make it odd even to question its worth. The word in English suggests as much: the “com” part means “with,” and the “passion” part comes from the Latin word for “suffering.” Thus “compassion” is the quality of being able not solely to suffer in your own right, but to feel—or at least to feel sensitive to—the misery also of others. Compassionate people, therefore, live at the intersection of sympathy and empathy, always trying to be guided not only by the way they themselves feel, but also—and perhaps even more so—by the way they imagine other people feel. The compassionate individual, therefore, is someone who understands that kindness does not imply moral weakness, let alone depravity, but moral strength: it is the quality of seeing the world through another’s eyes and acting accordingly not because one is too stupid to have an opinion of one’s own, but because one has enough respect for others also to respect their opinions and the specific way they view and interpret the world.

To apply the concept to the transgender people of this world is not to be weak-willed or foolish, let alone immoral. It is merely to look at someone suffering in the world and, instead of mocking that person for having to struggle with issues with which you yourself have been spared from having to grapple, to find it in your heart to wish for that person to find a path forward in life that does not involve endless degradation or self-denial. Prager’s coarse prediction that treating transgendered people with compassion will lead directly to schools being required, eventually by law, to allow young people with boys’ bodies to parade around naked in the girls’ locker room seems beyond exaggerated: the idea that embracing compassion will lead directly to public vulgarity seems to me to posit a bizarrely narrow sense of how people suffering from gender dysphoria—and particularly young people—could be helped without that help impinging on the natural rights of all people to feel secure and safe in public washrooms and in their swimming pools’ changing areas and in the locker rooms at their gyms. (The locker room issue is real, to be sure: click here and here. My point is that there’s something inherently bogus about the supposition that the only choices are to tolerate inappropriate, unsettling behavior or to treat transgender young people harshly and without compassion. Surely, a nation as clever as our own can come up with a solution that leaves the dignity of all parties intact!)

And then Prager goes on to give another example of misplaced compassion leading its adherents down the road to perdition: race-based affirmative action. Affirmative action is a complicated concept, one whose constitutionality the Supreme Court itself is currently attempting to unravel. Nor is it obvious, constitutional or not, how effective a tool it actually is. The latest argument against it, usually referenced with the word “mismatch,” implies that giving students drawn from underrepresented minorities places in colleges to which they might not otherwise be admitted is actually a disservice to them, since they cannot possibly compete with the “regular” students who got into those schools in the normal way and without any extra help. This, in Prager’s opinion, constitutes yet another example of how compassion can lead past “just” political correctness to actual harm.

Chief Justice Roberts makes sense to me when he notes that “the way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” And there are surely many real reasons to consider the whole “mismatch” issue seriously. Just lately I’ve read two pieces on the topic on the website of the Washington Post which impressed me and which I recommend to my readers. In one of them, Richard H. Sander, a professor at the U.C.L.A. School of Law, argues persuasively that the best interests of minority students are not served by helping them into schools in which they are unlikely to succeed. In the other, by Richard Rothstein, senior fellow at the Chief Justice Earl Warren Institute on Law and Social Policy at the University of California [Berkeley] School of Law, the author makes an equally persuasive argument that the whole “mismatch” issue is exaggerated and that the only practical way to deal with the hurdles young black students face as they make their way forward in the world is to make sure that they are not denied educational opportunities that by all rights should be theirs. Both make excellent arguments, but, regardless of which view eventually prevails, the underrepresentation of certain racial, ethnic, and social-class-based minorities in our best universities is a real issue to be pondered by Americans devoted to equality and, yes, possessed of a spirit of compassion for the disadvantaged. (To see Sander’s piece, click here. To see Rothstein’s essay, which originally appeared in American Prospect, click here.) In any event, both authors clearly have the same larger goal in mind: the creation of a color-blind society in which race neither enhances anyone’s chances for success nor detracts from them. Nor would either argue, I suspect, that there is anything base about feeling compelled by one’s sense of compassion for the underprivileged to work for a more just society. To determine how best actually to help is a different issue. But to argue that compassion itself is the problem is, at best, a perverse argument to make at all, and particularly for someone as steeped in Jewish tradition as Dennis Prager.


Prager’s essay ends with a rhetorical question: “If the Torah is not our guide, who or what will be?” It sounds like such a simple choice when put that way: either we embrace the Torah and allow it to guide us forward, or else we discard it and choose a different book or individual as the font of wisdom from which we drink and as our moral guide through life. But that is more of a fool’s choice than a serious one. We hold fast to the Torah as our tree of life and we endlessly study its intricacies and riddles. But for all our endless lip-service to the notion of the Torah as God-given Scripture suffused with its Author’s divine spirit, we are not biblical Jews whose sole allegiance is to the simple meaning of Scripture and neither have Jews ever believed that it could be possible to be faithful to God’s law while behaving immorally at the same time. To look at someone who is riven with conflict about his or her gender-identity and not to respond with kindness, with compassion, and with a willingness to work to find a way for such people to know the kind of inner peace that comes naturally to people not afflicted with gender dysphoria—that would be to turn our back on the lessons the Torah teaches: to see the divine image in all humankind…and to bring compassion and kindness to bear in evaluating the downcast and the marginalized in society. And it would also be to ignore the fact that being compassionate is specifically listed in Scripture as one of the thirteen attributes of God, a virtue therefore to be cherished and embraced by all who would walk in God’s ways.

Thursday, December 10, 2015

Chanukah 2015

As I was reading the paper the other day, I unexpectedly came across an article about some parallel scientific studies being undertaken in Denmark and in our country. At first, they sounded like the kind of detailed, complicated studies which only scientists could love…or even understand. But then, upon further reflection, I found myself drawn to them and wishing to learn more. And then, entirely unexpectedly, my thoughts turned to one of the riddles of Chanukah…and I found a plausible answer sitting right before my eyes.

When I was in high school, the concept of genetic heritage was presented to us as a kind of code embedded in our cells that we are able to pass along to our offspring if and when we manage to reproduce. As opposed to, say, citizenship, which can be passed along from parents to children but which has no physical aspect to its existence, we were taught to think of our genetic heritage as something fully real in the physical sense (because genes, teensy-weensy though they may be, exist as actual, physical things) and thus not that different from money or property or any other part of a parent’s estate that a child might acquire as a gift from a still-living parent. 

How it all worked was a bit mysterious, surely more than slightly arbitrary. Unless they are identical twins, for example, siblings receive different sets of these gifts from their same two parents. This accounts for the differences between them and was explained to us with reference to the fact that children have two parents, not one, and that the various parts of those parents’ genetic heritage combine in different ways on different conceptive occasions to create different genetic gifts to a couple’s different children. But our genetic heritage was presented to us not only as arbitrary, but also as immutable: you can do what you can to resist the siren call of your genes but they constitute a gift—generally some combination of blessing and burden—that cannot be altered, only inherited and gratefully accepted, actively resisted or passively given in to. I didn’t really understand the whole thing then and I’m sure I don’t fully understand it now. But one thing that was completely clear, even to my tenth-grade self, was that genetics is unalterable destiny, something to be pleased about or struggled against but about which you can’t do a damn thing! Nor, needless to say, can you control the contents of your own future genetic gift to whatever offspring you may eventually produce.

Apparently, I was wrong. In 2010, several professors at the University of Copenhagen found that they could alter the sperm of male rats not by addressing their genetic make-up at all but rather by subjecting them to different sets of experiences. One set of rats, for example, was made obese by being fed very high-fat foods. This was a post-birth phenomenon, obviously. So, at least theoretically, the rats—none of which was predisposed to obesity—should not have had a higher percentage of obese offspring than rats that were fed a normal diet. But they did. And so began a long, complex set of experiments intended to determine if the genetic heritage bequeathed to offspring can be altered by experience. In 2013, a group of scientists led by Adelheid Soubry, a molecular epidemiologist at the Duke University Medical Center in North Carolina, attempted to perform a similar experiment on human subjects and concluded that experience can indeed alter a man’s sperm in a way that affects the genes a man bequeaths to his offspring. And now the Danes have published a study in a very respected journal, Cell Metabolism, that supports that conclusion. (The science is complicated and I won’t attempt to review it here. It has to do with the way sperm is or isn’t altered by experience to bring certain features of that man’s genetic heritage to the fore. The genes themselves are not supposed to alter in response to experience. But if the specific way they configure in the context of reproduction can be weighted differently by some specific experience that the man in question has had, then it more or less comes to the same thing. Or at least it does from the vantage point of the embryo that inherits that man’s DNA configured differently than it might otherwise have been.)

Others are less sure about how meaningful the results really are. Many of the arguments against accepting the results of these studies are very complex but, to the extent I was able to follow them, also very interesting. To learn more about these studies, both for and against, click here to read Carl Zimmer’s article, published in the New York Times last week, that first drew my attention to the research. To read a précis of the Cell Metabolism article (not recommended for people who last encountered the study of biology in tenth grade), click here.

I’m hardly in a position to offer an opinion about the worth of the research, but I find it fascinating nonetheless…and not solely because of its implications for our understanding of the human reproductive process. What I find fascinating is the possibility that the role of experience might be no less meaningful on the national level as a people moves forward through history and bequeaths its national culture to new generation after new generation.

There’s no question that Judaism itself—as well as its much maligned stepsister, Jewishness—has developed over the millennia. Every student of the Bible can see how different modern Jewish religion is from the faith depicted in the pages of Scripture. But Judaism today isn’t only different from the Israelites’ religion in biblical times. It is also dramatically different from the Judaism described in the Talmud and even, in profound and meaningful ways, from the Judaism of medieval times. That religions develop over dozens of generations is hardly a great discovery. But what makes religions develop in the specific ways they do develop? What makes some innovations successful and others wholly unsuccessful? Why does an entire people barely pause to notice when whole bodies of scriptural law are summarily dropped—I’m thinking, for example, of the elaborate laws that the Torah sets forth governing inheritance, laws more or less universally ignored today including in the most pious circles—while other practices dating back only three or four centuries have not only established themselves as authentic Jewish rituals but are universally observed in every synagogue community? Are these developments entirely arbitrary? Or is it possible that experience shapes the genetic code—or whatever you’d call it on the national level—that passes silently and subtly from generation to generation? In other words, we are used to thinking of history as the result of Jewishness—what happened to us being a function of who we are—but what if the reverse were true (or also true) and history were rationally to be understood as the set of nation-wide experiences that rests invisibly at the generative core of Jewish life not unlike the way the sperm itself that conveys a man’s genetic heritage to his newly conceived child vanishes into the embryo and is never heard from again other than by manifesting itself in the nature and culture of the man that embryo eventually grows to become?

It would be interesting to think of Chanukah in that vein. It’s not a biblical holiday. Lots of other events of arguably equal importance historically failed to turn into holidays. (One of the few books from outside the standard rabbinic literary corpus to survive intact from the early rabbinic period, Megillat Taanit, is basically a detailed list of thirty-five such politically and historically important days.) Chanukah should have been in that category—a week of days on an ancient list during which eulogizing and fasting were forbidden because of some positive historical event that once happened. But somehow that isn’t what happened. The experiences of exile and restoration, of being assaulted by a hostile culture and having to find a way to preserve our national cultural heritage despite the pressure to adopt what is touted to us as “world” culture (and thus by definition something superior to our rinky-dink set of beliefs, customs, stories, and ceremonies), the experience of finding the courage to stand up to the world and refuse to vanish merely because a set of self-appointed pundits can’t understand why we wouldn’t want to be a modern nation according to their definition of the term…that set of experiences related to the nation growing up spiritually, nationally, militarily, economically, and, if one can say such a thing about nations, emotionally…that was something that shaped our national DNA permanently and left us different than we otherwise might have been.

That a man’s experiences in life can alter the destiny of his children by affecting his sperm in specific ways is a tantalizing notion. Whether it’s true, who knows? But that the same could be true of national cultures—that they are not so much the source of national experience as they are the product of those experiences’ effect on the transmission of that culture to subsequent generations—that theory strikes me as truly tantalizing. It could go a long way to explaining why Chanukah, which shouldn’t really have been a festival in the first place and which certainly doesn’t feel like it merits the major place on the Jewish calendar it now occupies, has taken such a prominent place in our festal calendar. The rabbis of ancient times had no difficulty permitting the blessing recited while the Chanukah candles are lit to refer to God as having commanded us to kindle them. That that commandment appears nowhere in the Torah, which fact the rabbis surely knew perfectly well, makes perfect sense—Judah Maccabee lived a full millennium after Moses. But perhaps the rabbis were onto something nevertheless. Could it be that God ultimately sanctifies the House of Israel specifically by allowing this concept of experience-altered reality to guide the nation’s religious practices? Could the plan all along have possibly been that, no matter how far afield of Scripture Jews allow their faith to develop, they will always feel themselves under God’s watchful protection and truly to be sanctified by God’s commandment to act in harmony with their own historical experience and its exigencies? That is the thought I offer you to ponder as Chanukah draws to a close and we move on to less festive weeks and, presumably, the eventual arrival of “real” winter.


Thursday, December 3, 2015

As the Seas Rise

Every marriage has its compromises, and one of ours has to do with so-called “disaster” movies. I am drawn to them and Joan (occasionally) endures them. She uses the unappealing expression “disaster porn” to characterize, derogatorily, this specific peculiarity in my set of otherwise urbane and sophisticated artistic tastes, and I keep my peace. I recall the rabbi who married us pointing out when we met him before the wedding that compromise in the context of marriage doesn’t mean meeting each other halfway exactly, but rather requires that each party go a good three-quarters of the way towards the other’s position so as to create a huge swath of middle ground that can easily accommodate inexactitude in terms of just how far one is prepared really to give in to effect the compromise in question. It was good advice. I offer it to my own brides and grooms all the time. I recommend it highly to all my married readers. But why I am drawn to these generally terrible movies…that is the more interesting question to ponder.

Knowing, or at least sensing on some level, that this all has to do with my obsessive reading regarding the Shoah and its horrors, I suppose I like these movies because, oddly, they all have happy endings. In Armageddon (1998), an asteroid the size of Texas threatens to destroy all life on earth, but Bruce Willis—albeit at the cost of his own life—saves the day at the very last minute. In Deep Impact (1998), a comet plunges into the Atlantic and creates a kind of mega-tsunami that devastates life on the Atlantic coasts of North America, South America, Europe, and Africa…but a last-ditch effort to blow up the second, larger piece of the comet—one that would have finished off all the first impact’s survivors—actually succeeds and life on earth ends up going on after all. In Independence Day (1996), it’s a fleet of huge, hostile (very hostile!) alien warships that attack earth and threaten to destroy life as we know it, but Randy Quaid, also at the cost of his own life, saves the day by flying his jet into the invaders’ primary weapon, showing the world how the aliens’ spaceships can be destroyed once and for all. In The Day After Tomorrow (2004), it’s the weather—a lot of weather!—that renders most of Asia, Europe, and North America uninhabitable. New York turns into an arctic wasteland with a daily mean temperature of -98° F., but eventually the storms abate. Survivors are located. The President returns from his Mexican exile. The effort to rebuild commences. Life goes on.

I could go on too. Contagion (2011) was about a deadly virus that is only eventually neutralized. Volcano (1997) was about a volcano that suddenly erupts in downtown Los Angeles and wreaks unimaginable havoc. I even liked Pompeii (2014), which at least spared us the expected treacly ending as all the principals end up engulfed in the pyroclastic flow. But at least the rest of the empire survives! When I force myself to think clearly, I suppose the Shoah connection isn’t all that hard to explain either. What student of Jewish history could not like movies featuring horrific forces that threaten to destroy life as it was known in some specific place (Earth, Pompeii, L.A., etc.), but that in the end are themselves always defeated? There are always survivors. Life always goes on. Indeed, as the credits roll, life is always already going on. And it is that weird combination of terrifying and uplifting, of horrific and hopeful, of unspeakable and encouraging that seems to draw me to these movies both as a lover of exciting movies and as a student of Jewish history.

But no one had to pay to see this week’s disaster epic unfold: all anyone had to do was turn on the television or open a newspaper to peruse the reports from Paris—how quickly the phrase “reports from Paris” has come to mean something entirely different than it did two weeks ago!—as news out of the Climate Change Conference being held in Le Bourget, a suburb of Paris, from last week through next week was disseminated through the world’s media.

The basic concept is simple. The world, acting in concert, needs to find some way effectively to stem greenhouse gas emissions lest we experience—and not in the distant future either but in most of our lifetimes and certainly in our children’s and grandchildren’s lifetimes—horrific things on a terrifying scale: extreme weather, worldwide drought, massive wildfires, disruption of the food supply, the spread of dangerous pathogens, and a rising sea that could eventually submerge many of the world’s greatest cities. To cut back these emissions to a level that the planet could manage to absorb without raising average temperatures would require a gargantuan amount of good will among nations that would be basically unparalleled in the annals of human history. That is unlikely enough, but the fact that the conference is being held under the auspices of the United Nations makes it feel even less likely that anything good will come of it. Nor is the history of efforts to address the problem on the scale necessary to make a difference at all encouraging. The 1997 Kyoto Protocol was a promising start, but the United States never ratified it (on the grounds that it placed an unfair burden on developed nations and risked seriously harming the U.S. economy) and developing countries like China and India—among the world’s worst greenhouse gas offenders—weren’t obligated to cut their emissions at all. The 2009 Copenhagen Conference devoted to the same issues on the table today led more or less nowhere. And yet, more than 150 nations have stepped up and offered to do at least some of what it’s going to take to save the world from itself. Presumably, more concessions will be wrung out of the willing participants before the conference ends a week from today. Altruism among nations being even rarer than altruism among individuals, it all feels like a huge long shot…and yet if enough nations truly come to believe that the fate of the planet really is hanging in the balance, perhaps the results will be at least enough to make some sort of meaningful difference. Don’t these movies always end up with the world being saved?

For me personally, it’s the image of the sea rising that makes the greatest impact. Perhaps it’s a biblical thing. The trope of the sea rising and the poet’s sense of himself about to drown only to be saved at the very last minute is, after all, a regular feature of biblical imagery. And the prayers that are triggered by the fear of drowning as the water rises are as heartfelt as they should be famous. When, for example, the ancient poet whose poem became our sixty-ninth psalm wrote “let the deep not swallow me / let the mouth of the pit not close over me / answer me, O Lord,” it’s hard for people who take the dangers of climate change seriously not to empathize, and deeply. Or consider Jonah’s heartfelt prayer: “You flung me into the depths of the sea / so that its currents surround me / and its waves pass over me…/ I feel the water rising to take my life / the depths slowly encompass me / seaweed swirls around my head / I can see the mountains rising from the floor of the sea / the earth is sealed off from me….” But Jonah, of course, was saved from death in what would otherwise have been his watery tomb. And, indeed, the story of only almost drowning is a feature of Israelite history written small and large: first Moses almost drowns and is saved by Pharaoh’s daughter unwittingly acting as God’s agent of salvation, and then the entire people Israel itself almost drowns and is only saved because God creates walls of water that enable them to cross the seabed to safety. And, of course, the great exception merely proves the rule—the death of all people in the world but eight in the days of Noah’s flood—by reminding us that the waters rose once and could conceivably rise again. At the end of the story, after all, God’s promise not personally to annihilate humankind again with a flood does not mean that humankind will not be able to accomplish that all by itself!

As always, there are naysayers who insist that the governments of the world are cooking the books to create an atmosphere of world-wide hysteria they will then exploit for their own ends. But, at least as far as I can see, an overwhelming number of scientists, including those who do not see any evidence of a significant rise in sea levels in the course of the nineteen centuries that preceded the twentieth, believe that the sea level has been steadily rising since 1900 at somewhere between 0.04 and 0.1 inches a year. This phenomenon, they explain, is caused basically by two factors, both triggered by human activity: one-third of the increase is due to the sea itself expanding as it warms, while the other two-thirds has been caused by glaciers and other land-based ice formations (particularly in the Arctic and Greenland) melting. (For more details, click here to read a page of facts produced on the topic by the National Ocean Service. And it also bears noting that the sea ice surrounding Antarctica is growing, not shrinking…but click here to see NASA’s explanation of how that unexpected detail too fits into the larger picture of a warming planet.) Just to put things into even more vivid perspective, scientists have concluded that if just the ice sheet covering Greenland alone were to melt completely, world sea levels would rise about twenty feet. Since about 634 million people live less than thirty feet above sea level, that’s a pretty terrifying statistic. All in all, the prospect of a rising sea is beyond terrifying, and not solely for the 44% of the world’s population that lives within ninety miles of the sea…and I say that not only as someone who lives on an island jutting out into the ocean, but as a member of the global community.
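For readers who, like me, need to see the arithmetic spelled out, here is a minimal back-of-the-envelope sketch in Python built solely from the figures cited above: the 0.04-to-0.1-inch annual range and the one-third/two-thirds split between expansion and melting. The assumption of a perfectly steady rate is mine, made purely for illustration; actual year-to-year measurements vary.

```python
# Back-of-the-envelope arithmetic for the sea-level figures cited above.
# Assumed inputs (taken from the paragraph, not from any dataset): a steady
# rise of 0.04 to 0.1 inches per year since 1900, one-third of it from the
# thermal expansion of warming seawater, two-thirds from melting land ice.

YEARS_ELAPSED = 2015 - 1900  # years since 1900

for label, inches_per_year in (("low estimate", 0.04), ("high estimate", 0.1)):
    total = inches_per_year * YEARS_ELAPSED  # cumulative rise, in inches
    thermal = total / 3.0                    # one-third: expanding warmer water
    melt = total * 2.0 / 3.0                 # two-thirds: glaciers and ice sheets
    print(f"{label}: {total:.1f} in. total "
          f"({thermal:.1f} in. from expansion, {melt:.1f} in. from melting)")
```

Even the high estimate works out to less than a foot over the past century; the truly frightening numbers come not from the historical trend but from what the melting of the great ice sheets could add to it.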


Will something meaningful come out of the Paris conference? It’s hard to say. The Pope is on board, having described a world-wide failure to produce profound change as an act of global suicide. So are more or less all the leaders of the free world. But there are plenty of nay-sayers. Some (although fewer and fewer) doubt the science. But others are opposed for other reasons entirely. In our country, for example, the House of Representatives just this week passed a pair of resolutions that would forbid the Environmental Protection Agency from implementing the rules announced earlier this year by the President to curb greenhouse gas emissions. The arguments at home and abroad against committing to profound cuts in greenhouse gas emissions are the same ones leveled against Kyoto: the developed world is being asked to shoulder an unfair part of the burden, and the responsibility of the government of every nation, including our own, is to act on behalf of its citizenry…which means declining to take actions that will harm the national economy. And yet, this really isn’t a movie. The waters really are rising. The nations of the Pacific Islands—places obscure to most of us like Tuvalu, Tonga, and Kiribati—are already contemplating the possibility of disappearing from the map entirely as the waters cover over their landmass and leave them to exist solely in the realm of history. Whether the Kiribatians will, like Jonah, be saved in the end by being swallowed up by a giant fish seems, at best, unlikely. But disappear they surely will—or at least their country will—as the waters rise and the world, focused as always on the bottom line, dithers. Or could the future unfold differently? The answer will be available to us all next week!

Thursday, November 19, 2015

Paris

Last week, I wrote about the cui bono concept—the presumption that wrongdoers generally do wrong because they rightly or wrongly expect some benefit to accrue to themselves as a result—and I applied it to the spate of untruths that seems just lately to have been functioning almost as the stock in trade of several of our most prominent presidential candidates. Today, I’d like to apply that same principle to the perpetrators of the horrific events of last Friday evening in Paris, events that cost 129 innocents their lives and left 352 others wounded, some of whom have died in the last few days and many of the rest of whom are still hospitalized in critical condition.

Let’s start, however, not in Paris but in Israel, where Rabbi Yaakov Litman and his eighteen-year-old son Netanel were also murdered last Friday. The rabbi and his son were ambushed by terrorists as they were driving last Friday afternoon to Meitar, a small town of 7,500 just northeast of Beersheba on the always-Israel side of the so-called Green Line. Leaving out the horrific detail that a Red Crescent ambulance, the Palestinian equivalent of the Red Cross or the Magen David Adom, apparently sped past the scene of carnage without stopping, I’d like to view Rabbi Litman and Netanel’s murder through the cui bono lens by asking what their murderers expected—or at least were hoping—to accomplish. Did they hope to bring Israel to its knees by murdering two innocents? Did they imagine that the settlers in Judea or Samaria would be so shaken by their dastardly deed that, cowering in fear, they would simply respond by packing up their things and moving back to pre-1967 Israel? Surely the answer to both questions is no…but that leads us to ask what they did expect to accomplish. Did they consider the murder of Jews simply to constitute its own reward? Or is there an even more sinister, or at least more calculated, motive lurking behind the latest spate of terror attacks in Israel on both sides of the Green Line?

Let’s hold that thought while we consider Paris. The basics, everybody by now surely knows. Seven coordinated attacks at several different locations in and around the City of Light. Names unfamiliar to most of us just a few days ago—the Stade de France football stadium, the Bataclan concert venue, the Petit Cambodge restaurant, the Belle Equipe and Le Carillon bars, the Casa Nostra pizzeria, the Eagles of Death Metal band, the Molenbeek neighborhood of Brussels, the Saint-Denis suburb of Paris—are now so familiar that it seems odd to think that, with the exception of Saint-Denis (home to the famous Gothic cathedral in which all but three of the kings of France are buried), I myself hadn’t heard of a single one of those names just a week ago. The simple, unadorned words of the president of France, “La France est en guerre” (“France is at war”), were still ringing in my ears when, to my semi-amazement, he went on to back up his words with deeds by actually going to war and attacking ISIS bases in Syria vigorously and violently, then by invoking Article 42.7 of the 2009 Lisbon Treaty, which requires every member state of the European Union to respond to armed aggression against any other member “by all the means in their power,” and then by ordering hundreds of police raids against potential terrorist targets within France. Clearly, when President Hollande unequivocally labelled the events of last Friday as “acts of war,” he meant that literally and plans to conduct himself accordingly. Where France and the rest of the world, including our own country and Russia, go from here remains yet to be seen. But that France means to deal with the threat against its citizens forcefully and meaningfully seems beyond doubt.

As all of this was unfolding, I was still in the middle of Michel Houellebecq’s novel, Submission. (The author’s name is pronounced as though it were written “Wellbeck.”)  I’d read one other of the author’s books, his novel The Elementary Particles, which I found so vulgar as to border on the pornographic. (In my defense, I should mention that this is the kind of high-class porn that wins international literary prizes, not the kind they used to sell in midtown sleaze shops before Mayor Giuliani cleaned things up. But even so I’m still a bit embarrassed to admit that I read it through to the bitter end.)  Submission also has its share of crude passages, but is a bestseller throughout Europe—and particularly in France, Germany, and Italy—and is now available in our own country in an English translation by Lorin Stein published just two months ago.

The novel is set in 2023 and unfolds as the Muslim Brotherhood, evolved by the early 2020’s into a French political party, unexpectedly gains enough votes in a national election to make it almost impossible for the actual victors to form a government without them. This kind of parliamentary wrangling will be familiar to any who follow French (or Israeli or Canadian or British) politics, but what is truly shocking here is the easy plausibility of it all. The Socialists win the election, but are obliged to choose between the Muslim Brotherhood led by a fictional Mohammed Ben Abbes and the far-right Front National led by the very real Marine Le Pen if they wish to form an actual government. They weigh their options, but in the end determine that they will have a better chance of remaining in power if they choose the Muslim option, which they do. In a matter of weeks, Ben Abbes is the president of France. What happens next is both predictable and horrific. The unemployment problem is solved by eliminating most women from the work force. The national deficit is eliminated by ending mandatory education at age twelve. The problem of anti-Semitism is “solved” by encouraging Jews to immigrate to Israel. The university system is closed, then re-opened as a national grid of Islamic universities with exclusively Muslim faculty members (most of whom are merely the teachers from the previous system who have chosen to convert to Islam, a conversion that appears to require almost nothing at all other than a public declaration of willingness to embrace Islam). By the month, France grows closer and closer to re-attaining its nineteenth-century glory as other European countries install Muslim governments and as Morocco, Turkey, and Tunisia join the EU. French itself regains the supremacy it once had as the language of diplomacy and world trade.

Of course, the book’s premise is that France has been and still is so inept at integrating its Muslim population into the fabric of French society that if the Muslim Brotherhood were to become an actual party, the entire Muslim population of France—something like 4.5 million people—would vote en bloc for its list of candidates. (Of course, by 2023, the number of Muslim citizens in France will be that much greater, particularly if large numbers of refugees from Syria and other battle-torn Arab lands are admitted.) Whether that is a reasonable supposition or not is hard to say. Surely not every Muslim would vote for a Muslim party! But it is also true that the Muslims of France have not been well integrated into society and that very large numbers feel themselves to live outside of the intellectual and social milieu that non-Muslim French citizens consider their natural cultural climate.

Events like last week’s horror in Paris constitute a major challenge for French society as a whole. If President Hollande’s war against ISIS goes well, then the nation will be able to unite behind that victory. But if it does not go well, and if large numbers of French citizens succumb to base prejudice and end up further marginalizing France’s millions of Muslim citizens, thus alienating them even more, then Houellebecq’s premise—that if and when a serious Muslim political party constitutes itself as a force to be reckoned with in French politics, the Muslim citizens of France will automatically and eagerly vote for it—may well move from the stuff of novels to reality.

The challenge facing France, therefore, is twofold: to pursue its war against ISIS in the Middle East and in Europe and to pursue it relentlessly and with unyielding determination…but also to reach out to French Muslims and to invite them to join the battle against violent, Islamicist extremism. The time has clearly come for the Muslims of France to decide as a community where they stand and to what degree they are willing to stand by their countrymen in a battle against their own co-religionists. And that brings me back to the cui bono question that I asked earlier in this letter but didn’t answer: could the goal of this kind of horrific terror be specifically to goad non-Muslim France into creating the kind of illiberal atmosphere that could conceivably make the scenario presented by Michel Houellebecq in Submission a reality? That—and not the supposition that terror attacks against random civilians are undertaken merely to terrify—strikes me as a rational response to the cui bono question that logic tells us must always be reasonably asked in the context of criminal acts.

And that brings me back to Rabbi Litman and his son. Why would anyone choose a car at random and murder its driver and passenger? Could the “real” goal of those attacks not be to kill this or that person, but to make the Israeli public even less likely to see the Palestinians as worthy partners in peace…or even the kind of people with whom one could live peacefully in adjacent nations? The mood in Israel is grim these days as random violence against civilians is on the rise. How could it not be? But the real challenge in these acts of random terror is not to find a way to legitimize the demonization of an entire people, but to find a way to combat the bad guys and to encourage the “regular” Palestinians to seize the reins of leadership and to negotiate a lasting, just peace with Israel.


Terrorist acts are not random acts of natural violence like hailstones or earthquakes, but focused, intentional deeds intended to make less likely the kind of peaceful coexistence between nations and peoples that extremists fear the most. France is entirely justified in its decision to go to war with ISIS. The question is whether the effort will make France stronger by making ISIS weaker…or whether it will just weaken ISIS in Syria or wherever, but leave the millions of disengaged, disenchanted Muslims in France more than ready to make real Michel Houellebecq’s dystopian fantasy. That is the question that churns and roils deep within as I contemplate the events of last Friday evening against the background of having just read Submission and internalized its dark, frightening message.

Thursday, November 12, 2015

Cui Bono?

Those of you who like a certain old-fashioned style of mystery writing will be familiar with the words cui bono, the Latin for “to whom the good?” or, more colloquially, “to whose benefit?” It’s an old expression, going all the way back to Cicero, who once praised a contemporary judge for insisting that all the evidence adduced in his court be focused through the prism of those two words, basing himself on the assumption—as true then as now—that when people break the law it is always because they perceive some benefit for themselves in doing so. And thus it should logically also be reasonable to travel that path in the opposite direction: to consider the crime and to ask who exactly benefited from it because, at least in most cases, the benefited party will almost always be the instigator of the crime…and possibly even its actual perpetrator.

When applied to crime, the idea seems simple enough. Thieves steal things because they wish to possess those things and presumably have no other way to acquire them. Even murderers murder because they perceive some advantage that will accrue to them upon the deaths of their victims and are willing—at least in some jurisdictions—to risk the death penalty to derive that benefit. And the same feels as though it should be true about lying, that people tell lies because they see themselves profiting in some way by doing so. It’s certainly true of perjury: there are very grave penalties for willfully lying in court while under oath and it only seems reasonable to imagine that a citizen would risk those penalties solely because of some huge potential benefit imagined likely to come from passing off some lie as the truth. Why else would anyone risk huge fines and years of incarceration by lying in court? Surely not because they don’t see any advantage in doing so!

Often, applying this principle of cui bono (the first word is pronounced in one syllable to rhyme with “twee”) to lying outside of court is a no-brainer as well. Between 2009 and 2015, for example, the Volkswagen Group in effect lied to the world by programming the diesel engines featured in many models of its cars to detect when they were being tested and to switch into a low-emissions mode for the duration of the test, thereby masking the level of emissions actually given off on the road. Who stood to benefit is too obvious a question even to ask out loud—they themselves did, of course, managing to sell eleven million such cars worldwide, including half a million in the U.S., by giving the false impression that those vehicles met standards that they in fact did not meet…and did not meet by staggering amounts. (In some cases, the vehicles in question actually emitted forty times the amount of nitrogen oxides that the tests indicated.) In a different context entirely, one could say the same thing about the Grand Mufti of Jerusalem, who went on Israeli television just last week to insist that neither Solomon’s Temple nor the Second Temple ever stood atop the Temple Mount in Jerusalem. He may have overplayed his hand just a bit by insisting that there has been a mosque on the site since “the creation of the world,” but the lie itself—a theory supported, contra the New York Times, by no legitimate historians, scholars, or archeologists, and with no exceptions at all—clearly responds well to the cui bono test: by insisting that there never was a Temple on the Temple Mount, the whole concept of Jerusalem being the holiest of holy cities for Jews becomes a meaningless concept founded on self-serving myth rather than on historical reality…and it doesn’t take much insight into Middle Eastern politics to know whom that lie benefits. Why the mufti imagines anyone would have built the Western Wall if there was nothing atop the mount for its massive stones to support is hard to say. Perhaps the Kotel doesn’t exist either!
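To make the mechanics of the deception concrete, here is a hypothetical sketch of how “defeat device” logic works. Volkswagen’s actual code has never been published, so everything below (the function names, the sensor readings, the test-detection heuristic) is invented for illustration; only the general principle, a car that behaves one way when it believes it is being tested and another way on the road, is documented fact.

```python
# Hypothetical sketch of "defeat device" logic. Volkswagen's actual code has
# never been published; every name and heuristic below is invented. The point
# is only that the trick requires nothing more than a branch on sensor data.

def looks_like_a_dynamometer_test(speed_mph: float,
                                  steering_angle_deg: float,
                                  undriven_wheel_speed_mph: float) -> bool:
    """Guess whether the car is on an emissions test stand.

    On a dynamometer the driven wheels spin while the steering wheel never
    moves and the undriven wheels stand still: a combination that almost
    never occurs on a real road.
    """
    return (speed_mph > 0
            and steering_angle_deg == 0
            and undriven_wheel_speed_mph == 0)

def select_emissions_calibration(speed_mph: float,
                                 steering_angle_deg: float,
                                 undriven_wheel_speed_mph: float) -> str:
    if looks_like_a_dynamometer_test(speed_mph, steering_angle_deg,
                                     undriven_wheel_speed_mph):
        return "test mode"   # exhaust treatment fully engaged; passes the test
    return "road mode"       # better performance and mileage, far higher NOx
```

One branch on a handful of sensor readings is all it takes for a car to show one face to the regulator and another to the road, which is part of what makes the cui bono question here so easy to answer.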

I could give lots more examples, but I’m actually more interested in the kind of lie that specifically does not respond well to the cui bono test: lies that work to no apparent advantage of those who tell them, but that instead bring their tellers into disrepute and so do them nothing but harm. Why would anyone tell them? That’s the question I’d like to write about this week.

I can think of lots of examples.  In his autobiography, Gifted Hands, presidential hopeful Dr. Ben Carson writes as follows:

At the end of my twelfth grade I marched at the head of the Memorial Day parade. I felt so proud, my chest bursting with ribbons and braids of every kind. To make it more wonderful, we had important visitors that day. Two soldiers who had won the Congressional Medal of Honor in Viet Nam were present. More exciting to me, General William Westmoreland (very prominent in the Viet Nam war) attended with an impressive entourage. Afterward, Sgt. Hunt introduced me to General Westmoreland, and I had dinner with him and the Congressional Medal winners. Later I was offered a full scholarship to West Point. I didn’t refuse the scholarship outright, but I let them know that a military career wasn’t where I saw myself going.

It’s a good story, but it’s at best only sort of true. For one thing, General William Westmoreland, who had just completed his command of U.S. troops in Vietnam, was apparently not present in Detroit on Memorial Day in 1969.  For another, there is no record of West Point offering Carson a full scholarship, or any sort of scholarship at all.  It’s true that there is no tuition at the nation’s five military academies, and it really is easy to imagine someone using the term “scholarship” to describe what students “get” at schools with free tuition. (It’s also true that West Point itself has occasionally used the word “scholarship” to describe its free-tuition policy.) But the reference to not refusing the offer outright certainly implies that an actual offer was made…and that’s the part that seems not quite to be so. It is surely possible that young Ben Carson met General Westmoreland somewhere, perhaps on one of the general’s visits to Detroit earlier that year. And it sounds reasonable enough that the general might have touted the value of a West Point education aloud. But the story as told—and as repeated in others of the doctor’s books—seems at best true-ish, not precisely accurate. And Dr. Carson is a very accomplished man: a highly respected neurosurgeon, for many years the Director of Pediatric Neurosurgery at Johns Hopkins Hospital, and a recipient of the Presidential Medal of Freedom, our nation’s highest civilian honor. He hardly needs to tell a fib about an acceptance to West Point to gain the respect of would-be voters…yet tell that story he apparently did. But why? Cui bono? He couldn’t possibly have imagined that a story that seemed so unlikely wouldn’t be checked and rechecked by investigative reporters eager to build their careers on the ashes of someone else’s reputation!

Nor is this a Republican issue per se. Hillary Clinton herself was caught in a lie back in April when she made the bogus claim that all of her grandparents were immigrants to the United States. She can’t not have known that that isn’t true—one of her grandparents, her paternal grandfather, was born in England, but the other three were born in this country, one in Pennsylvania and the other two in Illinois. She can’t not have known that, yet she said it in public, apparently not expecting anyone to notice. She thus joins Senator Rubio (whose oft-repeated story about his parents’ flight from Cuba when Castro came to power is apparently also not precisely true as told) and Dr. Carson as candidates for our nation’s highest office whose fibs do not respond at all well to the cui bono test. Mrs. Clinton, with a life-long record of service to our nation, needs immigrant grandparents to make her appeal to voters? Senator Rubio, who has also devoted his entire career to public service and who surely has the Cuban vote sewn up anyway, needs to fib about his parents’ experience leaving Cuba? Isn’t it enough that they left Cuba to seek freedom and opportunity here?  And, as noted above, Dr. Carson needs to link himself to West Point to earn the respect of Americans? His many strange positions and bizarre theories about the universe will either make or break his campaign…but it’s hard to imagine anyone specifically not voting for him because he wasn’t accepted by West Point.  For the record, only two of our presidents, Ulysses S. Grant and Dwight D. Eisenhower, were actually graduates of West Point. (The sole president of the Confederacy, Jefferson Davis, was also a grad.) So it’s not as though a connection to West Point is crucial for someone who would be president. So why the lie? What’s the point? And where’s the gain?


In a sense, we all write our own biographies. We remember what we remember of the past, fill in the blanks by listening to our parents’ stories, by looking at photographs taken by ourselves and others, by inspecting the various documents that bear witness to our histories…and then we try to piece it all together into a flowing, cogent narrative. What really happened as we grew through childhood into adolescence and then into adulthood, who can say with absolute certainty? And, speaking frankly, which of us truly knows his or her parents…as opposed to the mythic identity they take on in the larger context of family histories and individual relationships?  So in a sense it’s all mythic, what we say we know of ourselves and how we conceptualize our families’ stories. But is it true? That’s a complicated question, one that most of us—thankfully—do not have phalanxes of reporters scrutinizing intensely with an eye towards identifying the slightest inconsistency or deviation from actual reality. I imagine that Mrs. Clinton somehow stepped into a mythic version of her family’s history, then made the huge error of judgment of allowing others in as well.  The same must be true of Senator Rubio and Dr. Carson; surely they did not tell lies without understanding the harm that would (and did) come from getting caught with their pants on fire. These are not naïve people, and Mrs. Clinton perhaps least of all! I suppose I should now self-righteously be doubting their probity and wondering about their fitness for office based on their inability to distinguish reality from fantasy in their own backstories. But I can’t quite bring myself to think of it that way. Without the cui bono test suggesting real benefit to the storyteller, statements about family that do not correspond to actual historical reality are merely pieces of the great inner pageant of identity, forged not in the crucible of verifiability but in the mythic cauldron of self-awareness, seasoned with just enough reality to make the myth believable and appealing…to ourselves and, when we lift the curtain—as we all occasionally do—and let others in, to the great world out there as well.

Thursday, November 5, 2015

Just Religion

You can’t eat food. Or rather you can’t eat “just” food. You can eat a steak or a yam, obviously. But even though both are examples of foods people eat all the time, there is no such thing as “just” food: anything you eat has to be some specific kind of food. The same is true of languages—you can speak Finnish or Yiddish, but you can’t speak “just” language without speaking some specific one of the roughly 6,500 languages spoken in the world today. You can take this idea into all sorts of other realms as well: you can’t “just” sing a song without singing some specific song any more than you can “just” read a book without reading some specific one. It’s not such a complicated idea. But does it apply to religion as well?

Can you adhere to a religion without adhering to a specific one of the world’s religions? The answer feels like it would have to be no, and for the same reason that applies to singing and speaking: because there is no such thing as “just” religion, only the thousands of actual religions to which people in the world today adhere. But if I pose the question slightly differently and ask whether it would be possible to be religious, to be a religious person, without adhering to any specific religion, the answer feels less obvious. And if I leave the words “religion” and “religious” out of the mix entirely and instead ask whether it’s possible to be a spiritual person—a person with a meaningful spiritual dimension to his or her life—without belonging to any specific religious group, the question seems even less easily answered. But the question itself remains worth pondering as asked: if religion is the language of the spirit, can you embrace “just” it without concomitantly embracing any specific religion? Or is the notion that you can be religious without actually embracing any religion just a self-serving fantasy that makes such people feel less guilty about their lack of “real” religious affiliation?

I was moved to ponder this issue just last week while reading a very interesting essay by Los Angeles-based journalist Tamara Audi that appeared in the Wall Street Journal last Tuesday. The essay itself was based on a recently released study by the Pew Research Center that concluded that Roman Catholic Americans, until recently the largest group within the Democratic Party to self-define by religious affiliation, have now been outnumbered among the Democrats by the so-called “nones,” people who, when asked, respond that they have no religious affiliation at all. Indeed, the “nones,” the Pew Center study concluded, now number about 28% of Democrats, compared with only 19% as recently as 2007. (Catholics, who numbered 24% of the party in 2007, are down to 21%.)

On the other side of the aisle, the situation is both similar and dissimilar. The largest religious group within the G.O.P. has traditionally been not Catholics but evangelical Christians, who number about 38% of the party. But the “nones” are growing in Republican ranks as well, up from 10% in 2007 to 14% now.  And all of this mirrors the trend in the general population: in 2007, only 16% of Americans declared themselves to have no religious affiliation at all, but today the figure is 23%.

All of that is interesting enough, but the specific detail that caught my eye was that the Pew Center study lumps together as fellow “nones” both people who self-define as atheists or agnostics and people who profess belief in God but lack any specific religious affiliation. In other words, according to one of America’s leading research institutes, the answer to my question is no: you cannot be a religious person if you don’t adhere to a specific religion. Just believing in God is not enough to pry you loose from the “nones.” To be counted as a religious person, you have to self-define as belonging to a specific religious group, exactly in the same way that you can’t eat “just” food without eating a steak or a yam...or some other edible thing. But is that really true? That is the question I’d like to address in my letter to you all this week.

In the eighteenth century, many of the founding fathers of our nation subscribed to a school of thought called Deism, generally defined as belief in God unencumbered by any ancillary beliefs in divine revelation or prophecy. God, thus demoted to the level of philosophical principle and specifically not acknowledged as the active agent in the governance of the universe, is real…but not in a sense that human beings need do much about: like gravity, God is imagined really to exist, but invisibly and uncommunicatively. The world, for its part, is as it is because it has as the ground of its existence and at its ethical core a Deity who, by virtue of existence alone, grants order and morality to all that is; but the mythology that the world’s religions promote can be dispensed with as so many ancient fables, and the notion that human beings must conform to the whims and wishes of that Deity or face dire consequences can safely be set aside. Deists believe in God as the great Clockmaker, the Creator who has no more of an ongoing relationship with creation than clockmakers have with the clocks they build and sell to others. The clockmakers exist. The clocks exist. But their relationship exists solely within the realm of history, not in the province of day-to-day reality.

There were lots of Deists in our nation’s past, although it is true that not all of them specifically embraced that term to label themselves or their beliefs. Benjamin Franklin, Thomas Paine, George Washington, John Adams, Thomas Jefferson, and James Madison all used terminology in their writings that, despite their formal affiliation with various Christian denominations, makes them sound far more like Deists than like orthodox Christians. It is true that all of the above-mentioned founders had complicated relationships with organized religion, Thomas Jefferson particularly, yet history has comfortably labelled them all as at least strongly influenced by Deism and its basic beliefs. To give a sense of what this feels like in the works of the founders, I will quote from just one book, Thomas Paine’s The Age of Reason, the first part of which was published in 1794, while its author sat imprisoned in the Luxembourg Prison in Paris for having fallen afoul of the leaders of the French Revolution. There, he writes openly about his beliefs:

I believe in one God, and no more; and I hope for happiness beyond this life. I believe in the equality of man; and I believe that religious duties consist in doing justice, loving mercy, and endeavoring to make our fellow-creatures happy…I do not believe in the creed professed by the Jewish church, by the Roman church, by the Greek church, by the Turkish church, by the Protestant church, nor by any church that I know of. My own mind is my own church.

That, in a nutshell, is Deism in its plainest guise.  Is it religion? It’s hard to say. But I think I can say with near certainty that Thomas Paine would have been flabbergasted to find himself lumped together with atheists and agnostics by the Pew Research Center researchers as a “none.”  Certainly George Washington and James Madison, both life-long members of the Episcopal Church, would have rejected the label both as Episcopalians and as Deists.

So what happened to Deism? It still exists. (You can visit the website of the World Union of Deists at www.deism.com.) But it no longer exists as a driving force in the world of religion, as the Pew Center report blithely demonstrates by lumping together in one single category people who profess faith in God but feel no pull to affiliate with any specific religion and atheists who deny the existence of God entirely. Whether that is fair or unfair is a debate worth undertaking. Why exactly Deism declined, on the other hand, isn’t hard to say at all.

Philosophical principles are interesting ideas, but hardly anyone since Socrates has willingly or unwillingly accepted a martyr’s fate because of them. To engage the soul, to transform the spirit, to draw people to a life infused with and informed by faith, to inspire people to embrace morality and turn away from evil…religions need more than ideational substructures of sound philosophical principles. In fact, they need two specific things to flourish in the world of actual people. The first is a set of rituals and rites able to grant physical presence in the world to the ideas that rest beneath them, something like the way the steel girders that support tall buildings exist invisibly yet indispensably deep inside those buildings’ outer walls. The second is a warm, thick fabric woven of myth, history, biography, and sacred legend in which people eager to adopt those principles as their own can wrap themselves and, in so doing, find comfort, confidence, and the inner strength necessary to persevere in a world in which living a life devoted to spiritual ideals is almost always an uphill battle. From the Jewish point of view, this couldn’t be more true. Indeed, it is precisely the way Judaism brings together ritual shell and ideational core in the context of mitzvah, of sacred commandment, that makes it such an engaging lifestyle for so many people seeking to give physical stature to dogma and reality to the core concepts they have embraced as the truths that guide them forward in life. I’m sure other religions have their own ways of making their foundational principles real in the world. But I speak of what I know…and for me personally it is precisely that combination of moving idea embedded deep within repeating ritual that makes of Jewish life a spiritual journey that not only satisfies but actually leads somewhere.


I disagree with the premise of the Pew study mentioned above. I believe that people who are possessed of faith in God are far more like the religiously affiliated than they are like people who have consciously divested themselves of the trappings of belief. We should acknowledge that reality, with or without reintroducing the name Deism into our national vocabulary, and accept that the divide between those who believe and those who do not is far greater and more profound than the one supposed to exist between people who merely believe and those who have gone on formally to affiliate with a religion that has a name.