Thursday, November 29, 2012

Living Forever

Was Plan A for human beings to be immortal and thus never to know death? It’s a good question, even a critical one. Yet the Bible is more than a bit equivocal about the answer.

Lots of people read the biblical text in lots of different ways. The way to read the narrative that seems the most cogent and reasonable to me, however, requires focusing on the fact that Adam and Eve were never specifically forbidden to eat of the fruit of the Tree of Life, but were instead ordered never to eat of the fruit of the Tree of Moral Discernment (the one popularly called the Tree of the Knowledge of Good and Evil). Leaving aside the arresting question of why God would not have wished the first two human beings to ingest as much as there is to know about the difference between right and wrong (supposing that is what “good” and “evil” in the tree’s name are meant to denote), we are left pondering God’s pensive, almost broodingly introspective, lament at the end of the story, the one in which God justifies the decision to expel Adam and Eve from the garden. “Look what it’s all come to,” God says grimly, “now that human beings have transcended their humanity and become almost divine in their capacity for moral discernment. The next thing that will happen is that they will reach out and take also of the fruit of the Tree of Life, then eat it and live forever.” That sounds as though the Tree of Life, the fruit of which presumably has the power to grant immortality to those who eat it (“…then eat it and live forever”), was there all along for Adam and Eve to enjoy. It wasn’t forbidden. It was clearly visible there in the garden. (The narrative elsewhere places the Tree of Life at the very center of the garden, presumably to make explicit the point that no one seeking the tree could possibly fail to find it.) Humanity thus had a chance to transcend its native mortality and presumably eventually would have…had Eve not listened to the serpent, and Adam not to Eve.
But because they did sin, they were denied the right—which they appear previously to have possessed—to overcome the innate limits life places on the living and thus to live on forever.

If that’s the right way to read the story—and I know there are others—then Plan A and Plan B were the same plan with respect to the possibility of dealing death out of the deck entirely: human beings were created mortal and lost the ability to overcome that aspect of how things were by disobeying a direct divine edict that appears to have nothing to do with mortality at all. So the story is one of nothing, not something, happening.

What would have happened had Eve instead gotten Adam to eat of the fruit of the Tree of Life when they still could have? Would they have continually aged, growing older and increasingly decrepit with each passing year, then each century, then each millennium? Or, as most of us naturally fantasize in this regard, would they have attained the fullest measure of adult health and then just stayed there, the key to immortality being the absence of aging as much as (or perhaps even more than) the absence of death? Since their descendants would presumably also have been immortal, the problem usually featured in fantasy and science fiction novels to explain why no one would ever really want to live forever—the concomitant obligation to watch everybody you know and love eventually pass away while the immortal one lives on, doomed to existence in a never-ending loop of love and loss—would not be such an issue. Nor would the obvious problem of convincing the government to issue a passport or a driver’s license to someone several thousand years older than the next oldest person waiting on line.

When I was in college, I remember reading the now-mostly-forgotten novel of Eugène Sue, The Wandering Jew, and being sufficiently entranced by the idea to want to read more, only eventually to find the theme almost totally co-opted by anti-Semitic authors in whose works the concept of living forever—particularly when the immortal in question was Jewish—was depicted as punishment rather than reward, and as a fate in many ways worse than death. (In that regard, I especially recall being shocked by my first exposure to My First Two Thousand Years, a novel by George Sylvester Viereck, the once well-known Nazi sympathizer who, among other things, invented the genre of gay vampire fiction.) There were other books as well, mostly now forgotten, that tempered my enthusiasm for the theme…and yet the concept itself of transcending death (and, ideally, disease and decrepitude as well) remained and remains a major fantasy.

And so it was with all that in mind that I encountered the almost unbelievable article in the paper the other day (accessible to readers seeing this electronically by clicking here) that reported on the discovery of an immortal being. Neither an eternally wandering Jew nor a half-mechanical cyborg, the being in question is, of all things, a lowly jellyfish, of the kind known technically as Turritopsis dohrnii. I’ve always hated jellyfish. I had an unfortunate encounter with one at the beach—or rather in the ocean off Rockaway Beach—when I was a child and haven’t really revised my sense of jellyfish as disgusting squishy things that contribute nothing to the world. (I’ve since learned that there are people who eat jellyfish, and particularly the kind called the cannonball jellyfish. Interested readers—and how could anyone really not be?—can find a long, fascinating account of one man’s effort to eat his way through the edible jellyfish of the world on a very interesting site well worth the visit for dozens of other great essays as well.) But maybe I need to revise my thinking.

Jellyfish are not much like people. They have no brains. (I heard that. But they really, physically, have no brains.) They have no hearts. (Ditto.) They have one single orifice that serves them as the counterpart, to put the matter as little disgustingly as possible, of both ends of our alimentary canals. Feh! But the biggest, fattest way they are unlike humans is that they appear to have the Benjamin Button-like ability, when they are old and weary, not to die, but instead to initiate a complicated regenerative path that regresses them back through the stages of growth they experienced earlier until they return to the original polyp state from which they developed in the first place. The polyp then begins the generative process from scratch, slowly developing back into a mature jellyfish. This is not quite like lobsters, which have the amazing quality of not growing weaker or less fertile as they age and which, because they display what scientists call “negligible senescence,” could theoretically live indefinitely until captured or killed. Nor are jellyfish in this regard exactly like planarian flatworms, which have the remarkable—the truly remarkable—ability to regenerate lost body parts and, when split lengthwise or widthwise, simply to morph into two separate worms, each free to pursue its own destiny…and also to turn itself into two more discrete worms. That is surely a kind of immortality, but it is not at all the jellyfish’s. Indeed, as far as I can tell, there has emerged a scholarly consensus that the ability scientists attribute to the Turritopsis dohrnii is unique in the animal kingdom. (Benjamin Button was a fictional character invented by F. Scott Fitzgerald, so he doesn’t count.)

So it turns out that there is such a thing as immortality. When we fantasize about living forever, most of us think more along the lobster model of simply not aging, not weakening, not surrendering to senescence and its attendant indignities. Very few of us fantasize along the flatworm model. But what of the jellyfish model of reaching the end of the line and then turning back? It’s a bit like a New York City subway train that has no room to turn around in its narrow underground tunnel and so simply heads off backwards along its route to the beginning of the line, the first car becoming the last and the conductor merely moving to the other end as the train moves forward in that direction. Eventually it must become impossible even for the subway personnel to say with certainty which is the front of the train. Or perhaps the point is more exactly that both ends are its front, that it has no specific front, that the lead-off car in whatever direction the train is headed is its front. For the moment. Until it reaches the end of the line again. And it turns around again. And heads off again in the opposite direction. Just like a jellyfish!

I didn’t much like Benjamin Button, although I thought Brad Pitt and Cate Blanchett did admirably well with what struck me then as a ridiculous storyline. Perhaps the key to finding the concept appealing lies in a detail I passed by just a bit too quickly above: that Turritopsis dohrnii jellyfish, in addition to having no natural limit to their lives, also have no brains. They therefore, I’m supposing, have no memories, thus also no frameworks for evaluating the progress and regress of their lives through the years: unlike Brad Pitt’s character in the movie, they do not recognize high school when they get there again after having gotten to the end of the line and turned back to head off to the beginning. And in that perhaps lies the answer to the question: life as we know it—made rich with memories of things past, the texture of evolving relationships, the pleasures and anxieties of family life as it brings us forward from being the children of our parents to being the parents of our children, the challenges of overcoming the obstacles that life throws down in each of our paths, even the worries that accompany our march through the decades towards an indistinct, uncharted future—that kind of life can only be savored in one direction. We are neither subway trains nor jellyfish, neither lobsters nor planarian flatworms. Nor are we characters in a play hired by an unseen playwright to depict ourselves in some cosmic drama none of us can step far enough back from fully to fathom. We are…just ourselves, anchored in time and challenged to make meaningful the years of our lives not despite their finite nature but precisely because of it.

Thursday, November 22, 2012

Thanksgiving 2012

As Thanksgiving dawns, I find myself very pleased with the news from Israel. All of you surely know that a ceasefire has been in effect between Israel and Hamas since last evening. And, as I write, it appears to be holding. Of course, a ceasefire is just that—a cessation of hostilities—not a guarantee of permanent, lasting peace. I suppose it is possible that ceasefires can eventually morph into permanent states of non-violent co-existence even between enemies unwilling for political or emotional reasons to sign a formal treaty pledging their commitment to mutual non-aggression, something along the lines of what has been in effect between the two Koreas since even before I was born. Obviously, that would be preferable to war! But the great goal that lies off in the distance—the one away from which none should dare turn—is not to settle for a mere cessation of hostilities in Gaza, but to work ardently and purposefully towards the establishment of a “real” peace between Israel and the Palestinians, both those in Gaza and those on the West Bank.

I know it sounds almost unimaginable that this could ever happen after so many years of enmity and so much bloodshed. But Korea, in a sense, is the anomaly. If Germany and France—after the Seven Years’ War, after the Napoleonic Wars, after the Franco-Prussian War, after the two World Wars (and I mention only the wars of the last three hundred years)—if after that much endless barbarity and bloodshed, Germany and France can live in peace, then it seems ridiculous to assume that Israel and the Palestinians could not also move past their state of mutual hostility and live as neighbors and even, as is surely the case with France and Germany today, as allies. The same could be said about present-day Germany and Poland. Or about Russia and Finland. Or about the United States and Great Britain. Or about Denmark and Sweden. (For the record, Denmark and Sweden went to war on eleven separate occasions between 1521 and 1814. But who can imagine closer allies today?) Or, for that matter, about the United States and Vietnam, a nation to which we granted “most-favored-nation” trading status less than thirty years after the fall of Saigon. U.S. losses in Vietnam were horrific, exceeding fifty-eight thousand dead. But countless millions died in the conflicts mentioned just above. And yet in all these cases, war was followed not by a ceasefire (except in Korea) but by some version of peace. (Can you imagine any scenario at all, no matter how fantastic, that could lead to a war between the U.S. and Vietnam today?) That the day will one day come when we will say the same thing regarding the Middle East is my Thanksgiving prayer for the world, one in which I invite you all to join me.

A point of special interest for me in the events that led to the ceasefire was the specific role played by Mohamed Morsi, the president of Egypt. A member of the Muslim Brotherhood, President Morsi last week encountered his own “threshold” moment, a first opportunity to test his mettle on the international stage not as a member of the Muslim Brotherhood or as the president of Egypt per se, but as a statesman, as a diplomat, and as a peacemaker. Did he pass the test? It appears that he did. Whether this turns out to be a truly defining moment in the after-history of the Arab Spring, on the other hand, remains to be seen. In other words, whether the hope we all harbor—that the reform movements that swept away Ben Ali, Mubarak, Saleh, and Gaddafi will turn out not merely to have led to the installation of extreme-Islamicist regimes in Tunisia, Egypt, Yemen, and Libya that are no more respectful of those nations’ citizens’ human rights than the dictatorships they replaced—will be realized is itself an open question. And yet, I feel at least tentatively hopeful at how this has played itself out so far.

President Morsi appears to have understood that this was his moment, that once the Turks fully alienated the Israelis with ridiculous show trials and endlessly vituperative language, the cup passed to his lips…to see if he could act as a leader of the Arab world, as a reliable friend of the United States, and as a man of peace granted a unique opportunity to end an armed conflict in his neighborhood that could easily have turned into a full-scale war. I am not enough of a Pollyanna to imagine that President Morsi is planning any time soon to abandon his affiliation with the Muslim Brotherhood, an organization implacably hostile to Israel’s very existence. But I feel buoyed by the knowledge that when the opportunity came to make peace or to wage war, President Morsi—who, the press reports, had six separate in-depth phone calls with President Obama in the course of the final few days before the ceasefire was declared—saw himself as someone uniquely positioned to bring an end to the conflict, or at least to broker a temporary cessation of hostilities. Will he move forward from here to an attempt to broker a real peace between Israel and the Palestinians? Certainly, he can’t do any worse than anyone else who has tried! And maybe this really will turn out to be Mohamed Morsi’s personal “only-Nixon-could-have-gone-to-China” moment.

Am I dreaming? Maybe a little bit I am! But I find myself unexpectedly filled with hope this Thanksgiving morning…and I invite you to join me in my dream, even if just for the duration of the day. And who knows? It’s true that you eventually wake up from even the most pleasant dreams! But it is also true that there are dreams that somehow manage to transcend their original setting deep within the brain and to become part of extra-cranial reality! My personal hippocampus spins out the same kind of bizarre, often inexplicable, nighttime fantasies all of you know from your own private dreamscapes. But it is on the campus of the world that I would like to focus my personal Thanksgiving prayer. Peace has returned to Gaza and to Israel. Those in a position to do good from the outside—principal among them Presidents Obama and Morsi, and Secretary of State Clinton—behaved forcefully and admirably. I believe that Prime Minister Netanyahu had no choice but to respond forcefully to protect the citizens of Israel against aggression that no nation at all would tolerate. And yet he too found the strength to desist when the other side signaled its willingness to stop its attacks against Israeli civilians. Even the leadership of Hamas—so grotesquely willing to valorize suicide as a legitimate means of accomplishing with violence what seems unattainable through negotiation—found it in themselves to agree at least to a temporary peace. So far, I guess, so good. But now the hard part starts, the part that’s going to require political leaders to take unprecedented chances and national populations to agree to compromise on what they have heretofore been told were things regarding which no concessions would or should ever be possible. It’s all a dream…and it’s also my Thanksgiving prayer for you all, and for the world.

Thursday, November 15, 2012

Protecting the Calumniated

When it comes to traveling the highway of moral integrity, the gray areas are always the most interesting to negotiate. When things, after all, are unambiguously right or wrong, it may take a bit of moral muscle to do the right thing (or to resolve not to do the wrong one), but there’s not much fodder for internal debate. What’s to discuss? If some specific path forward in life is unambiguously wrong or inarguably right, then what would the discussion even be about? But when a sober, thoughtful person—or when society as a whole—looks at a path leading off in some specific direction and can’t quite say with certainty whether taking it would be right or wrong, that’s when the most interesting ethical discussions really take place.

I was thinking about this part of moral reality as I contemplated the bizarre—and mercifully brief—sideshow regarding poor Elmo that unfolded in the course of this last week. It wasn’t really about Elmo, of course. Elmo is a fictitious character on Sesame Street who is only as real as the puppet that depicts him. But the puppeteer who gives him his voice really does exist and his name is Kevin Clash. Clash has been the voice of Elmo for almost three decades, a long time by any standard but an eternity in terms of performers working on a single television show. How many shows have even been on the air for that long? (Sesame Street, currently in its forty-third season, is the longest-running children’s show ever. Of other children’s shows, only Romper Room ran for more than forty years. But that show went off the air almost twenty years ago.) Clash has been enormously successful in his role, having won twenty-three daytime Emmys for his depiction of Elmo, a furry, red-haired monster with a high, squeaky voice. There were three other actors who played Elmo before Clash came to the role in 1985. But this week’s story had nothing to do with his talent as an actor or puppeteer. Nor did it have anything to do with that insane person who paraded around Central Park last summer decked out in a full-sized Elmo suit spouting anti-Semitic and racist drivel at any who had the misfortune to walk past him and linger long enough to listen.

The short version is that a man came forward anonymously and, speaking through a law firm based in Harrisburg, Pennsylvania, accused Clash of having had sexual relations with him when he was a minor. (In New York, having sexual relations with a person under seventeen years of age constitutes a felony if the other person is over twenty-one. In Pennsylvania, it is a felony for someone over the age of eighteen to have sex with a person under eighteen. If Clash’s accuser specified where the alleged incident took place, that detail was not made public.) On Monday, Sesame Workshop, the producers of Sesame Street, announced that, in the wake of the accusation, Clash was going to take a leave of absence from his job. This was widely reported in the press, the accusation against Clash printed universally by people who had no actual idea whether or not he was guilty. Two days later, on Wednesday, the accuser changed his mind and announced that, after all, he had been an adult when he and Clash had sex. Sesame Workshop quickly put the matter behind them, releasing a statement about how pleased they were that all concerned could move on from what they called, to say the very least, “this unfortunate episode.” Clash himself released a similar statement about how pleased he was that “this painful allegation has been laid to rest.” End of story.

Or is it? Because Elmo is so popular and so widely known, the story was reported all over the world. Clash, who may not have wished for details of his private life to be publicly known to everyone in the universe who reads a newspaper, now joins all sorts of others accused of sex crimes but never actually convicted of them. Does it matter? It would to me! Yet the outcome seems peculiar: the man who did nothing now gets to be known across the world—Sesame Street, or some local version of the show, is seen in 140 different countries around the globe—as someone who was once accused of molesting an underage teen, while the man who falsely accused someone of a serious crime (for as-yet-undisclosed reasons, but surely not because he was genuinely wrong about how old he was when he and Clash knew each other) gets his identity shielded by the press, so that even after recanting the story the young man’s name has still not been published. Does that seem right?

I understand all the arguments for the almost universal practice among journalists of protecting the identity of people who accuse others of sexual wrongdoing. Surely, they argue (not unreasonably), it will inhibit people from coming forward if they know that the press will publish their names and describe, often in detail, the incidents they are coming forward to report. And it is equally obvious to me that people who genuinely were the victims of sex crimes should have the right to proceed with their lives without having their names forever linked to crimes they themselves did not perpetrate. Why should a victim pay any price at all for someone else’s wrongdoing? All that makes sense to me. But what of the falsely accused? Does the press do society a favor or a disservice by publishing the names of people who have merely been accused of doing bad things, but who in many cases (like Kevin Clash’s) have yet even to be arrested, let alone tried and found guilty in a court of law?

The Torah unambiguously forbids talebearing and gossiping. The point, sometimes missed by moderns, is that the prohibition does not apply only to lies, but also to the truth. Indeed, it seems especially important to stress the degree to which the Torah prohibits telling stories that put another in an unfavorable light even if the information is fully correct. (There is a slight difference of terminology regarding true and false gossip, but the basic principle is the same: you may neither lie nor tell the truth about another if the statement in question is going to cast the subject of your remark in a bad light.) And yet the same Torah that forbids telling tales about others not only permits but requires public trials, trials in which accusers are invited to come forward and to speak openly and freely about those whom they feel have wronged them. Furthermore, it is considered a mitzvah to come forward with relevant testimony, even though, technically speaking, the person of whom the witness has come to speak ill has yet to be convicted of anything at all. Also, trials are always to be conducted in the light of day. Indeed, certain specific kinds of cases are to be tried not only in public, but in the city gates—the most public of all places—presumably to dissuade people from risking sinful behavior that will ultimately and publicly ruin their reputations. None of these laws seem overly concerned with the subsequent reputations of the accused and then exonerated.

The Torah is an ancient book and reflects the customs and mores of the ancient world. Our job, therefore, is to allow its legislation to mature in the context of our own moral vision, and this, I think, is an instance of the world having changed so drastically as to require a bit of revised thinking. I believe that journalists have made the right decision to protect the identities of victims even in the absence of conclusive evidence. But in a world in which internet journalism has made it possible for a false accusation against a well-known celebrity to appear almost simultaneously on the screens of millions upon millions of viewers in every country of the world, it seems wrong to rely on ancient precedent when the possibility exists that someone accused of a heinous crime could be totally innocent of the charge. That’s why we have trials, after all: to determine the guilt of the accused. And that is precisely what it means for accused persons to benefit from the presumption of innocence, that we do not assume that someone is guilty merely because he or she is accused of a crime.

Yet, if the accusation turns out to be untrue and the accused becomes the victim and the accuser, the perpetrator, then the damage has already irrevocably been done because the cloak of anonymity cannot retroactively be offered to a falsely accused person. And so the Kevin Clashes of the world are stuck bearing the opprobrium that comes from being identified in the press after having been accused of wrongdoing regardless of the specific outcome in any particular case. What would be so bad about journalists protecting the identity of someone accused of a terrible crime, and particularly a sex crime, until the police determine that that person should be arrested and the members of a grand jury agree to return an indictment and a trial is conducted and the accused is found guilty beyond a reasonable doubt? Yes, it’s true that freedom of the press is one of the cornerstones of democracy. But does our American democracy really rest on my right to know that some unnamed person dislikes Kevin Clash enough to tell what was apparently a horrific lie about him? If it isn’t true, why do I need to know about it at all? If it’s just a false accusation, how could society not be better off if journalists protected its members from suffering the after-effects of public assaults on their good names rather than ruining people’s reputations based on hearsay?

Kevin Clash himself seems ready to move on and to forget the whole thing. But society should use this incident as an opportunity to revisit the way the press deals with mere allegations that have not even resulted in an arrest…and wonder how the system could be made to take into account the possibility of the accused turning into the victim…not of sexual misconduct, but of calumny. 

Friday, November 9, 2012

Things Could Be Worse!

It could always be worse! How often have you heard someone say those words to you? I myself have had them said to me over the years so often, and by so many well-meaning people, that it is still vaguely surprising to me how little comfort they bring. Has anyone ever found solace in the thought that something could conceivably be added to whatever is already making them miserable that would somehow make them even more upset or more unhappy? You’re upset about little ol’ this, the words prompt us to think? This little thing? You have malaria? You could have malaria and gum disease! Don’t you feel better now? No? What’s wrong with you? Don’t you know enough to feel grateful that you only have malaria? Honestly, things could be so much worse!

I recall taking courses in pastoral counseling in rabbinic school when I was a student at JTS all those many years ago. The courses varied dramatically in quality, but one of the ones I liked the most, and from which I learned the most, included a very helpful section regarding things our teacher counseled us never to say to anyone. Ever. Some of them were no-brainers that no normal person ever actually would say to anyone. Others, as far as I can recall, were not really such terrible things to say at all. I’ve even heard myself saying some of them from time to time over the years, but never without at least a twinge of guilt. But some were things that people really do say and that our teacher was counseling us to avoid. And chief among them was the attempt to induce comfort by noting that, no matter how bad things are, they could actually be far worse.

My father suffered from diabetes and he was obliged to have a serious part of his left leg amputated when he was eighty years old. The operation itself went smoothly. But recovering emotionally took a lot longer than his physical recovery and, wholly uncharacteristically, my father agreed to be visited by one of the hospital chaplains. (I lived in Vancouver in those days and was coming to New York, as my dad had specifically asked me to, only after he went home and was really going to need my help getting around.) And that young rabbi—whom my father told me looked like he was maybe fourteen, whose wispy beard looked pasted on, and whose yarmulke, Dad said, was almost bigger than his head—actually said The Forbidden Words to my father, attempting to make him feel better about his sorry lot because, hey, he could have lost both legs! He should feel lucky because, honestly, it could have been So Much Worse! The young rabbi, whom (luckily for him) I never encountered personally, knew at least not to gild the lily by pointing out that my father could also have had malaria. Or that he could have also lost an arm. Or both arms. Or his teeth. Or his arms and his teeth. I was sorry I wasn’t there to explain to the chaplain that no one ever feels better by having it pointed out that things could be worse, that a comment along those lines only really serves to make a patient feel guilty for being ungrateful enough with his or her lot to be upset in the first place. And that that thought—that they are somehow behaving like ingrates by being unhappy with their negligible misery—invariably makes people feel worse, not better.

These thoughts all came back to me in the course of the last few weeks in New York. As I write to you, it is snowing outside as the nor’easter promised for this evening provides an advance covering of ice, snow, and slush before the really bad weather begins later on. In the course of Wednesday, I heard not once or twice but three separate times (once in person and twice on the radio) people attempting to make us all feel better by pointing out how much worse things could be! You’re upset about a little ol’ nor’easter? It could be another hurricane! This is nothing—or at least it’s promised to end up being nothing at all—like Sandy. So being upset because it’s cold out, because you still have no heat or light because of the last storm, because you are among the 150,000 Long Islanders still without power or among the 30,000 whose power was knocked out by the nor’easter (which, by the way, are you as thrilled as I am that we don’t also have to give nor’easters adorable names, a practice regarding hurricanes that I find increasingly cloying with each successive storm?)—none of it is worth getting all unhappy over because, as we all know from last week, it could be so much worse. You could have no power and you could have malaria. Or diabetes. Or malaria and diabetes. If anything, you should feel relieved that things aren’t worse. What if you also had no teeth? I suppose all of the above is true. But who really feels comforted by any of those thoughts? In my experience, no one at all!

And that brings me to the presidential election. Yes, it could have been worse. We could be facing a Bush/Gore scenario in which the election remained undecided during weeks upon weeks of legal wrangling. We could be facing the unpalatable situation in which one candidate wins the popular vote, thus having the support of a plurality of voters, but the other candidate wins in the Electoral College, thus winning the election in the only way that actually matters. (As you may know, this has actually happened four times in our nation’s history: in 1824, 1876, 1888, and 2000.) But the results were still depressing. A Senate divided against itself. A House of Representatives equally riven along party lines. A country as clearly as ever divided along lines of race, class, and political philosophy. (That the other great dividers of American society—religion and gender—don’t seem to have been such a huge factor in the election, particularly with a Mormon running for president for the first time, is a reminder of how much more depressing the results could have been.) Two blue coasts and a huge red heartland that are supposed to think of themselves as parts of the “one nation under God, indivisible” to which we all pledged allegiance as elementary school children, but which hasn’t actually felt anything like that in a very long time.

The president was victorious in his quest for re-election. But, just as I invariably counsel all my brides and grooms that, for all that weddings are fun and exciting, the real task at hand is not to fret about the party but to work hard to create the kind of viable, happy marriage that will last a lifetime, the task facing the president is not to bask in the afterglow of his victory but to attempt to find a way to restore the unity that was the hallmark of our country in its finest hours and that could, I believe, characterize American society again…if the people at the helm have the breadth of vision and the generosity of spirit necessary to effect that kind of sea change in our culture and our society. So, yes, it could have been way worse. No one could have clearly won, as in 2000. But the tasks facing us now that the electoral frenzy has finally died down a bit—the 2016 race only really begins after the inauguration in January—are nonetheless incredibly daunting. So it feels like cheap comfort to note that things really could be worse. Yes, surely they could be. But the real job at hand is to grapple with things as they actually exist and not to seek comfort in make-believe fantasies that would be even more daunting than the situation that actually exists. This one is plenty daunting enough.

I suppose all my readers know the difference between a Jewish optimist and a Jewish pessimist: the pessimist says, “Oy, things couldn’t get any worse,” to which the optimist responds, “Of course they can!” For some reason, that joke always makes me laugh. But the challenge facing our country doesn’t make me laugh at all. Nor do the myriad challenges facing Long Island as we attempt to negotiate the aftermath of an early snow without having been anywhere near fully recovered from our previous weather disaster. Yes, it could be worse. We could have had an earthquake too! But the real task at hand is neither to fantasize about worse weather nor about worse political outcomes, but to take sober stock of how things are, to find the inner resolve to live through the storm, to shovel the snow, to make sure no one in our midst is cold or hungry or living in the dark…and to do what it takes as a nation to regain our sense of common purpose and shared destiny and to overcome the divisive forces that seem no less likely to hold us back in the future just because the election is now behind us.

I wish the president well. We all should and must. Governor Romney’s concession speech was eloquent in its simplicity and suffused with what I took to be heartfelt generosity towards the president.  As a nation, we need to be no less gracious…and to do together what it takes to move past the Us vs. Them mentality that has done our nation no good at all for a very long time now, and to work together truly to make of this land the one nation indivisible under God that it was and surely can again become. And cheer up—we could be facing all this counterproductive divisiveness and partisan rancor and we could all have gum disease! Wouldn’t that be worse?

Thursday, November 1, 2012

After the Storm

Like all of you, I’m sure, I responded to the events of this last week with a complicated parade of conflicting emotions: irritation (at first) that they were hyping this thing (I thought) way out of proportion, followed by irritation that they (whoever they were) hadn’t been explicit enough with respect to the storm’s potential to devastate, followed by a deep sense of foreboding as the winds picked up and the sky darkened (and Sandy suddenly seemed possibly like something they—a different “they” this time—hadn’t just made up to sell newspapers), followed by a certain smugness that our power hadn’t gone out, followed by a distinct sense of unease (sharpened by the sudden absence of smugness) when the lights did go out with a huge bang as our generator—or transformer or whatever that big metal box on the pole next to our house is—exploded and partially fell to the ground, followed by a long, dark, ominously quiet night. Followed by another. And then another after that. As I write these words, we haven’t had power for three days. We have been given no information at all about when the power might return. It feels as though there is no place to turn, no office to inquire at, no specific person to remonstrate with. And it’s getting distinctly colder out there.

None of my readers, I’m sure, needs me to deliver any baseless predictions regarding the restoration of power at Shelter Rock or in any of our neighborhoods. (I am neither a prophet nor the son of a prophet…and I don’t think I could find my Ouija board in our pitch-black basement anyway.) I’ve been spending my days trying to do some version of a day’s work—racing to my temporary office in the Hillcrest Jewish Center on Union Turnpike in Queens (where I have been warmly and hospitably hosted by my colleague and very good friend, Rabbi Manes Kogen, and where they have power and wi-fi access to the internet), trying to keep up with hospital visits, working with our lay leadership to figure out how we’re going to negotiate things this weekend, attempting to run a makeshift evening minyan in what was briefly the only home in the community that had power. After prayers, it’s been like hurtling back through time to great-grandpa’s day: Joan and I go home, make a fire in the fireplace, eat our soup (we have one burner that’s still working for some reason), then read by the hearth until the fire goes out, then go to bed. It was marginally romantic, briefly. (I am, however, reading a terrific book, which I hope to write to you about one of these days, Cutting for Stone by Abraham Verghese. So the evenings of reading I’ve actually liked a lot.) By now, however, we’ve had more than enough.

There have been some consolations. The spirit in the community—and especially the experience of seeing individuals going out of their way, some of them dramatically so, to find ways to help each other—has been very moving to watch. Living through whole days without hearing a single word from the television has been almost unexpectedly soothing. Despite my carping, it turns out that snuggling up under a blanket and reading in a cold room in front of a roaring fire actually is one of life’s great pleasures. (I think I actually knew that once, possibly in a former life. But who remembered?) Joan made the best soup too the other night on our (one) burner out of red lentils, beef, and kasha. So it really has been like traveling from the twenty-first back to the eighteenth century once the sun goes down each night, even if I have been ruining the tableau in the inglenook by reading mostly on my Kindle. What can you do? How often does any of us eat home-made soup in front of a fire?

A million years ago, a poet in old Jerusalem sat through a devastating winter storm. Not quite a hurricane, perhaps, let alone a post-tropical cyclone (whatever that is exactly), but instead one of those rare but dangerous winter storms that even today have the capacity to hold Jerusalem in their paralyzing grip for a few days before things slowly return to normal. But instead of seeing a disaster, the poet saw an opportunity. God, the poet wrote, has neither time nor patience for arrogance, but instead prefers endlessly to encourage humility. And it is in that context—explicitly stated early on in the poem that we know as the 147th psalm—that the poet turns to his fellow Jerusalemites and calls upon them to be grateful to a God who makes humble the faithful by covering “the earth with snow as though it were a blanket of wool,” by scattering “frost on the ground as though it were ash,” by hurling “chunks of ice to earth as though they were mere breadcrumbs.” And then, when the poet rhetorically asks who could possibly withstand the cold God sends to earth, his answer is obvious and his point already clear: possessed of faith, we all can. And as it was for him in ancient Jerusalem, so is it too for us: we can all withstand the cold for a bit…if we use the opportunity to feel prompted by the experience to remember how things actually are in our world of hyper-modern machinery and sundry gadgets. For, the poet knows, God does not only send cold and ice to the world, but healing warmth as well.
“It is, after all, also God,” the poet writes thoughtfully, “Who sends a word to melt all the ice and Who sends a warm wind that makes the ice into flowing water,” just as surely as “it is God Who offers divine words to Jacob and divine statutes and laws to Israel.” In other words, storms like the one he was experiencing were not just disasters in his mind, but also opportunities…to feel confidence in God’s mercy, to feel called to acts of kindness and generosity, to remember the truth about life’s simplest pleasures, and—most of all—to cultivate a sense of deep humility, one of the cardinal virtues of the virtuous life yet the hardest, he thought and I think too, to cultivate on our own.

We live in an age of unimaginable miracles. We have machines that do in seconds what scholars once spent years trying to accomplish…if they had the stamina and the income to persevere in their studies long enough to bring their research to a satisfactory conclusion. I push a few keys on my laptop and I’m suddenly talking to—and seeing—my friend in Jerusalem (or in Paris or in London or in Vancouver) as though he were sitting on the other side of my work table. Doctors routinely undertake procedures that even in our own lifetimes would have sounded like science fiction fantasies, and yet which have become so totally routine that people barely remember that they were once mere possibilities, let alone unattainable fantasies. And yet…one good storm—one that comes from nowhere and goes off to nowhere after passing blithely over us while paying as much attention to the devastation wrought by its presence as most of us would give to an anthill accidentally stepped on while gardening in the backyard—one good storm is all it takes for all that progress to stop in its tracks. No cell phones. No internet. No Skype. No Facebook. No nothing…just the world as it once was. Sunlight in the daytime and darkness at night. Communication by talking rather than texting. Heat from fire rather than from electric space heaters. Entertainment not by looking at something, but by doing something: by playing the piano or reading a book, by walking around the neighborhood or knitting a scarf or making a pot of soup or darning a sock. Okay, maybe not the darning a sock part—I’m not even sure I know exactly what “darning” is—but the rest of it for sure!

Mozart somehow managed to live without electricity. So did Shakespeare. And Goethe. And Caravaggio. Not to mention Isaiah, King David, and Maimonides. So perhaps we should respond to the events of this last week not with irritation (or not solely with irritation), but with a bit of humility. Clearly, we’re not as invincible as we thought we were a week or two ago! But the world’s greatest cultural accomplishments were undertaken and accomplished in a world strangely like our post-Sandy one, or at least like our post-Sandy world will remain until they turn the power back on and Long Island enters the post-post-Sandy period. So maybe we would do better not to spend our cold days and bleak nights wondering whom to blame for this mess, let alone whom eventually to sue for damages, but to try instead to imagine the good we might be capable of doing now that circumstance has created the context for us selflessly to put others’ needs before our own, for us humbly to acknowledge our true relationship to the universe once we find ourselves forcibly divested of our supremely powerful (and surely only temporarily disabled) machinery, for us productively to wonder what we might create not via game boxes and virtual reality simulators but actually by writing a poem (i.e., on a piece of paper with a pencil) or by learning to sing a song or by making a pot of really, really good soup. From scratch. In the cold. Wearing gloves and a hat. And, if we can manage it, not much caring about the circumstance that led us to this instance of obligatory creativity and reveling instead in the creative process itself. If that experience itself makes us humble, then so much the better. And if the process also leads us to a bowl of truly good soup, then better still!