As a rabbi, I have never been formally asked whether I recommend that people lie to each other instead of telling the truth. What kind of a question would that be anyway? And yet I am regularly asked to be party to the dissemination of untruths, which are invariably presented to me not as acts of betrayal or conscious disloyalty to the deceived party, but rather as acts of kindness. I am, for example, getting ready to visit someone in the hospital when a relative of the patient, usually an adult child, calls me to let me know just how grim the diagnosis actually is, but also to ask me not to say anything to the patient about how things really are. I understand the concept. No one wants to deprive someone (and especially a loved one) of hope in the future. No one wants to ruin someone’s last weeks or months by making it clear just how unlikely recovery actually is. And no one wants inadvertently to discourage someone who is very ill from following his or her doctor’s suggestions by implying that there’s no point in making any further effort. And yet…isn’t lying wrong? Doesn’t the Torah specifically command us to distance ourselves from falsehood? It does say that, and in those exact words (at Exodus 23:7)…and yet all sorts of people feel entirely justified in telling falsehoods that appear to do no harm, or actually to be beneficial, to the lied-to party. I personally don’t want anyone ever to lie to me about anything, including specifically about my health. But I know that different people feel differently…so I don’t break the confidence entrusted to me. But I never feel quite right saying nothing when I know that that is specifically not how I myself would wish to be treated.
Our tradition justifies the occasional white lie. The Talmud relates a charming debate between the followers of Shammai and the followers of Hillel about just such an issue: in this specific case, the reasonableness of praising a bride on her wedding day for her gracefulness and her beauty regardless of how things really may be. The debate concludes that avoiding the humiliation of others is as sacred a commandment as the one forbidding the telling of lies. Another passage that has stayed with me all these years, from a different tractate of the Talmud, records a lesson that in ancient times was attributed to the great sage Samuel. In his opinion, the Talmud teaches, it is permitted—supposing that one is not speaking in court under oath—to tell lies regarding three areas of life, which he designated “book, bed, and hospitality.” All three will be resonant with modern readers. If someone asks intrusive, impertinent questions about one’s intimate life at home with one’s spouse, one has no obligation to answer with the truth merely because the question was posed out loud. And the other two are also easy to grasp. The “book” rubric means that if someone asks a scholar if he has mastered a certain tractate of Talmud, the scholar may answer vaguely or even misleadingly if it seems that speaking the truth would be immodest or boastful. And the “hospitality” rubric will also seem reasonable to modern readers: if someone starts quizzing you about the level of gracious hospitality you enjoyed in someone else’s home and you realize that an honest answer is going to be taken as a clarion call for every schnorrer in town to descend on your hosts for a free meal, you are free there too to speak vaguely or even untruthfully about your experience so as not to put your hosts in an uncomfortable or untenable position.
I was reminded of these texts—and also of Rabbi Kassel Abelson’s masterful analysis of them and others like them in his chapter on personal integrity in The Observant Life—when I was reading an article in the newspaper last week that seemed at first highly relevant to these issues. (Now I’m less sure. If you are reading the electronic version of this letter, you can see the article, written by Gina Kolata and published in the New York Times last Saturday, by clicking here.) The issue itself is very interesting and has to do with the kind of genetic research that involves the use of tissue samples taken from living donors, people like you or me who wish to further the work of science and who are prepared to part with some of our cells so that researchers can work on them. It doesn’t seem like a major sacrifice. We do have, after all, something like seventy trillion cells in our bodies. They are not all the same, however—there are more than two hundred “kinds” of cells—and so researchers often need specific types of tissue to carry out their research. We all benefit from scientific discoveries regarding the way the human body works. So why not donate a few cells to help along precisely the kind of research that has in the past led to enormous scientific breakthroughs? A trillion is, after all, a million million. And we have something like seventy trillion of them.
That much seems easy enough to justify. People part with some tissue. They are promised that no one will be in touch, that the donation is the beginning and end of the procedure, that they won’t be pestered with requests for further involvement or with disturbing information gleaned from the research they are supporting. But then, as research techniques have become increasingly sophisticated, scientists seem regularly to learn things about the donors involved that it seems entirely reasonable those donors would very much want to know. Unless they don’t. Some of the examples in the article seemed to me to be no-brainers. A woman with a family history of ovarian and breast cancer volunteers some tissue for a study. The doctors in charge know that the donor is so worried about breast cancer and so certain, given her family history, that she will eventually become ill, that she is planning to undergo preventative mastectomies. Surely, when researchers learned that she was not, in fact, carrying her family’s breast cancer gene, they had an obligation to breach the confidentiality clause—the clause that promised her she would not be contacted in the future and burdened with unsolicited, possibly unwanted information—to give her what was clearly going to be very good news.
But other cases are less clear. The article highlighted the case of someone who donated cells for cancer research. In the course of the work, the scientists involved realized that the person involved was probably, but not definitely, HIV-positive, and clearly did not know it. (The individual filled out a form before donating the cells in which that question was specifically asked.) Surely, that person would want to know such an important detail about his or her health. Or is that not so obvious? The person in question could surely have arranged an HIV test if he or she wished to know his or her status! Would it be helpful or cruelly intrusive to force that information on the person involved? It’s not easy to say, and you don’t get to be sure what someone else wants just because you feel certain you yourself would, or wouldn’t, want to know. And, given the ability doctors have to maintain the long-term health of individuals infected with the HI virus, even this is a relatively easy issue to talk about. But what if the information had to do with a disease lurking in one’s genetic code that could manifest itself later on and could be fatal…but for which there is no known cure and no therapy to undertake in advance? The person would then be burdened with information about which he or she couldn’t do a thing…except spend the rest of his or her life worrying about an uncertain future that may or may not feature the disease in question. Once the commitment has been formally given not to burden donors with the results of the research they chose to support with the donation of their own tissue, who gets to decide when or if to override that commitment because it seems as though the individual in question would probably (or possibly or almost definitely) want to know? And who mops up the mess if the person who ultimately makes that decision turns out to be completely wrong?
(The answer there is clear enough, I think: no one at all, other than the donor now burdened with unwanted bad news, information which cannot be unlearned no matter how profusely the decision-maker apologizes later on.)
Is not telling something to someone the same as lying to him or her? The halakhah endorses the idea of withholding information from sick people that might make them even sicker. Specifically, the texts talk about the question of telling a very weak person that a close relative has died. The law there is clear: if the doctor in charge feels that that knowledge, by saddening the patient to the point of despair, will even just possibly worsen his or her physical condition, then such information may legitimately be withheld. That seems reasonable enough, yet the decision to withhold information about the patient’s own health—more analogous to the issues raised in the newspaper last week—seems different to me. None of us knows each other that well…and that thought, painful though it may be to contemplate, includes our own parents. Which of us can say with certainty, therefore, that a parent might not have some unfinished business that he or she would want very much to deal with while there might still be time? Or peace to be made with someone that will go unmade if the patient weakens much further and becomes concomitantly less able clearly to communicate? Who can know another well enough to decide? Or is the solution therefore for no one ever to decide on another’s behalf, for each of us to decide whether or not we wish to know…and then for the rest of the world to honor our decision no matter what?
In the end, none of these questions has a clear answer. The relationship between physician and patient is not precisely the same as the relationship between a clergyperson privy to some detail about a congregant’s health status and that congregant. Nor is either of the above relationships at all like the one between a researcher and the donor of tissue on which that researcher is working, a relationship that almost by definition does not exist at all. In each case, the decision whether to speak or not to speak can only be made based on someone’s best guess about where a patient’s (or a congregant’s or a parent’s) best interests lie. No one can ever be sure. But neither can that lack of finality and certainty serve as an excuse for making no decision at all. These are all thorny issues in need of adjudication. But I see only one clear solution: medical researchers need to make clear to donors in the future that the possibility exists of finding out things about their physical health status that could be very good or very bad news…and to require donors to decide for themselves, and long in advance, whether or not they wish to be contacted in the future. Will that discourage donors from donating? Some, it will. But not all…and perhaps that has to be good enough.
As we approach the High Holiday season, the question at hand is not whether we wish for others to lie to us, but whether we have the moral courage to desist from lying to ourselves, whether we have the inner fortitude to look at ourselves as we truly are, and then to move forward from that kind of honest evaluation of ourselves and our deeds to the point at which we resolve to do better. Lying to the dying is probably only a good idea under very specific conditions when the truth could actually harm, not simply sadden, the patient. Lying to the living is even less often justifiable than that. But lying to ourselves? It’s hard to imagine how that can ever be a good idea, or one that leads to anything other than regret later on.
Friday, August 31, 2012
Thursday, August 23, 2012
Haimi at Sobibor
There was an article in Haaretz this week regarding the excavations that Israeli archeologist Yoram Haimi has undertaken at Sobibor, the Nazi death camp in eastern Poland, that made a profound impression on me…and not only because of the actual work itself that Haimi has undertaken. (If you are reading the electronic version of this letter, you can see the article by clicking here.)
Sobibor, as I expect all my readers will know, was a place of indescribable horror, a camp that existed for no purpose other than to kill the people sent there. There were no fig leaves set in place to obscure the “real” point of the place’s existence: no industry, no forced labor, no factories of any sort, no phony schools. There were also no survivors, or almost none: of the quarter of a million people deported to Sobibor, a scant fifty-three are known to have survived. (All the survivors were among the three hundred camp workers who managed to escape during the camp-wide revolt in October of 1943, an uprising that wrested full control of the camp from the Germans for all of sixteen hours. Some readers may have seen the 1987 made-for-television movie, Escape from Sobibor, starring Alan Arkin, Rutger Hauer, and Joanna Pacuła, which dramatizes the story very effectively.)
Just a few days after the revolt was quashed, Himmler himself ordered that Sobibor be closed and its buildings, including its gas chambers and crematoria, razed to the ground. The site was then planted with trees in the hope, I suppose, that the crimes committed there would never come to light if the place itself were to vanish from the face of the earth. It didn’t work, of course. There were, at the end of the day, more than four dozen escapees, each possessed of complete knowledge of the Nazis’ crimes in that place. And most of the German officers who ran the place also survived, as did many of their Ukrainian underlings. The perpetrators, of course, were less inclined to tell their story, although many were tried in courts of law and subsequently either committed suicide or died in jail. But even despite all that testimony, the earth slowly set itself to reclaiming the ground upon which Sobibor stood. The trees grew. Eventually, grass covered the ruins, obscuring them from view and making the precise layout of the camp difficult to discern even for Shoah scholars. There is, I suppose, something comforting in the thought that the earth can even do such a thing, that our sentient planet can step in—if this is not too bizarre a thought to say out loud—to obliterate the horrors committed on its outer surface, somewhat in the way our bodies naturally heal wounds, sometimes even without leaving scars as physical evidence of their historical presence. But there is also something wrong with a place of such indescribable pain simply vanishing, something that makes the eradication of a place like Sobibor painful and upsetting even to think about, let alone to acquiesce to.
Enter Yoram Haimi, an archeologist by training but also the child of a family of survivors. Except, of course, that it was not all of his family that managed to flee…and some of those who remained behind were murdered at Sobibor. For a long time, apparently, Haimi felt the matter to be settled, the camp to be gone, the testimony of those few survivors to be all that remained of that time and that place and those horrors. But then, about five years ago, it struck him that he himself held the tools to right a great wrong. And so he set himself to work, bringing the tools of his trade to bear in his effort to unearth Sobibor, to locate its buildings and its killing sites, to reclaim what the perpetrators tried so strenuously to obscure—and, in so doing, to honor the dead by showing reverence to their final resting place.
Does thinking about a Nazi death camp in that way seem odd to you? Try thinking of Sobibor instead as a vast Jewish cemetery—and, unlike in some of the other death camps, only Jewish people were murdered at Sobibor—and the effort to reclaim the space makes more sense and becomes, if anything, a noble task for Haimi’s team to have undertaken, and perhaps even a sacred one.
Has so much time really slipped by that the work of uncovering the full story of the Shoah has passed from historians to archeologists? Having read about Haimi’s work in Haaretz, I started looking for more to read and came upon a very interesting essay, complete with photographs, that Haimi and two other archeologists, one Israeli and one Polish, authored, an essay partially about Sobibor but also about similar archeological excavations recently undertaken at Treblinka, Chelmno, and Belzec. (Readers reading on a screen can find the essay here.) In a sense, it is only logical for archeologists to work at uncovering that which the earth has naturally hidden—that is, after all, exactly what archeologists do for a living—and yet it seems unbelievable to me that we have, somehow without my noticing, moved through the time when an event as ever-present in all of our psyches as the Shoah could merely be recalled and recounted, into a new era in which even its details need instead to be physically drawn forth from the earth by scientists trained in that kind of excavation.
The thought that the work of recovering the story of the Shoah has passed to people trained to dig it out physically from the earth—and not to some rarefied version of “Shoah archeologists” but to actual archeologists trained in their field and simply applying techniques usually used on sites from distant millennia to events that, even in our own day, have slipped into the realm of the unrecoverable…other than by the means they are bringing to bear in the pursuit of history and of truth—that makes me feel old. But also very satisfied. First and foremost, I feel satisfied that the poor people who died in that place will have their final resting place acknowledged even absent the stone monuments that normally mark the graves of deceased Jewish people. Secondly, I feel satisfied to take note of the effort to guarantee that the stories of the Holocaust that have not yet been told do not simply vanish with those who remember their details from personal experience. And thirdly, I feel very satisfied by the thought that, even after all these years, the truth about the Shoah turns out truly to be ineradicable even if the means of extracting it from the mists of time past must necessarily change as time moves forward and an ever-increasing number of years separates us from the events at hand.
We all age. About that, there isn’t much to be done. Episodes in our lives that were once current events become ancient history. We grow older along with the earth itself, and none of us much likes that thought. Its most dour implication, obviously, we like even less. But spending this month of Elul—this holy month that leads directly into our holiest days—contemplating the passage of time and our place along the trajectory that leads from history to destiny through an ever-shifting present, that should be an ennobling experience far more than an upsetting one. It seems unimaginable to me that the study of the Shoah has, at least to some extent, passed to archeologists. It seems unimaginable to me that I have colleagues in the rabbinate who were born long after I myself was ordained. It seems hard, even, to believe that I am my own age. But coming to terms with who we are and what we have become—including not just our specific ages but the fuller story of our lives as well—that is the sacred work to which the days and weeks of Elul call us as we approach the High Holiday season imbued not with arrogance or unwarranted certainty regarding our own merit, but with humility and with resolve. The concept is not merely to number our years so we can know how old we are…but to number them, as the psalmist wrote, so that we may yet gain hearts of wisdom.
Thursday, August 16, 2012
How Nations Grow
What is the right model for thinking about countries? Are they “like” people, who start off in infancy, then move through adolescence and adulthood into old age? That model sounds reasonable at first blush—we talk about the “birth” of nations all the time—and logical, but that kind of thinking seems necessarily also to impute an unavoidable future of inevitable decline to even the healthiest of the world’s nations. And decline is not the worst of it: there is also the detail to consider that human life is inexorably finite, and that eventually the curtain comes down on even the most robust among us. Some would argue, I’m sure, that that model, dour implications and all, is exactly how countries live…and eventually die. The world—or, rather, the world’s history books—are filled, after all, with the stories of once-mighty nations that felt invincible in their heydays but which eventually just stopped existing, even (like the old Soviet Union) without being defeated militarily in battle. Nor is there any paucity of nations to consider that were once world powers but which seem to have declined into healthy or unhealthy old age.
Or are other models more apt for the history of nations and peoples? To ask the same question differently, are countries specifically unlike individuals in that they have the ability (which human beings lack but would all love to have) to morph forward from iteration to iteration specifically without facing the inevitability of decline and demise? Holland, for example, was once a mighty world power that controlled a huge empire with land holdings on five continents. Clearly, those days are long gone—all that’s left these days are a few islands in the Caribbean—yet it would seem odd to describe the Netherlands as a senescent nation merely because it was once immeasurably bolder militarily, more powerful, and more influential than it is today. More reasonable, I think, would be to say that Holland hasn’t declined so much as morphed forward into a new stage of national existence, into a new version of itself. Nor does it seem all that logical to argue that the death of the imperialist impulse that led nations to consider it reasonable to seize huge swaths of other people’s property and unilaterally to declare them part of a vast, far-flung empire must inevitably imply that the once-imperialist countries themselves too must eventually collapse. So if today’s Netherlands, with an annual Gross Domestic Product of more than $840 billion (a figure that reflects a per capita rate of more than $50,000 per citizen, a rate higher than our own nation’s, Germany’s, and the U.K.’s) isn’t reasonably understood as the doddering version of yesterday’s Dutch Empire, then perhaps we should be talking about metamorphosis rather than decline, about national growth forward specifically not characterized—or not inevitably characterized—by the inevitability of national decline and death.
All of these thoughts occurred to me as I perused an article in the paper this week that caught my attention, one in which it was noted that our country crossed an impressive, very interesting milestone last week when Governor Romney chose Paul Ryan as his running mate. It seems that, for the first time in history, there is not a single white, Protestant man or woman among the top governing officials, or prospective governing officials, of our country, considering in that category the presidential and vice-presidential candidates of both parties, the nine justices of the Supreme Court, the Speaker of the House, and the Senate majority leader. At first, it seemed unimaginable to me that that could be the case. I quickly made my own list by hand, letting Wikipedia supply the details I wasn’t sure about. But my own research yielded the same results: the above-listed group of fifteen is made up of nine Catholics, three Jews, two Mormons, and one black Protestant. That’s all of them…and, just as the article said, not a white Protestant among them. Even granting that the president himself is, at least genetically speaking, as white as he is black, he is still far from the white Protestant model that held sway not for decades or for scores of years, but for the entire span of our country’s existence. Until now.
As recently as the 1952 presidential election year, that group of fifteen would have included only one individual who was not a white Protestant man, Felix Frankfurter. And although there has not been an election year since 1928 in which the entire group of fifteen has been white, Protestant, and male, that surely was the case for the 150 years of American history that preceded that year. I realize that one can make a reasonable argument that Abraham Lincoln, whose religious beliefs have been endlessly discussed, was not a “real” Protestant, but, at the end of the day, he worshipped in Protestant churches (without actually joining any) and regularly spoke in religious terms that most of us would easily identify with Protestant beliefs. One could make a similar argument about Thomas Jefferson, who is often vaguely labeled a “Deist” (that is, someone who believes in God without actually espousing any specific religious doctrine), but that too is a bit overstated in that he played a role in governing the Episcopal Church near Monticello and regularly referred to himself as a Christian (“To the corruptions of Christianity I am, indeed, opposed; but not to the genuine precepts of Jesus himself. I am a Christian….”). Other than the two of them, only four presidents weren’t or aren’t formal members of Protestant churches while in office (Andrew Johnson, Ulysses Grant, Rutherford Hayes, and President Obama), but all had ties to Protestant churches and self-defined as Protestant Christians. Other than Andrew Johnson, every Vice President of the United States except for the incumbent has been a member of a Protestant church. (I hardly have to bother adding that all have also been white and male.)
I suppose I could have seen it coming. Maybe I even should have, but this detail, that we have crossed the line into a world of American political leadership in which no one at all is a white man affiliated with a Protestant church, caught me completely unawares. Nor do I want to interpret this development by viewing it through self-conscious Jewish eyeglasses and lauding it based on the assumption that the more multicultural our leaders are, the more tolerant their leadership will be, and thus also the less likely to find acceptable any traces of prejudice, including anti-Semitism. That probably is true. In fact, I think it almost certainly is true, but that’s not the sole point to consider. Instead, I’d like to use this detail in our evolving American story as a springboard for considering how nations grow.
For the most part, social policy moves ahead imperceptibly as people gradually, sometimes over decades, morph into more sophisticated versions of their former selves. Ideas that are commonplace fall away almost glacially slowly, but then are suddenly gone almost to the point of unimaginability just years after they were almost universally held. The notion of bus stations having separate washrooms for black people and white people is a good example of something that now seems so difficult to imagine as almost to sound more quaint than malign, something like putting sinners in stocks in town squares to allow them to atone in public for their misdeeds. The idea that it is reasonable consciously and intentionally to pay women who have the same jobs as men lower wages than their male counterparts, I think, falls in the same category of an idea that was once widely considered rational, but which now sounds beyond peculiar. Even the notion that there could be societal merit in pressuring gay people to devote lifetimes to making believe they are straight, despite the misery that kind of pretense must almost invariably entail, seems impossible to square with our basic American commitment to tolerance and reasonability. Yet that was indeed how people felt, and for a very long time!
And that is how I think countries too grow. Unlike with respect to people, where growth inevitably leads at least eventually to demise, nations and societies can morph into healthier versions of themselves without the experience necessarily leading to decline. There was a time when it would have seemed impossible to imagine a black family in the White House. There was a time when people spoke self-consciously about Jewish or Catholic seats on the Supreme Court, as though neither group could or would ever be represented without a seat being reserved for them in advance. And there was a time when the thought of the political leadership of our country not being affiliated with mainline Protestant churches would have seemed as unlikely as the political leadership of Russia not being Communist. Yet we have cleared all those hurdles not because we are losing our sense of what it means to be an American as we slowly decline into national decrepitude, but because we have managed to maintain a lively, ongoing national debate about ideas that has allowed us to grow forward into an ever-more-sophisticated version of ourselves and our nation.
In people, there is always an ominous, slightly dark, aspect to growth. But in terms of society, growth does not imply inevitable degeneration. The fact that we have in our lifetimes crossed a line that would have seemed to our parents or grandparents not just unlikely ever to be crossed, but totally uncrossable—that is not a sign of moral deterioration or societal decay, but rather one of healthy growth towards an idea we all profess to hold dear: the ideal of a society in which people are evaluated based on the morality of their behavior and on their ethical worth as productive members of society, not on the color of their skin, their gender, or the spiritual path they follow. It’s a whole new world out there…and, in many ways, a far better one than the one we inherited from previous generations.
Or are other models more right for the history of nations and peoples? To ask the same question differently, are countries specifically unlike individuals in that they have the ability (which human beings lack but would all love to have) to morph forward from iteration to iteration specifically without facing the inevitability of decline and demise? Holland, for example, was once a mighty world power that controlled a huge empire with land holdings on five continents. Clearly, those days are long gone—all that’s left these days are a few islands in the Caribbean—yet it would seem odd to describe the Netherlands as a senescent nation merely because it was once immeasurably bolder militarily, more powerful, and more influential than it is today. More reasonable, I think, would be to say that Holland hasn’t declined as much as it has morphed forward into a new stage of national existence, into a new version of itself. Nor does it seem all that logical to argue that the death of the imperialist impulse that led nations to consider it reasonable to seize huge swaths of other people’s property and unilaterally to declare them part of a vast, far-flung empire must inevitably imply that the once-imperialist countries themselves too must eventually collapse. So if today’s Netherlands, with an annual Gross Domestic Product of more than $840,000,000,000 (which figure reflects a per capita rate of more than $50,000 per citizen, a rate higher than our own nation’s, Germany’s and the U.K.’s) isn’t reasonably understood as the doddering version of yesterday’s Dutch Empire, then perhaps we should be talking about metamorphosis rather than decline, about national growth forward specifically not characterized—or not inevitably characterized—by the inevitability of national decline and death.
All of these thoughts occurred to me this week as I perused an article in the paper this week that caught my attention, one in which it was noted that our country crossed an impressive, very interesting milestone last week when Governor Romney chose Paul Ryan as his running mate. It seems that, for the first time in history, there is not a single white, Protestant man or woman among the top governing officials, or perspective governing officials, of our country, considering in that category the presidential and vice-presidential candidates of both parties, the nine justices of the Supreme Court, the Speaker of the House, and the Senate majority leader. At first, it seemed unimaginable to me that that could be the case. I quickly made my own hand list, letting Wikipedia supply the details I wasn’t sure about. But my own research yielded the same results: the above-listed group of fifteen is made up of nine Catholics, three Jews, two Mormons, and one black Protestant. That’s all of them…and, just as the article said, not a white Protestant among them. Even granting that the president himself is, at least genetically speaking, as white as he is black, he is still far from the white Protestant model that held sway not for decades or for scores of years, but for the entire span of our country’s existence. Until now.
As recently as the 1952 presidential election year, that group of fifteen would have included only one individual who was not a white Protestant man, Felix Frankfurter. And although there has not been an election year since 1928 in which the entire group of fifteen has been white, Protestant, and male, that surely was the case for the 150 years of American history that preceded that year. I realize that one can make a reasonable argument that Abraham Lincoln, whose religious beliefs have been endlessly discussed, was not a “real” Protestant, but, at the end of the day, he worshipped in Protestant churches (without actually joining any) and regularly spoke in religious terms that most of us would easily identify with Protestant beliefs. One could make a similar argument about Thomas Jefferson, who is often vaguely labeled a “Deist” (that is, someone who believes in God without actually espousing any specific religious doctrine), but that too is a bit overstated in that he played a role in governing the Episcopal Church near Monticello and regularly referred to himself as a Christian (“To the corruptions of Christianity I am, indeed, opposed; but not to the genuine precepts of Jesus himself. I am a Christian….”) Other than the two of them, only four presidents weren’t or aren’t formal members of Protestant churches while in office (Andrew Johnson, Ulysses Grant, Rutherford Hayes, and President Obama), but all had ties to Protestant churches and self-defined as Protestant Christians. Other than Andrew Johnson, every Vice President of the United States except for the incumbent has been a member of a Protestant church. (I hardly have to bother adding that all have also been white and male.)
I suppose I could have seen it coming. Maybe I even should have, but this detail, that we have crossed the line into a world of American political leadership in which no one at all is a white man affiliated with a Protestant church, caught me completely unawares. Nor do I want to interpret this development by viewing it through self-conscious Jewish eyeglasses and lauding it based on the assumption that the more multicultural our leaders are, the more tolerant their leadership will be, and thus also the less likely to find acceptable any traces of prejudice, including anti-Semitism. That probably is true. In fact, I think it almost certainly is true, but that’s not the sole point to consider. Instead, I’d like to use this detail in our evolving American story as a springboard for considering how nations grow.
For the most part, social policy moves ahead imperceptibly as people gradually, sometimes over decades, morph into more sophisticated versions of their former selves. Ideas that were once commonplace fall away glacially slowly, but then are suddenly gone, almost to the point of unimaginability, just years after they were almost universally held. The notion of bus stations having separate washrooms for black people and white people is a good example of something that now seems so difficult to imagine as almost to sound more quaint than malign, something like putting sinners in stocks in town squares to allow them to atone in public for their misdeeds. The idea that it is reasonable consciously and intentionally to pay women who hold the same jobs as men lower wages than their male counterparts falls, I think, in the same category of an idea that was once widely considered rational, but which now sounds beyond peculiar. Even the notion that there could be societal merit in pressuring gay people to devote lifetimes to making believe they are straight, despite the misery that kind of pretense must almost invariably entail, seems impossible to square with our basic American commitment to tolerance and reasonability. Yet that was indeed how people felt, and for a very long time!
And that is how I think countries too grow. Unlike people, whose growth inevitably leads at least eventually to demise, nations and societies can morph into healthier versions of themselves without the experience necessarily leading to decline. There was a time when it would have seemed impossible to imagine a black family in the White House. There was a time when people spoke self-consciously about Jewish or Catholic seats on the Supreme Court, as though neither group could or would ever be represented without a seat being reserved for them in advance. And there was a time when the thought of the political leadership of our country not being affiliated with mainline Protestant churches would have seemed as unlikely as the political leadership of Russia not being Communist. Yet we have cleared all those hurdles not because we are losing our sense of what it means to be an American as we slowly decline into national decrepitude, but because we have managed to maintain a lively, ongoing national debate about ideas that has allowed us to grow forward into an ever-more-sophisticated version of ourselves and our nation.
In people, there is always an ominous, slightly dark, aspect to growth. But in terms of society, growth does not imply inevitable degeneration. The fact that we have in our lifetimes crossed a line that would have seemed to our parents or grandparents not just unlikely ever to be crossed, but totally uncrossable—that is not a sign of moral deterioration or societal decay, but rather one of healthy growth towards an idea we all profess to hold dear: the ideal of a society in which people are evaluated based on the morality of their behavior and on their ethical worth as productive members of society, not on the color of their skin, their gender, or the spiritual path they follow. It’s a whole new world out there…and, in many ways, a far better one than the one we inherited from previous generations.
Wednesday, August 1, 2012
The Circle Closes
Is life a line or a circle? It’s a good question, but also a misleading one because, by framing the concept in mathematical terms, it makes it sound like the answer must be one or the other, a line or a circle. But the reality—not in the world of geometric grids, perhaps, but in life as we actually live it—is that life is a line and a circle, both a journey forward and a journey back around, both a trip into the future and an endless journey back into the past.
The circle part is the easier one to notice. In 1966, when I was a bar-mitzvah boy and my parents, acting entirely uncharacteristically, sent me off to Israel on the Jewish Agency program then called the Bar-Mitzvah Pilgrimage, we were lodged at a youth village near Givat Ada (now part of Binyamina) called Alonei Yitzchak. From there we took trips to various parts of Israel, staying overnight in some places and returning to the village when the destination was close enough not to require an overnight stay. I remember those tiyyulim to varying degrees, but the one that has stayed with me the most clearly over all these many years—now more than forty-five of them—is our trip to Jerusalem.
It was 1966. Israel was all of eighteen years old. The Six Day War, in the course of which Jerusalem would be reunited, was still a year in the future and the city was still divided in two by barbed wire fences and military checkpoints. The Mandelbaum Gate, located at the end of Shmuel Ha-Navi Street, was still very much in place and I recall the experience of being brought close up to it so we could peer through the barriers at the walls of the Old City being guarded in the distance by Jordanian soldiers. And that, other than when we were brought up to the roof of some tall, bullet-scarred building on what is now called Kikar Tzahal to peer through coin-operated telescopes at whatever we could see of the Old City from that lofty vantage point, was all we were to know of Old Jerusalem. Perhaps I was too young to feel disappointed, but New Jerusalem was more than exciting enough for me. I remember it all remarkably clearly, especially the nighttime scene in Kikar Tziyon and along Ben Yehudah Street. Yad Vashem, now a state-of-the-art museum, was still more like a series of ramshackle galleries housed in several one-story buildings in those days, but it had a profound effect on me nevertheless. The now-mostly-forgotten memorial to President Kennedy, called Yad Kennedy, was brand new in 1966 and that too had a real effect on me. (In retrospect, it seems odd to equate the two. But at the time, lacking perhaps the perspective of a life-long student of the Shoah, I remember being struck by the two as symbolizing, each in its own way, the horror of senseless violence.) But what struck me the most was the New City itself.
This was, of course, the Old Israel. Public telephones, when they worked at all, worked with tokens you had to buy in newspaper kiosks or at the post office and which you somehow never had quite enough of. International phone calls—the kind you made to your parents when you were lonely or needed more money—could only be made at special international telephone centers located mostly in the central post offices of big cities. Not all public toilets featured actual toilet bowls. (Perhaps the less said in this specific regard the better.) Israel, even at eighteen, was brand new, the overwhelming majority of its citizenry either immigrants from elsewhere or the children of such immigrants. The Shoah, of course, even outside the walls of Yad Vashem, was the unseen backdrop against which current events still unfolded. (That is still true, I think, to a certain extent. But not in the same way it was in the 1960s, when the world was filled with survivors still in their thirties and forties, and child survivors who had come to British Palestine even before independence were still on active duty as soldiers of the IDF or studying in university.)
And into the middle of all that somehow parachuted myself, the young me whose parents were still dithering about whether to permit him—me—to take the F-train into the city on his own. It was, to say the very least, an experience that changed my life. Or, to speak less charitably, that would have changed my life if I had already had enough of one for it to be changed. But the bottom line was that my visit to Israel, and particularly my visit to Jerusalem, set the course for my life that I feel myself still following. It’s been a long time. And it’s been a long journey. Mostly, it’s been a line. But it’s also been a kind of a circle.
When we visited Jerusalem, we stayed at Kiryat Moriah. Now Kiryat Moriah is a huge, modern Jewish Agency campus that hosts all kinds of tour groups, seminars, Birthright groups, and missions from abroad. But then, back in 1966, it was something more akin to a slightly dilapidated youth village set down between the southern suburb of Talpiyot and the apple orchards of Ramat Rachel, one of Israel’s oldest kibbutzim. It was also on the border, situated (if I remember correctly) adjacent to the strip of No Man’s Land that separated Israel from Jordan and far more similar in its feel to Alonei Yitzchak, “our” youth village, than to the college campus it resembles today. And it was there that we stayed, feeling brave for being so close to the Jordanian patrols we were assured were passing by menacingly on their side of the line as we slept peacefully on ours. It’s strange—I have only the haziest recollections of all sorts of things that came afterwards in my life, but that experience of visiting divided Jerusalem as a bar-mitzvah boy remains with me as a kind of watershed event that divided and still divides the part of my life that came before it from the part that followed.
Nothing stays the same. Within a year, Jerusalem was united, its earlier iteration as a divided city merely a recollection shared by those of us who were present actually to experience it. And, soon enough, I myself moved into the next part of adolescence, the part that involves some awkward combination of physical, mental, emotional, hormonal, and (slightly) spiritual growth, the part that relegated my own earlier iterations to the realms of pleasant and unpleasant memory. Kiryat Moriah too changed, as noted above, but only cosmetically: physically it remained where it always had been: on Ha-Askan Street in the part of Jerusalem now called Arnona. And it was right there, on the street facing the main gate into Kiryat Moriah, that Joan and I just spent the first three weeks of our summer vacation time trying to make our previously unfurnished apartment into a living space that could accommodate people actually living there.
It all worked out beautifully. The place is lovely. The light, airy, sunny space we remembered from when we first saw it was just as we recalled. We even found a tenant—actually two tenants, young men pursuing degrees in Jewish education at the Pardes Institute—who will stay in the place until we can return next summer. Everything we needed, we somehow were able to find in some combination of the furniture stores on Herzl Street in Tel Aviv, the lighting fixture shops in Talpiyot, the shuk in Jaffa, and, of course, the giant, magnificently air-conditioned IKEA in Rishon Letziyyon. And so the circle closed: in some parallel universe, the thirteen-year-old me is still lying in bed at night somewhere in Kiryat Moriah wondering if there really are Jordanian soldiers just a few hundred yards away, but in this one, in the iteration of the multiverse that seems to us all to constitute unchallengeable reality, the considerably-older-me just spent three weeks sleeping right across the street, worried not about Jordanian soldiers but about the VISA bill that, just as surely as dawn follows the night, will inevitably follow our return from Zion.
Other circles also closed. We had lunch last Shabbat with the woman who, twenty-eight years ago, was our Lamaze teacher when Joan was pregnant with Max, our oldest child. We had a lovely visit with cousins of Joan’s whom we hadn’t seen in almost thirty years. I went to the Kotel, closed to Jewish visitors from Israel in 1966 but which I visited for the first time in 1974 when I was a counselor on a summer teen tour to Israel run by the then-robustly-functioning American Zionist Youth Federation, and had a fleeting vision of myself in that place in that summer between college and JTS trying, mostly (I fear) unsuccessfully, to explain to my young charges why the place mattered, why it was so meaningful for us to be able to visit the Wall not under the begrudging aegis of the Jordanian government or the United Nations, but under the watchful protection of a mighty Jewish army protecting the citizens of an independent Israel and their guests from harm.
All in all, we had a fabulous three weeks. In many ways, and excluding the year we spent in Israel when Max was born, it was the most meaningful, most satisfying visit to Israel we’ve ever had. There are so many more stories I want to tell, so much more I want to share with you all about our experiences. But for the moment I’ll have to content myself with what I’ve written above, with the bare outline of what it meant for us to take a few baby steps forward towards being part of Israel not just emotionally but physically and financially in a way we have dreamt about forever, but only now have found the wherewithal—and the nerve—actually to undertake.