Thursday, May 30, 2013

The Will to Self-Alter

When Joan and I got married, we were not the only residents of Apartment 3E. We were the only ones who paid rent. And we were the only ones who had keys. But what those advertisements in the subway insisted regarding the quaintly named roach motels we bought by the dozen—que las cucarachas pueden entrar, pero no pueden salir—seemed more true of our building on West 111th Street than it did of the actual devices themselves that we were counting on to rid ourselves of our unwanted and disgusting roommates. Why I remember that in Spanish, a language I don’t speak, more easily than in English (and how can there not have been an English version?), I have no idea. Perhaps it was the graphic, which I still remember clearly, of a fierce Hispanic-looking woman wearing a colander on her head and a toilet plunger slung over her shoulder as she sallied forth to do battle with the same varmints that were as unwanted in our home as the ad copy made it crystal clear they were unwelcome in hers.  We were clearly in the same boat, the three of us. And if her battle cry was in her native Spanish, then so was ours going to be!

I was better about it than Joan. (Joan is an outdoors girl from Ontario, but her idea of living with animals involves sharing the forest with moose and caribou rather than sharing a kitchen with revolting hexapods.) But I wasn’t too happy either. Mind you, it wasn’t us. Our kitchen was, as it still often is, spotless. We didn’t leave food around, and least of all overnight. We cleaned up our crumbs! But the entire building was filled with them and no single apartment’s efforts seemed sufficient to turn the tide.  Eventually, we solved our problem the simple way—by moving to Israel for a year and never returning to Morningside Heights. But, although I remember our years on 111th Street fondly (and still respond very favorably to the smell of Indian cooking that was a permanent feature of life on the third floor at 600 West and which provided the ongoing olfactory backdrop to our lives as newlyweds), I never developed the Manhattanite’s native tolerance for the cockroach. Unlike my father (who, for some obscure reason possibly related to the dialect of Brownsville English, added an “a” between the “k” and the “r” to create a medial third syllable), I pronounced their name in two syllables. But I never liked them.

Nor have I come to remember them fondly, even all these years later. But over the years, they have earned my begrudging respect (if not quite my admiration). For one thing, they have survived on earth for about 300,000,000 years, which is about 299,800,000 years, give or take a few millennia, longer than our own species has wandered the planet. For another, they are apparently close to indestructible. The oft-repeated claim that they can survive almost anything is apparently at least mostly true. The Discovery Channel conducted an experiment in which a group of cockroaches was exposed to 1,000 rads of radiation from cobalt-60, a dose that would kill any human being in ten minutes, and half the cockroaches survived. When they upped the ante to 10,000 rads, about the amount released at Hiroshima, at least ten percent of the roaches survived. (If you are reading this electronically, click here to see the results of the experiment in more detail.) But it is not to discuss the cockroaches of wartime Japan that I write today. Perhaps some other time!

The reason I am writing about this topic today, actually, is a study published just this last week in the journal Science.  Undertaken by Ayako Wada-Katsumata, Jules Silverman, and Coby Schal, all of them professors at North Carolina State University in Raleigh, the study effectively demonstrated that cockroaches have the internal capability to defend themselves against sweet-tasting poisons by altering their own body chemistry so that things that would normally taste sweet to them instead taste bitter and unpalatable. (To read the abstract, click here.)  This is, of course, not the same as an insect developing a tolerance for some variety of poison that is regularly and repeatedly used against it. It is far more interesting than that, I think, because this is not just a case of poison not working well but one about the ability of a living thing to morph into a new iteration of itself that, instead of merely mimicking its prior behavior (and dying in the process), is no longer vulnerable to things that just months or weeks earlier could easily have cost it its life.

How exactly this all works remains an unsolved puzzle, but it appears to have something to do with the way cockroaches taste things. We human beings taste with our taste buds, which limits us to tasting with our tongues. But cockroaches taste with something called “taste hairs” that appear in many different parts of their bodies. How cool would that be, being able to taste with our elbows or our noses? Or with our big toes!  But the reality is interesting enough without shifting into fantasy. These “taste hairs” work because they contain different kinds of nerve cells that react to flavors and send messages bearing decipherable code to the cockroach’s tiny brain, which is how the bug knows if something is sweet or bitter. Those are the only two options, apparently: sweet or bitter. (There’s a metaphor in there somewhere.) When something sweet touches the nerve cell that is programmed to detect sweetness, it sends along a message that makes the cockroach want to eat the substance in question. When something bitter touches the bitter receptor cell, the message discourages the roach from wanting to eat it.  What the scientists in North Carolina discovered was that the roaches—this is, in and of itself, amazing—somehow understood that they were being tricked into eating poison and—and this, even more so—were able to switch around the receptors’ messaging system so that the sweet poison would send a message to the cockroach’s brain discouraging it from eating the substance, thereby avoiding being poisoned to death.

So much remains unknown. How could cockroaches be savvy enough to understand that they—not the individual, but the group, the genus, the whole bug nation—are in danger of being poisoned? How could they possibly be sophisticated enough to realize that an effective defense against poison would be for the sweet, seductive flavor of the toxic substance to trigger the “bitter” response rather than the “sweet” one?  How can evolutionary change—and what is this an example of if not evolutionary change?—possibly happen so quickly, in this case in a matter of a few years? (The behavior studied by the scientists in North Carolina has only been observed since the 1990s.) Can evolutionary change be willed into existence? Do cockroaches even possess the will to self-alter? Do they have that level of insight into the world in which they live? (That hardly seems possible, but what alternate explanation could there be?) These are all good questions. And none as yet has a very good answer. Who ever thought cockroaches could be this interesting?

For me, all this science prompts a different set of questions entirely. We too, after all, are evolutionary creatures, we human beings. As the countless millennia have passed, we too have altered and developed, morphing into versions of our former selves more suited, or at least ideally more suited, to the world as it has developed along into its latest iteration. But we clearly lack the cockroaches’ ability to do this in response to stimuli over the course of decades rather than over the course of eons.  The announcement a few weeks ago, for example, by scientists at the Mauna Loa Observatory in Hawaii that carbon dioxide had reached an average daily level over 400 parts per million, a concentration of the most heat-trapping gas in the atmosphere the earth has not seen in millions of years, was greeted, as far as I can determine, mostly with yawns by the general populace.  (You can read more about it by clicking here.)  I’m sure environmental scientists all over the world took the announcement as a great call to arms…but the rest of the world seems to me to be in desperate need of the cockroaches’ ability to self-alter in the face of pending disaster. For the roaches, the task was simple: if sweet equals death, then sweet needs to taste bitter. How they managed that, who knows? But it’s the absence of a parallel human ability that interests me even more. (I’m a rabbi, after all, not an entomologist.) Will human beings respond to this news, which Maureen Raymo, a Columbia University scientist who works at the Lamont-Doherty Earth Observatory, characterized as part of an “inevitable march towards disaster,” by self-altering, not to find sweet bitter and bitter sweet, but to find the boring riveting, the yawn-provoking stimulating, and the ecologically tedious deeply and personally challenging?
In that same article about the carbon dioxide, a Yale University professor was quoted as saying, not (I thought) bitterly but with resignation, that the time to act forcefully on our own behalf was probably yesterday.


But, as Chad and Jeremy already knew decades ago, yesterday’s gone. So the question isn’t whether we need a time machine to deal with this disaster. That would be nice, but even if we did possess the ability to go back in time to address this issue by heading it off at the pass, we would still need the roaches’ ability to self-alter from complacent beings who find it possible to be bored by looming environmental disaster on a scale we non-scientists can barely imagine into the kind of thoughtful creatures (and the sapiens part of Homo sapiens specifically references intelligence as our species’ most characteristic feature) that have the good sense to be terrified by truly terrifying things. Tasting sweetened poison and finding it bitter, and then specifically not ingesting it, is a good model for us to follow. If bugs can do it, then my sense is that so can we…if we could only summon up the will to self-alter as productively and as beneficially to our long-term existence as those unappealing invertebrates my father only called cockaroaches.

Thursday, May 23, 2013

Artifacts



I wear my grandmother’s wedding ring. She was born—in, of all places, Pottstown, Pennsylvania—in 1883 and she married my grandfather in 1907. I have their wedding invitation framed and hanging on the wall of our living room. (They were married on November 10th of that year at my great-grandparents’ home at 60 East 113th Street.) I have their ketubbah as well, also framed and hanging in our home. And I have the ring they used at their ceremony, and which my grandmother then wore, as far as I know, every day of her life until she died a few months before my bar-mitzvah in 1966. The ring passed into my mother’s hands, and then, when she died, into my father’s. So my dad had two wedding rings to give us, and to this day Joan wears my mother’s and I wear her mother’s. It’s nice being connected to the past in that specific way, feeling her presence from time to time just by noticing the ring on my ring finger.  I didn’t know either of my grandfathers. My other grandmother passed away when I was just four years old. So the grandmother whose ring I wear is my only real connection to the world from which she came, a world different in so many ways from our own.  What East 113th Street looked like in 1907, I have a reasonably clear idea. (I’ve never seen a picture of that specific block, but there are loads of photographs of Manhattan at the turn of the century to give someone like myself the general idea.) The ghosts are surely real, but sometimes you can feel the reality of people loved and lost in their absence simply by contemplating some specific thing they left behind.

But there are artifacts and there are artifacts! My grandmother’s ring surely counts as one. But just this week there was an announcement regarding something seriously older than my grandma’s ring, something left behind by someone who lived and died about 30,000 years earlier.

When I was growing up, referring to someone as a Neanderthal was not a compliment. (One of my friends once had to serve detention for referring a bit too loudly to one of our stricter gym teachers that way.) But it turns out that, as is so often the case with respect to the language of casual disdain, that usage may have been a bit hasty. The real Neanderthals—named for the Neander Valley near Düsseldorf in Germany (the German for “Neander Valley” is Neandertal, formerly spelled Neanderthal) where their fossilized bones were first unearthed and identified—turn out not to have been such bad guys at all. Mind you, we don’t know all there is to know about them. Anthropologists aren’t even united on whether they should properly be characterized as a subspecies of Homo sapiens (the species to which we ourselves belong) or a separate species of the same genus.  But whether the Neanderthals are correctly to be called Homo sapiens neanderthalensis or just Homo neanderthalensis, the bottom line is that they were around for a very long time, appearing in Europe somewhere between 350,000 and 600,000 years ago, and becoming extinct—although not quite completely—something like 30,000 years ago. Nor is the “not quite completely” part a detail to be passed by lightly: results of efforts to map the Neanderthal genome in 2010 yielded the surprising conclusion that at least 2.5% and possibly as much as 4% of the genetic material carried by modern non-African human beings today was inherited directly from the Neanderthals, probably through cross-breeding between Homo sapiens and the Neanderthals when the former arrived in Europe on their journey out of Africa somewhere between 80,000 and 50,000 years ago. So, like so many other things in life, they’re gone and also not gone…because, at least a little bit, they are us.  Or rather, to say the same thing more clearly and more challengingly, we are they.

Among the bits and pieces of fossilized remains that constitute all—other than ourselves—that’s left of the Neanderthals is a single child’s tooth, a molar, found at an archeological site in Belgium. That, in and of itself, is amazing enough a fact to give pause—I don’t know what happened to my own milk teeth (perhaps the Tooth Fairy still has them), yet this one tooth has survived as a sole dental sentinel still, after all these countless millennia, bearing witness to a world that came and went, to a world like and unlike our own, to a world inhabited by some version of who we are—people slightly shorter, far stronger, and possessed of brains about the size of our own who somehow became part of who we are. But it’s what scientists have managed to learn from this one tiny tooth that’s the truly amazing part.

It turns out that researchers in Sydney and New York (the locals affiliated with the Icahn School of Medicine at Mount Sinai in Manhattan) have learned successfully to exploit the fact that the age at which a child is weaned from its mother’s breast can be determined after the fact with reference to the presence of trace levels of barium in the tooth’s enamel.  Why that is exactly, who knows? But by studying the data, the scientists in question, whose work was published earlier this week in the journal Nature, determined that the child from whose mouth the tooth originally came was weaned from the breast at 1.2 years of age. That’s 1.2 years of age…more than 30,000 years ago. (By way of comparison, the average age of weaning in non-industrial countries today is, at 2.5 years of age, more than twice that. The American Academy of Pediatrics recommends that mothers nurse their babies for a minimum of six months. Among other primates, the suckling period is much longer: wild chimpanzees, for example, wean their young at about 5.3 years.)  The conclusions published in Nature were not universally accepted.  One Michael Richards, a specialist in the study of ancient bones and teeth at the Max Planck Institute for Evolutionary Anthropology in Leipzig (and clearly not the guy from Seinfeld), wondered aloud in response to the publication of the article how the researchers could be sure the barium hadn’t seeped into the tooth from the soil in which it was embedded for tens of thousands of years. Other scientists held back for different reasons from fully endorsing these new findings.  But, aware of those demurrals, I still found myself drawn to the article and its findings.

We truly do live in an age of miracles. A child loses a tooth, and 30,000 years later—or rather, 30,000 years later at least—someone in Belgium picks it up.  Instead of mistaking it for a tiny chip of stone, this person somehow recognizes it as a fossilized tooth. Then, after somehow figuring out that the age at which a baby is weaned from its mother’s breast can be determined after the fact by analyzing the traces of barium left behind in the enamel coating of the child’s teeth, scientists determine that the child from whose mouth that tooth came was weaned from mother’s milk at 1.2 years of age. Just like that!

Did the child have a name?  Almost certainly it did! The Neanderthals spoke and used some kind of language to communicate. (Interested readers may wish to consult University of Reading archeology professor Steven Mithen’s book, The Singing Neanderthals, published by Harvard University Press in 2007, for more information on the Neanderthals’ language. I haven’t read it myself…but I will!) They lived in communities and when they were injured they nursed each other back to health. When they died, surviving members of their communities buried them. They were, in short, some version of us possessed of slightly larger and differently shaped brains. So the child had parents and probably siblings. The child lived in a community. And probably it had a name as well.

I’ve been thinking about that child. Scientists say that the sun will turn into a fiery giant that will render life on earth impossible in about 50 million centuries. That’s more than enough time for scientists—or whatever they’ll be called by then—to discover one of my lost baby teeth in about 30,000 years and to draw whatever conclusions they can about me and my life. (I was bottle-fed from the start, so I can save them the effort of analyzing the barium levels in my tooth enamel—assuming my letters to you also survive for 30,000 years, that is—by just admitting that up front now. The rest, they can figure out on their own.) Who can even begin to imagine what life will be like in a mere three hundred centuries?  Nothing will be the same! But also…everything will be the same, I think. People will find their greatest happiness in each other’s arms. Surviving the loss of a loved one will still be the greatest of all life’s challenges. People will still, I think, invest their greatest hopes in their children, and spend their lives worrying about them and trying not to hover. I suppose that even that far in the future people will still occasionally eat too much and drink too much, then wake up the following morning regretting either or both. Everything changes and nothing changes!  The next time any of you is in Düsseldorf, you can take the train over to Mettmann and from there you can go to the actual Neandertal and visit the actual Neanderthal Museum. (In the meantime, click here to take a quick look!)  And there you will find evidence of people wholly unlike and remarkably like ourselves, people depicted as living in family groups, as worrying about feeding each other, about growing old together, about how to face death and survive loss.

Why people who profess faith in God and for whom the Bible serves as the foundation of their spiritual lives would turn away from remarkable evidence like this—evidence for the commonality of the human experience in all its unimaginable variegation—merely because it needs to be read as a kind of scientific midrash on the story of creation as presented in Scripture, I can’t imagine. This kind of scientific research confirms my faith without weakening it even slightly: I find it infinitely easier to believe that all humanity has a Creator in common when I learn about the amazing ways that the human experience is precisely one of shared experience from continent to continent over the course not of centuries or millennia, but scores, even hundreds, of millennia.  The core ideas around which the Torah’s story of creation rotates: that we all have one Creator, that we all share common ancestry, that the human genome testifies to the brotherhood of humankind far more meaningfully than it can be construed to divide us from each other—these are the ideas suggested to me by that tiny tooth…and the lesson scientists have managed to bring forth from its ancient enamel.


Friday, May 17, 2013

Bringing In The Clones


Humanity crossed an amazing threshold just this last week, but the world seems mostly to have yawned. Yet in terms of the history not only of science but also of human culture itself, it seems to me that it will eventually sound just as odd to say that we barely took note of the event to which I am referring as it would sound now to say that at the time we hardly realized that Neil Armstrong’s first step onto lunar soil constituted a very big step forward in the history of human accomplishments. I am referring to the successful use of cloning to create human stem cells that was announced by scientists earlier this week at the Oregon Health and Science University in Portland. You could easily have missed it, however.

We live in an age of blasé attitudes towards everything. Perhaps the pace of technological advances in the last several decades has simply dulled our ability to be impressed: when I attempted the other week to have a video conference call with colleagues in Israel, the U.K., France, and California all at once and the picture was briefly distorted slightly, I found it far more natural to be irritated that the program wasn’t working properly than to be amazed that such a thing exists in the first place and that this kind of technology is available for free to normal people such as myself for the mere effort of downloading the program and clicking the “install” button.  But this announcement in Oregon was far more than a simple threshold over which some team of scientists finally figured out how to step; it seems to me that this achievement constitutes a real game-changer in the history not only of science but of human culture itself. Or, at the very least, that it constitutes a huge challenge for a society that must now make decisions regarding innovative procedures that have the capacity to change the way we think of, and define, human life.

The scientists in Oregon were attempting to help an eight-month-old baby born with a genetic disease. (To protect the patient’s privacy, neither the child’s name and gender nor the specific disease from which the child is suffering has been made public.)  That much—doctors attempting to help sick children recover from debilitating conditions or illnesses—happens a million times a day in every country of the world. But it was the specific way in which they chose to attempt to help that is of note here: the scientists were successful in using simple skin cells from the baby to create several embryos that were the precise genetic doubles of the baby.  Then, having created these several embryos, the doctors were able successfully to extract stem cells from them which they will now use to “cure” the baby of its gene-based illness. Not long ago, this would have sounded like science fiction of the most unlikely variety imaginable. And yet it happened just this last week in Portland.

The key principle is that embryonic stem cells have the amazing capacity to turn into any kind of cell within the human body. Healthy ones, therefore, can be used successfully to replace unhealthy, genetically-diseased cells, and that in turn can cure people suffering from the diseases brought on by some specific kind of genetic imperfection in the first place.  Today, these kinds of curative stem cells are generally derived from embryos created in vitro in laboratories from two gametes in a process that is merely the mechanical version of the normal human reproductive process. But those embryos are not—and cannot be—the exact genetic match of anyone at all, just as all babies, because they are the genetic heirs of both their parents, can never be the precise genetic matches of either. And it is precisely because the chances for rejection are far greater when recipient and donor are not exactly matched that this week’s achievement in Oregon, in the context of which a human embryo was created from a single parent, is so important. It apparently no longer takes two to tango.

This is not unlike the way Dolly, the world’s most famous sheep, was created at the Roslin Institute near Edinburgh in 1996. Yet although Dolly was born healthy and grew to ovine adulthood, the Oregon team, led by Dr. Shoukhrat Mitalipov, is insisting that the embryos they created would not be implanted in a human womb and, far more to the point, could not develop into human babies even if they were implanted. They did not say why not, or at least no news source I could locate quoted them as explaining, other than with reference to similar experiments with monkey embryos, none of which traveled successfully through the stages of gestation to birth. Still, this is surely a step in that direction, albeit one in need of further tweaking actually to create human beings from a single gamete. If this is a road down which society does not wish to travel, then this would clearly be the moment—and perhaps the last one, at that—to get off the bus.

The responses to the Oregon announcement were so predictable they could almost have been scripted in advance. The loved ones of people most likely to benefit from the Oregon announcement—people suffering from Parkinson’s disease, Alzheimer’s, spinal cord injury, and diabetes—were thrilled. People who place supreme emphasis on the sanctity of human life were less enthusiastic. The distinction between therapeutic cloning (i.e., cloning for the sake of helping cure sick people of their illnesses) and reproductive cloning (i.e., cloning that has as its point the creation of human life) was trumpeted by those eager to support the former without feeling concomitantly obliged to support the latter, while others—myself included—found the distinction between the two a bit blurry, especially if the specific issue addressed by therapeutic cloning has to do with a couple’s fertility issues and leads to the creation of an embryo that would then be implanted in the uterus of a woman who would then give birth to the baby and become its mother. When put that way, how different is the use of cloning to help infertile couples become parents from the use of in vitro fertilization techniques to accomplish exactly the same thing? In that context, at least in my opinion, the fact that the embryo that grows through gestation to become that couple’s child was created from material harvested from one rather than two parents seems more like a detail than a crucial factor in evaluating the procedure’s moral acceptability. So the child is the genetic match of one parent rather than the genetic amalgam of both. So what?

Sheep have been cloned. Monkeys also have been cloned, as have been goats, cattle, mice, horses, pigs, frogs, carp, fruit flies, rabbits, camels, rats, wolves, and at least one water buffalo. It feels inevitable that, at least eventually, someone is going to figure out how to clone human beings. And it will be specifically at that juncture that society is going to have to decide how to proceed. In the United States, federal law has since 1996 prohibited the use of federal funds for stem cell research that leads to the destruction of embryos. (It was just this last January, however, that the Supreme Court decisively declined to hear a lawsuit intended to forbid the federal government from financing any stem cell research at all.) But although there are no federal laws that ban cloning completely, there are laws on the books in thirteen states that specifically outlaw reproductive cloning. The distinction, as noted above, is not as absolute as it at first might sound. But more to the point is that, in the end, genies escape from bottles and can rarely, if ever, be successfully forced back inside. Somewhat in the same way the world cannot unlearn how to create nuclear weapons although there are many who wish this were not the case, this kind of technology, once developed and proven to work, will not be unlearnable…and people desperate enough to reproduce will always find ways to access the kind of technology they need to become parents. If, that is, that technology is permitted to be developed and perfected.

Society has long since accepted the principle that there is nothing inherently wrong with providing assistance to couples who would once have had no choice but to remain childless. Still, the whole concept of cloning raises the possibility of misuse in a way that “regular” IVF technology doesn’t and, as a result, the real question is how real those possibilities actually are. Yes, it is true that bereaved parents could attempt to create precise genetic replacements for their lost sons and daughters. The relatives of murder victims could attempt to create babies bearing their lost relations’ exact genetic code. Societies could respond to genocide by creating, or rather re-creating, legions of their formerly lost citizens to live on in their stead. Totalitarian governments could create armies of genetically pre-programmed workers possessed of precisely the skills needed to perform some specific job. Of course, none of these cloned individuals would be the people they replaced. They would carry the same genetic code, but neither their memories nor their experiences. They would not be the same people, although, somewhat like Dr. Evil and Mini-Me, they would probably look very much alike…assuming they lived similar lives in similar places, spent similar numbers of hours in the sun or at the gym, and weighed roughly the same number of pounds. But, other than cosmetically, they would simply be possessed of the same genetic potential as their former iterations absent any of the actual results that that potential yielded.  The real challenge facing society in light of this week’s breakthrough, therefore, is to decide how seriously to take any of the above possibilities…and how to weigh them against the right of individuals to reproduce as they wish and can, and of scientists and physicians to seek to cure disease by whatever means suggests itself as feasible.

We don’t say, after all, that deaf people have some sort of moral obligation not—in cases where it is possible—to have surgery to restore their hearing because God created deafness in the world and undoing that aspect of creation would be tantamount to thwarting God’s will. We certainly don’t follow that line of reasoning with respect to the blind or the lame. So why should an individual who simply hasn’t married or found a partner with whom to create life be forbidden by law from reproducing merely because, up until now, the reproductive process has always been a pas de deux and never a pas seul? Things change! In my opinion, the challenge facing society is not to embrace or reject the kind of cloning techniques that can lead to healing for some and, eventually, families for others, but to figure out how to move forward in a way that is consonant both with our ever-evolving moral values and with the rights of any individual in a free society to chart a course forward in life that corresponds to that specific person’s wishes and desires.

One thing is clearly the case: it’s a whole new world out there, and it’s getting newer with every passing year!

Friday, May 10, 2013

Tannhäuser Returns!


When I wrote last week about Jodi Picoult’s novel, The Story Teller, I hadn’t anticipated how chock full of Shoah-related news the days to follow were going to be. And the news was mostly good!

The good news—and it is very good news—was the conviction of the final three of the despicable lowlifes who stole $57,000,000 from the Hardship Fund, a fund established by the German government in 1980 to provide one-time payments to people who abandoned their property when they fled to the Soviet Union from the Nazis but who were neither German citizens nor citizens of countries that had already been occupied by the Germans. (The other twenty-eight indicted individuals simply chose to plead guilty to the charges against them.) And, as though that wasn’t bad enough, they were also convicted of stealing from the so-called Article 2 Fund, a fund established in 1990 following the re-unification of Germany to provide reparations to Shoah survivors who had the very bad luck to end up living in East Germany when the dust settled and Germany was divided into West Germany and East Germany, and who were thus denied any reparations at all by the Communist government. (The Communists of East Germany, instead of owning up to their guilt as Germans, chose to pursue the fantasy that they were actually the victims of the Nazis and not the perpetrators of their crimes.) With these final convictions, a total of thirty-one individuals have pled guilty or been found guilty. Sentencing is still to come, although it’s hard to think of a ring in hell hot enough for people who would participate in a plan to steal from people whose suffering was, even before this final ignominy, incalculable.

The next piece of news, also good, relates to something I wrote to you about a few weeks ago. In that letter, I discussed the work of the German government’s so-called “Z Commission,” more properly called the Central Office of the State Justice Administration for the Investigation of National Socialist Crimes, and its announcement that it had uncovered the names and identities of several individuals who had participated in the murder of millions at Auschwitz, mostly guards at the camp who had never before been identified, let alone indicted for their crimes and tried in courts of law. Since Auschwitz was liberated by the Red Army on January 27, 1945, we are talking about crimes committed almost seventy years ago, and I specifically wrote to discuss with you whether it was just or cruel to pursue nonagenarians this long after the fact. (If you are reading this electronically, you can reread what I had to say about that by clicking here.) And now, only a few weeks later, the game is on with the arrest this week of one Hans Lipschis, the ninety-three-year-old who occupies (or rather, until this week occupied) the number four spot on the Wiesenthal Center’s list of most-wanted war criminals. Mr. Lipschis was arrested at his home in the picturesque German town of Aalen, formerly the hometown of Nazi Field Marshal Rommel, where he has lived since being deported from the United States after an investigation by the Justice Department’s Office of Special Investigations uncovered his Nazi past. Born in 1919 in Lithuania, he admits to having belonged to the SS and to having been stationed at Auschwitz but insists that he was only a cook. The members of the “Z Commission,” who should know, apparently think otherwise. For my part, I think only good can come from trials like the one Hans Lipschis is apparently about to have. There is no statute of limitations for the crime of murder, nor should there be.
The argument, therefore, that if the defendant is really, really old, he should be allowed to die in peace seems to me somewhere between absurd and silly: if there is no statute of limitations for murder, how could there logically be one for mass murder?

And that brings me finally to the scandal surrounding this spring’s production of Tannhäuser at the Deutsche Oper am Rhein in Düsseldorf. I am not a huge fan of Wagner, and not solely because he was later on so beloved of the Nazis. That, in and of itself, says more about them than about him. (Beethoven, after all, to whom no anti-Semitic attitudes have ever been ascribed, was also lionized by the Nazis.) But Wagner was also the author of the infamous anti-Semitic screed, “Jewishness in Music,” which was every bit as much an attack on Jews and Judaism in general as it was “about” the worthlessness of specific composers of Jewish descent like Felix Mendelssohn and Giacomo Meyerbeer, and which became a landmark publication in the history of pre-Nazi German anti-Semitism. On the other hand, Wagner was apparently one of those salon anti-Semites who had Jewish friends and had it in him to like certain specific Jewish people, but who saw no reason to hide his distaste for Jews in general and for Judaism. Still, to lay Treblinka at Wagner’s feet also seems exaggerated. The man died in 1883, six years before Hitler was even born. He was, by all accounts, no more anti-Semitic in his world view than the average German of his day and place. Perhaps more to the point, there are no offensive characterizations of Jews in any of Wagner’s operas. (Indeed, the specific characters sometimes identified as Jewish “types” in Wagner’s operas, specifically Mime in the Ring cycle, Sixtus Beckmesser in Die Meistersinger, and Klingsor in Parsifal, are specifically not identified as Jews.) The whole issue is complicated. Readers who want to know more and who are reading this electronically can read an excellent, I believe fair-minded, survey of the whole issue by clicking here.

And now we have the whole brouhaha surrounding this year’s production of Tannhäuser in Düsseldorf. Tannhäuser was a real person, a historical figure of the thirteenth century remembered as a bard and as a poet. His poetry survives, but far more famous are the legends that surround the poet’s life, and particularly the one that features him first locating the subterranean home of the goddess Venus and then spending a year there worshiping her. Eventually filled with remorse, Tannhäuser—all this according to legend rather than historical record—then travels to Rome to ask the pope to absolve him of his sins. The pope declines, observing that just as likely as Tannhäuser achieving God’s forgiveness after spending a year steeped in debauchery and idolatry would be the pope’s staff sprouting blossoms. Three days later, the pope’s staff does indeed produce such blossoms (just like Aaron’s in Parashat Korach), but by then Tannhäuser has returned home to seek earthly redemption not on his knees before the pope but instead in the arms of his true love, Elisabeth. Wagner’s libretto, which he himself also wrote, is based directly on this legend and thus features a combination of themes guaranteed to interest any opera-goer: debauchery, regret, atonement, rejection, absolution, and redemption. How could that combination of themes not draw audiences?

Tannhäuser premiered in Dresden in the fall of 1845. It has been produced and re-produced countless times in opera houses all over the world, including famously in an updated version in Paris in 1861. (The opera’s American premiere was in 1859 at the Stadt Theater on the Bowery in lower Manhattan.) But there has never been a production like this spring’s one in Düsseldorf. This production, directed by Burkhard Kosminski, sets the story in Nazi Germany. Venus appears as a Nazi officer; her subterranean crypt is recast as a gas chamber. In one especially brutal scene, an entire family—mother, father, and daughter—is stripped naked and murdered on stage. Apparently the scenes were so graphic that some audience members actually required medical assistance after leaving the theater. Others stood up at their seats and booed loudly. Many people walked out. Even more complained to the management that the liberties taken with the libretto made it reasonable to wonder if what was being produced even was Wagner’s opera, even if it featured Wagner’s music.

Watching from the outside, it’s hard to know what to make of this. It’s not at all hard to understand why Germans would prefer to recall the Nazi era as an aberration, as a bizarre departure from the noble culture of pre- and post-war Germany. The thought that the roots of Nazism can be traced back to the nineteenth century—the century during which most of the upper-level Nazi leaders were born, after all—is one thing. But to move beyond that and find the roots of the Nazis’ brutality in the complex of myths and legends that forms the medieval foundation of the very pre-modern civilization from which most Germans would like to think Nazi barbarism was a deviation—that, I can also easily understand, is something most contemporary Germans would ardently want not to believe. Maybe it’s not even so, although books like Daniel Jonah Goldhagen’s make me wonder if that might not be just so much wishful thinking. But what interests me in all this is not specifically the way the audience’s revolt in Düsseldorf has been resolved (the Deutsche Oper announced yesterday that the work would henceforth be performed in concert version rather than as a dramatized stage piece with the singers in costume), but the fact that finally, after all these years, the question of how deep in the culture of modern Germany the roots of anti-Semitism lie appears to have become the question to ask…and, if possible, to answer honestly. The audience’s response to the Düsseldorf Tannhäuser clearly signals that today’s Germans are eager to contextualize their nation’s Nazi legacy. That, surely, is their right.
But to do so in a way that corresponds to history—and particularly to the history of German culture within the broader context of European culture—without falling prey to wishful thinking or to satisfying, but basically groundless, fantasies, that is the challenge facing modern Germany as we approach the seventy-fifth anniversary of Kristallnacht this November and, with each passing year, the horrors of the Shoah slip further and further into the past.

Friday, May 3, 2013

The Storyteller


I am not a literary snob. I am not, therefore, someone who looks down on books that hoi polloi read and enjoy, but that fail to meet my lofty standards of literary excellence. Instead, I prefer to judge books, including mass-market best sellers, based on the degree to which I find them engaging and satisfying to read, and specifically without respect to the author’s pedigree, education, day job, real-life status, or prior accomplishments.

It was in that spirit, in fact, that I bought and read Jodi Picoult’s new novel, The Storyteller. She’s a good example—Danielle Steel and Tom Clancy are others—of people who write hugely bestselling books but who have never acquired the cachet of a “serious” author, the kind of author whose books are taught to undergraduates as opposed merely to being read by them. But there was another reason I was drawn to read The Storyteller, and that had to do with its plot. It is a big hit, that book, currently on both the Times’ bestseller list of hardcover fiction and its list of bestselling e-books. It will be read, at least eventually, not by thousands but by hundreds of thousands, if not more. (In aggregate, Jodi Picoult has about 14 million books in print. Her twenty-odd previous books have been published in thirty-five countries in thirty-four different languages.) As a result, she constitutes her own private voice of America to many out there in the big, wide world. And this novel she has just published, The Storyteller, is therefore going to be what all those uncountable people read and believe specifically about the Shoah.

People who haven’t ever heard of Elie Wiesel, Saul Friedländer, or Daniel Jonah Goldhagen, much less read their books, have heard of Jodi Picoult! And so it was also for that reason that I set myself to reading the novel last week. What countless thousands across the globe are going to know of the Holocaust, I want to know too! (I will be particularly interested to see how the book is received in Germany when it comes out in German translation.) In truth, I know Picoult’s work only slightly. Joan and I once listened to one of her novels on long drives to and from Toronto, and we both found it somewhere between cloying and irritating, and did not come away as long-term fans. But I was more than prepared to give her a second chance this time ’round. If someone with fourteen million books in print is prepared to set a book in the Lodz ghetto and in Auschwitz, then I am prepared to read what she has to say!

I have to say that I was impressed. Not by the literary quality of the book particularly, but by her willingness not to cut corners and to tell her story plainly and clearly. At the heart of the novel, which is told from ever-shifting points of view by four different narrators, are the two chapters narrated by the “main” narrator’s grandmother, an elderly woman named Minka. These chapters, not unlike the two “Shoah” chapters in Vasily Grossman’s Life and Fate (about which novel I wrote to you at length last year), are the axis around which the rest of the plot revolves. And they are, to say the least, harrowing. Even the worst stories of all—the story, for example, of the unimaginable events leading up to September 4, 1942, the day on which the residents of the ghetto were ordered to hand over all children under the age of ten for immediate deportation—even that story is told in detail and without flinching. Or without flinching much. Nor is Minka’s account of her time in Auschwitz told other than in stark, plain prose. Since she lived to tell her tale, Minka’s story was atypical of those who were sent there. But Picoult understands that, or seems to, and bends the story just far enough—but without taking readers actually into the gas chambers in the way André Schwarz-Bart did in The Last of the Just or Herman Wouk did in War and Remembrance or Grossman did in Life and Fate—for readers to get a reasonable picture as well of what fate those not selected for work met upon arrival.

I’ve read more Shoah novels than I can remember the names of. But I found myself engaged by Minka’s account, even when it veered so far into unlikelihood that it was barely believable. To say the same thing differently, the people in the foreground were whoever Picoult’s storyline required them to be, but it was the background that drew me into the book, the stories of the people about whom the book isn’t but who are simply present as the story of the people that the book is about unfolds around them.

Two themes that are featured throughout the book are worth mentioning. One is the theme of forgiveness. I won’t spoil the plot for anyone who may read the book, but the story turns on the question of whether anyone has the right to forgive someone for wrongs done to other people. The k’doshim of the Shoah died in the whirlwind, for example, and are no more. Does that, in and of itself, mean that there can be no forgiveness, no repentance, and no atonement for the perpetrators?  Most of us, I think, would handily agree with the notion that there can never be atonement absent reconciliation with the wronged party. But Picoult moves the discussion onto even less comfortable ground by twisting the plot to make this a point of contention between an ex-nun who represents the Christian notion of forgiving one’s oppressors, of turning the other cheek, and of seeking absolution through confession and penance, and a young self-denying Jew (that is, the child of Jewish parents who insists that she is not a Jew at all, which is—perhaps not irrelevantly—how Picoult describes her own relationship to her parents’ Jewishness in an interview presented on her website) who appears to represent the traditional Jewish disinclination to offer cheap forgiveness for aggression against others.

Neither position fits well. The Christian position is presented simplistically and, in my opinion, oddly. The Jewish position is presented oddly as well, clearly seen through Christian lenses and not especially flatteringly at that. What Picoult is doing—and what she herself says she is setting out to do in her preface—is responding to Simon Wiesenthal’s book, The Sunflower, which is about the efforts of a former SS officer to gain forgiveness after the fact not from the people in whose murder he was complicit (which story is told in horrific detail in the book), but from some other Jewish person—Wiesenthal himself—whom he has arbitrarily selected as his source of potential Jewish absolution. Wiesenthal’s book is one of the more profound works on forgiveness (and particularly on forgiveness in the context of the Shoah) I have ever read and I recommend it to you wholeheartedly, especially in its later editions, which include responses from all sorts of others, including Primo Levi, Desmond Tutu, Albert Speer, and the Dalai Lama. The Storyteller constitutes Jodi Picoult’s answer to Wiesenthal’s question. (My own answer, I believe, is that the gates of repentance are always open—just as tradition teaches us—yet that not all may step through them. And thus is laid the groundwork for the traditional Jewish approach to atonement as well: that the ability to repent oneself of one’s sins and to atone for them is itself a gift from God that must be earned, and that it is perfectly possible not to have earned it. So my response to that part of the book is also equivocal.)

On the one hand, the book for interested parties to read is The Sunflower, not The Storyteller. On the other, I’m willing to guess that an overwhelming majority of Picoult’s readers will never have heard of Wiesenthal or his book, and so may possibly be led to consider reading it not by reading this letter by me to you but by reading Jodi Picoult’s preface to her own book. And we are talking, at least potentially, about a lot of people.

Also running through the book is an odd countertale about vampires. Presented in the book as a story written by Minka—one of the most ridiculous parts of the storyline features Minka, who just happens to speak perfect German, as a prisoner in Auschwitz gaining all sorts of favors from one of the upper-level Nazis employed there because he likes to listen to her read from her manuscript—the actual tale is chilling and, in its own way, interesting. If I understand the concept correctly, we are supposed to understand that there are people—the undead in our midst—who are congenitally programmed to devour their brethren. Is the point that Nazis, like vampires, had no choice? Precisely the opposite point is argued throughout the novel, yet the vampire story moves forward throughout the whole book and only ends when the vampire himself stops destroying because he himself is destroyed. Of course, the undead cannot really die, and so…we are left wondering if and when the story will recommence.  Somewhere in there is a Shoah parable, something to do with the impossibility of eradicating anti-Semitism because of the degree to which it is embedded in Western culture. (In this regard, readers would do better to read University of Chicago professor David Nirenberg’s new book, Anti-Judaism: The Western Tradition, which I read earlier this spring and found very worthy and interesting.) But the whole vampire story is distracting and, in the end, even I wasn’t sure what exactly the point of telling it was. It was not, however, intended to make readers reeling from the impact of the worst of Picoult’s Shoah stories feel any better. As well it will not!

We who have accepted upon ourselves the sacred task of keeping the memory of the martyrs alive cannot be sorry that a major popular author has written a book that will introduce the basic story of the Shoah to countless readers who might otherwise know little or nothing about it. The story as told is unlikely in the extreme—Minka’s granddaughter breaks off her affair with a married mortician because she ends up falling in love with the agent from the Justice Department’s Office of Special Investigations who is sent to her idyllic New Hampshire town when she comes to believe that she has inadvertently befriended one of the SS officers, now living in hiding half a century later, who tormented her grandmother and murdered her best friend—but it is compelling in its convoluted way. The vampires are intriguing, if under-explained. The whole plot is a bit silly—the theme of baking is also featured very prominently both in the real book and the book-within-a-book about vampires—but the Shoah passages are truly harrowing and, within the limits of mass-market fiction, accurate enough.

The short answer is that I didn’t love the book. But I love that the book is out there, that thousands upon thousands of Americans and readers in other countries will read it and be moved by Minka’s story and by her granddaughter’s. And if some of those readers are moved to read Wiesenthal’s book or other historical accounts of the Shoah, then Jodi Picoult—even despite her glib disavowal of her parents’ Jewishness—will truly have done something of merit for Jews everywhere.