Thursday, February 16, 2017

Playing God

It’s funny how some events in the course of human history become universally understood as watershed moments and the individuals connected with them become correspondingly famous. The invention of printing with moveable type by Johannes Gutenberg in the middle of the fifteenth century is a good example: he’s famous, his invention is famous, and the shift from handwritten to printed books is widely understood as a true threshold in the development of world culture. You could say the same thing about James Watt’s perfection of the steam engine in 1781 or Alan Turing’s description in 1936 of the so-called Turing Machine, the theoretical device on which every modern computer ultimately rests. All famous men, all well-known dates of famous events.

But other events fall away, as do the people connected with them. The real inventor of printing with moveable type, Song dynasty inventor Bi Sheng, is known to almost nobody today. Thomas Newcomen, the inventor of the steam engine that Watt was able dramatically to improve, has long since been forgotten by all but historians of science. Charles Babbage, the British polymath whose 1822 “difference engine” was the forerunner of the computer, remains an obscure figure to most. My point in mentioning these three names is not to suggest that the people mentioned in the first paragraph don’t deserve their fame, which they all surely do. Rather, my point is to show how difficult it is to see these events when they are actually happening and to recognize them as momentous. Indeed, despite the fact that all three of the mostly-forgotten persons mentioned here—Bi Sheng, Thomas Newcomen, and Charles Babbage—managed materially to alter the course of human history through their work, all were eclipsed later on by the perfectors of their efforts not because the latter schemed to deny their predecessors their due but because, when the world finally got around to noticing that it was standing at a threshold moment, the people in the first paragraph were standing in the right place at the right time and not the people in the second.

Nor is it easy to notice when society has crossed a developmental line back across which it will never be able to step. And, indeed, all sorts of things that felt momentous in their day were proven later on not quite to be the breakthroughs they seemed at the moment to be. I remember buying my first music CD and thinking that music would never be the same again. But that was then…and now the introduction of the music CD in 1982—for the record, a Philips recording of Claudio Arrau playing some Chopin waltzes—feels like a bridge between cassette tapes and the kind of audio files that seem to exist without physical space and which simply fly on command through the ether into the machines devised to play them.

And now I get to the real subject of this week’s letter: the joint announcement the other day by the National Academy of Sciences and the National Academy of Medicine that they formally approve of the effort to modify human embryos by altering the genetic code that carries the heritable traits those embryos, once they become people, would otherwise pass along to future generations. It’s hard to know what to do with such an announcement. Is this one of those pivotal moments in world history that will be remembered as a real turning point in the development of human society, as a real break with the past? Or is it just a breakthrough moment in terms of human attitudes towards a specific kind of scientific research…but not a true threshold moment in the history of humankind? That is the question I’d like to explore this week.

The academies noted their approval only with respect to certain specific kinds of research, the kinds designed to enable the deletion of genes that cause “serious diseases and disabilities.” And even that is to be considered acceptable only when there exists no reasonable alternative to eradicating the disease by altering the genetic code of those who bear it into the world.

It feels unlikely, however, that the kind of discipline necessary to keep faith with those two strictures will be maintained for long. For one thing, the terms in play—the “reasonable” in “no reasonable alternatives” or the “serious” in “serious diseases or disabilities”—are open to a very wide range of definitions. Yet, even with that caveat, there surely are diseases that all would agree qualify as “serious” threats to health and disabilities that no one would think twice about referencing as “serious” disadvantages to the people obliged by circumstance to deal with them. It’s hard, for example, to imagine the argument against doing whatever it takes to eradicate Huntington’s chorea, a terrible affliction that leads through horrific disability to eventual death. And if there are unfortunates who carry the genetic code for that specific disease, but from whose gametes could be created an embryo altered specifically not to bear that code and therefore not to have to fear the disease and its awful consequences or to risk handing it down to future generations—it’s hard to come up with a cogent argument against helping such people rid themselves and their descendants of a horrific genetic curse.

And yet there are those who look with disfavor on this kind of research, fearing that the moral and ethical brakes they deem requisite will simply not be applied by all. And, indeed, the specter of “designer babies” is something that really should give us all pause.

Due to the development of something called CRISPR-Cas9, the concept is not as far-fetched as it once was. The first part stands for Clustered Regularly Interspaced Short Palindromic Repeats. The second part, Cas9, is CRISPR-associated protein 9, an enzyme that somehow has the ability to act as a kind of molecular scissors capable of “cutting” a strand of DNA at a specific point in the genome so that it can be deleted or adjusted. Come again? I’ve been reading websites all week looking for a simple explanation. No luck on that front! Still, to read the best (and, yes, the simplest) explanations I could find online, click here and here. Really, you need a background in molecular biology even to begin to get how this works, but the ethical issues do not inhere in the science and it should be more than enough for laypeople like ourselves to understand that CRISPR-Cas9 is a genome-editing tool that works well enough for scientists seriously to be on the verge of learning how to alter the genetic code of the pre-born.

From a certain vantage point, you could argue that the ethical concerns that so worry so many are being overstated. After all, we all do what we can to help our children succeed in life! We specifically do not teach our kids just to accept their weaknesses and inherent shortcomings, and to leave it at that. Instead, we do what we can to help them succeed and consider it irrelevant if their eventual performance only comes after long hours of training, practice, rehearsal, study, exercise, etc. So why exactly shouldn’t, say, tone-deaf parents ask a scientist to alter their genetic code to include the gene for musical excellence for future generations to enjoy? Yes, of course, that sounds a bit frivolous. But the arguments against sound just a bit puritanical (and I mean that in a negative sense): if a child overcomes a natural, genetically-based disability through hard work, perseverance, and dogged tenacity and dedication, we consider it praiseworthy. But, and here we wander onto ethically thinner ice, if the means of overcoming some specific innate, inborn obstacle comes from without—from a friendly genetic engineer altering the child’s potential skill set to delete the specific traits that would hold him or her back from succeeding in that very same arena—then we consider that to be unfair and morally suspect. It feels that way even to me! But more difficult by far is saying exactly how those two means of helping a child to excel differ ethically.

Yes, one avenue will be available to the wealthy before it trickles down to the middle class, let alone those who live in poverty. But in a society in which the same could be said of a thousand other things—SAT prep courses, the kind of personal training that leads to athletic excellence, private music or art lessons, summers spent in camps devoted to the cultivation of the specific skills necessary to succeed, travel to distant lands to learn languages or some skill available in that specific place—it feels odd suddenly to climb up onto a high horse with respect to this specific means of helping children succeed. Don’t we specifically not care that the wealthy can provide more for their children than the poor? We certainly behave that way in most other contexts! And to tell the child of well-off parents that he or she can’t be helped to overcome some congenital inability to succeed because of his or her parents’ wealth also seems a bit perverse. Isn’t helping some children better than helping none?

And yet I also see the other side of the coin…and clearly. There surely is something unsettling about the notion of altering the genetic code that yields the diversity that now characterizes human society. But to oppose scientific research that could eventually assist people in ridding society of gene-based diseases and defects seems impossible to justify morally. So perhaps the real question before us is not whether the report of the National Academy of Sciences and the National Academy of Medicine is right or wrong to support the latter while strenuously arguing against using this kind of technology to improve the lot of future generations other than by ridding them of terrible diseases or defects, but something incredibly more difficult to decide: if it were to be so that this particular genie, once out of the bottle, would be impossible to force back inside…then would the notion of ridding the world of Huntington’s or Tay-Sachs disease or beta thalassemia be worth the risk of scientists, both at home and abroad, crossing the line to create people who are better than they might otherwise be in other ways as well?

To condemn the possibility of altering the genetic make-up of embryos as “playing God” requires having a clear sense in mind of what that thought even means. Every significant medical breakthrough has altered the world God made in a profound way that could reasonably be qualified as unnatural. Yet none of us regrets the eradication of smallpox or would dream of arguing that Edward Jenner was “playing God” in 1798 when he developed the world’s first effective vaccine against any disease at all. But wasn’t he doing just that?


It seems to me that we are crossing a huge threshold with the report of this last week endorsing the kind of research into the alteration of the genome that we both eagerly await and reasonably fear. Is it worth going forward and merely hoping for the best? Should we shove this particular genie back in the bottle and throw it into the sea?  If you want a clear answer, ask a potential parent who carries the Huntington’s chorea gene!

Thursday, February 9, 2017

Extraordinary Popular Delusions and the Madness of Crowds

There must be something of the nineteenth century in my character, given the number of nineteenth-century titles on my list of books that I’d say materially altered the way I think about the world. Some titles, you could probably guess on your own: Moby Dick, Huckleberry Finn, The Scarlet Letter, Leaves of Grass, and Tolstoy’s Resurrection are all there, although possibly not in the order you’d have predicted. But those are all works of fiction—fiction at its most sublime, yes, but fiction nonetheless—and there are non-fiction books on my list as well and among them is the book I wish to write specifically about today, Charles Mackay’s Extraordinary Popular Delusions and the Madness of Crowds.

It’s a remarkable book, even 176 years after it was first published in 1841. And it had a profound effect on me, one that altered my thinking in every way, even theologically, by bringing me to the realization that truths can elude almost everybody, that things that everybody “just” knows can just as likely be false as true, and that falsehoods can easily masquerade not merely as true statements but almost as societal axioms—that is, as the kind of “common knowledge” facts that people are made to feel foolish even to question, let alone to deny. It’s a big book (almost 700 pages in the edition I own), but it’s well worth the effort and the time necessary to read—indeed, almost every chapter is eye-opening and interesting. Mackay was a Scot who spent part of his youth in Belgium and most of his working life in England, where he earned his living as a journalist and editor without ever losing his predilection for other kinds of writing. He was apparently the first to compile a dictionary of the language then called “Lowland Scotch,” the dialect spoken in Lowland Scotland in his day. And he wrote Extraordinary Popular Delusions and the Madness of Crowds, his masterpiece.

One by one, the author goes through beliefs that were either current in his own day or in some earlier time and shows how they achieved nearly universal credence despite the fact that there was no convincing evidence—and often no evidence of any sort at all—to support them. Let me quote the opening passage from the preface to the 1852 edition:

In reading the history of nations, we find that, like individuals, they have their whims and their peculiarities; their seasons of excitement and recklessness, when they care not what they do. We find that whole communities suddenly fix their minds upon one object, and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion, and run after it, till their attention is caught by some new folly more captivating than the first.

And then he goes on to demonstrate that, to cite his own words, “men…go mad in herds, while they only recover their senses slowly, and one by one.”

Mackay and his book have repeatedly come to my mind as I have been contemplating the nationwide brouhaha concerning the President’s Executive Order barring refugees from everywhere but Syria from entering our country for the next 120 days, refugees from Syria from entering indefinitely, and immigrants from seven predominantly Muslim countries from entering for the next ninety days. It may seem odd to reach back to a book written almost two centuries ago for insight regarding events happening now, but I have to say that I can’t recall ever hearing more people say more things that they somehow “just” know to be the truth without bothering to say how exactly they know themselves to be right, let alone unarguably so. And the more such “facts” are bandied about as though they were not groundless assertions but self-evident truths, the more I regret that Mackay isn’t around to prepare a twenty-first century edition of his book.

The President’s ban has maddened people because it was apparently promulgated without being formally vetted in advance by officials at the Justice Department or the Department of Homeland Security. I’m hardly an expert on these things, but that feels like a huge misstep: the people responsible for enforcing the President’s directive on the ground should probably have been given maximal, not minimal, time to prepare. But the specific problems connected with enforcing the two bans are not really the issue here…and it is precisely outside the issue of how exactly to enforce the ban that people on both sides seem to be campaigning for a place in an updated edition of Mackay’s book.

For people who support the Executive Order, the challenge seems clear. We are surely all in agreement that our government should not admit terrorists or criminals to our country merely because they present themselves as peaceful immigrants or refugees. And so, that being the case, the only convincing argument in favor of a ban on entering our country on the scale of the President’s Executive Order would logically have to be that the system to vet would-be refugees and immigrants that we already have in place is not working properly and that, time and time again, those charged with keeping our nation safe have failed to recognize dangerous, or potentially dangerous, people for what they are and so have naively and ineptly admitted them. That argument sounds persuasive, but it needs to be grounded in reality. Where is the list of those bad people we inadvertently let move here? Where is the list of terrorist acts, ones actually carried out or thwarted by law enforcement officials before they could be carried out, that people whom we incompetently let cross the border into our country either did manage to pull off or else clearly intended to pull off? If we have been screening people applying to enter our country ineffectually and inexpertly, where is the proof of that incompetence on the part of the very people being paid to keep us safe—proof that could only really take the form of a list, long or even short, of bad people who somehow slipped through despite officials’ best efforts to prevent them from doing so? But there is no such list…or at least no such list has been published that I have seen.

That being the case, all those people insisting that the system is broken need to be asked a simple, pointed question: if the system is letting terrorists and criminals slip into our country, why can’t you list some of their names as proof positive of your assertion? And if the system isn’t actually broken, why do we need to fix it?

But the people on the other side of the aisle have their own unanswered questions to address…because so many assertions coming from the “opposed” camp also seem unsubstantiated and naïve. The President’s Executive Order is not a ban on Muslims per se, as is more than adequately demonstrated by the fact that there are dozens of Muslim nations not on the list and whose Muslim citizens are, therefore, not affected at all. Nor are non-Muslim citizens of the countries that are on the list free to enter our country: there was a story carried by the Jewish Telegraphic Agency just the other day about Jews from Iran and Yemen whose visas were cancelled before the Executive Order was put on hold by the courts. So it’s hardly true that the President has banned Muslims from entering our country…and yet I have heard and read people say exactly that now for days and days as though it were a self-evident truth.

Moving along, the assertion that we don’t have anything to fear from radicalized Muslims seems, to say the least, naïve. Perusing the Wikipedia page on “Islamic terrorism” (click here), what shocks me most is how many of these incidents—instances of barbarism that have led to thousands of deaths even just in the last twenty years—have been almost totally forgotten or are at least not regularly referenced in public discourse or in the press. So when people say that the President is behaving irrationally by worrying about the special security issues related to the admission of Muslim refugees or immigrants to our country, that too strikes me as naïve. The key, I think, is to avoid careening away from thoughtful caution and intelligent watchfulness towards xenophobia and the kind of blanket condemnation that makes it harder, not at all easier, to identify the bad guys: all Muslims are surely not terrorists, but there are large, well-funded groups of radical Islamicists out there who express themselves through violence and terrorism…and the foundation of whose worldview is precisely their particular version of Islam. Particularly bizarre, I should add, is hearing Jewish people who claim to feel a deep sense of allegiance to Israel—including, I am ashamed to say, some rabbis—who appear to feel called upon for some obscure reason not to take note of the phenomenon of radical Islamicist terrorism in the world and, just to the contrary, to brand as racist anyone who does. These people too deserve a chapter in Mackay’s book.

Our world would be a lot easier to negotiate if the prerequisite for being quoted in the press or appearing on television were that you had read Mackay’s book and internalized its lessons. The basic facts in evidence are not only clear, but more or less universally agreed upon. All Muslims are not terrorists, and people who claim otherwise are simply wrong. There being versions of Islam that do promote the concept of worldwide jihad and whose adherents regard terrorism directed against innocents as fully acceptable, however, we need to guarantee that no Muslims admitted to our country are future terrorists who subscribe to the version of Islam that animates ISIS, Al-Qaeda, Hezbollah, Hamas, Al-Shabaab, Jemaah Islamiyah, Islamic Jihad, and the like…and people who do not see the cogency of that obligation really do belong in Mackay’s book as well. If the system we have in place to vet people from other lands who seek to enter our land to visit or to settle is not working, it needs to be fixed. But the burden of proof in that regard would normally have to rest with the people making that assertion…and just asserting it without being able to present any evidence to bolster the claim is also worth a mention in the next edition of Mackay.


It makes no sense at all to talk about excessive diligence in keeping our country safe and our fellow citizens secure—if we were talking about keeping your children safe, would you recognize a level of “excessive” diligence? On the other hand, a former president of our congregation, a physician (and we’ve had several), once pointed out to me that doctors can cure any disease if it’s not considered crucial that the patient survive the curative procedure, but that this is generally not considered the very best way to practice medicine…even despite the 100% cure rate.

Thursday, February 2, 2017

Subjective Remembrance

Of all the various events of the last ten days, saying which will have the most lasting effect on our national character—or our nation’s image abroad or its sense of itself at home—would be, to say the very least, challenging. But saying which event of that same time period was the most grotesque is actually simple: surely, it would have to be the spectacle of so many eager to take sides loudly and vehemently in response to President Trump’s brief statement on Holocaust Remembrance Day, the United-Nations-sponsored memorial day scheduled each year since 2005 for January 27, the day in 1945 that the Red Army liberated Auschwitz.
The statement itself was innocuous enough. (I wonder how many of those who commented on it at such length and with such passion actually read it. Surely some…but also surely not all!) Because it was so brief, I would like to cite it here in its entirety:
It is with a heavy heart and somber mind that we remember and honor the victims, survivors, heroes of the Holocaust. It is impossible to fully fathom the depravity and horror inflicted on innocent people by Nazi terror.
Yet, we know that in the darkest hours of humanity, light shines the brightest. As we remember those who died, we are deeply grateful to those who risked their lives to save the innocent.
In the name of the perished, I pledge to do everything in my power throughout my Presidency, and my life, to ensure that the forces of evil never again defeat the powers of good. Together, we will make love and tolerance prevalent throughout the world.
A Martian visiting Earth and being presented with these paragraphs would probably find them moving. A wave of horrific violence, correctly characterized as one of unfathomable depravity, engulfed the world and took the lives of countless innocents. Yet even in the context of such horror, there were those who chose to risk their lives to save at least some who would otherwise have surely been killed. And in response to those two thoughts—the loss of the many and the heroism of the few—our national leader pledges to devote both his years in office and the rest of his life to the effort of guaranteeing that the forces of evil never triumph over the powers of good, and that tolerance and love prevail in their place.
The response, however, was not as the President had surely expected or wished, and for one single reason: the omission of the detail that the primary victims of Nazi genocide were Jews, not “just” innocents chosen at random from the universe of the guiltless, struck many as vaguely sinister and not at all the kind of thing reasonably waved away as a function of mere naiveté. And that single fact—the President’s failure to identify the Jewish people by name in his statement—generated the storm of criticism that ensued, some of it thoughtful and some of it beyond shrill. 
As the days passed, new details emerged, among which the most arresting was that the statement, which I don’t suppose anyone imagined President Trump himself wrote, was actually penned for the President by Boris Epshteyn, who came to this country at age ten as the child of Soviet Jewish émigrés and is now a White House special assistant. But the author’s Jewish bona fides did little to suppress the anger over the perceived insult. In some ways, in fact, that detail only made people who were already angry even angrier.
There is no doubt that the Jews were not the Nazis’ only victims and the numbers of non-Jewish victims are both numbing and appalling: half a million Serbs, almost two million Polish civilians, almost three million Ukrainians, somewhere between two and three million Soviet P.O.W.s, a quarter of a million Gypsies, another quarter of a million mentally-handicapped individuals, hundreds of thousands of others (gay people, Jehovah’s Witnesses, Freemasons, and Catholic priests), and more than thirteen million Soviet citizens (including the 1.2 million people who died during the siege of Leningrad alone between 1941 and 1943). And yet…it is also true that it was only the Jews who were the intended victims of genocide itself, the term used to denote the intentional effort to annihilate an entire people and to leave no survivors at all. And that is where things get confusing: it is surely so that the Germans never intended to murder every single Pole or every Soviet citizen, just to bring those nations to their knees by decimating the population and thus weakening the national resolve to resist German rule. (The situation of the mentally handicapped is more complex, since the Nazis probably did intend eventually to rid the world of mental illness by murdering the entire mentally ill population…and yet that program, called Aktion T4 because it was headquartered at Tiergartenstrasse 4 in Berlin, was in the end only used to kill German citizens and was not extended into occupied countries. Nor does it seem quite right to characterize mentally ill people as a nation that even could be the victim of genocide.)
And so we are left between a rock and a very hard place: not wishing to sound dismissive or unfeeling with respect to the countless non-Jewish victims of the Nazis, men, women, and children whose suffering was not only real but in many ways and details just as horrific as the misery inflicted on the Jews of Germany and Nazi-occupied Europe…but also not wishing to look past the fact that the Shoah itself—the Nazi war against the Jews—was a unique event both in world history and, needless to say, in Jewish history as well.
The figure of 11 million victims of the Nazis is probably incorrect—there is some evidence that Simon Wiesenthal came up with it himself without relying on the soundest of scholarship—but nitpicking about the number seems unworthy. (For a detailed account of that number and Simon Wiesenthal’s role in devising it, click here.) Nor is it a number without its own place in the history of Shoah memorialization: in establishing the U.S. Holocaust Memorial Museum in Washington, then-President Jimmy Carter made reference to the 11 million victims of the Holocaust and there was, as I recall, no particularly vocal response at all. That figure appears all over the place as well, including, as recently as last week, on the Facebook page of the Israel Defense Forces’ spokesperson’s unit. And I am personally aware of rabbis who regularly reference the eleven million victims of the Holocaust as though it would be unseemly to note the Jewish victims without folding the others into the batter so as not to appear concerned solely with Jewish suffering. I can follow that line of thinking easily. And the thought of turning the Shoah into some sort of ghoulish contest—and “ghoulish” would be to say the very least—to see who suffered more grievously or in larger numbers at the hands of the Nazis and those who chose to collaborate with them—the thought of entering into that kind of calculus of agony with other victims’ groups to see who wins the right to claim the more horrific fate under the Nazis seems revolting to me.
Under normal circumstances, no one would care. I myself, whose entire adult life has in a sense been guided by the self-imposed need fully and deeply to internalize the details of the Shoah and its monitory message for my own generation and my children’s—even I can’t say with certainty that I would have reacted particularly negatively to the President’s remarks under normal circumstances. It was, I think I would have thought, impressive that the President even took note of Holocaust Remembrance Day, let alone bothered in the course of his first week in office to issue a formal statement in which he pledged to spend both the years of his presidency and the rest of his life after leaving office—a bit over the top, perhaps, but that’s what the man said—combatting the forces of evil exemplified by the Nazis.
But, of course, these are not normal times and we are not operating under normal circumstances. The presence among the White House staff of people who have been openly associated with anti-Semites, the open use of anti-Semitic slogans and graphic memes by the Trump campaign, the President’s own repeated, jarring use of the “America First” slogan in his Inaugural Address without any apparent awareness of the set of memories those words would awaken for an entire generation of Americans and particularly for American Jews (for a brief history of the “America First” slogan, click here), and, most of all, the resurgence of the kind of rhetoric with respect to immigration that characterized our nation at its moral perigee during the FDR years, when the gates remained shut even to children, let alone to adults, facing unfathomable torment and almost certain death—all of that provides the backdrop against which the President’s statement calls out to be read. And when considered against that background, the statement that the Martian I mentioned above would find both innocuous and moving feels, to say the very least, unsettling.

I remember visiting the Anne Frank House in Amsterdam in 1977 and being shocked to discover that Anne’s Jewishness was left almost completely unmentioned in the displays on exhibit. That was my first experience of the Shoah universalized to the point of meaninglessness, of the effort to make the Shoah about oppression in general and not about anti-Semitism in its most extreme guise, of the notion that there was something at least slightly morally suspect in defining the Shoah as the apotheosis of rabid anti-Semitism and not, far less specifically, as an example of prejudice or extremism. That was my first taste of that specific kind of anti-Jewishness, but not my last. I’d like to think that the President’s remarks were unfortunately but not maliciously phrased, that the omission of any reference to the Jewish people was a mere oversight by a naïve aide, that the fact that there even was a declaration at all is what we should be focusing on…and not its specific wording. I’d like to think all those things! But whether that option will still be tenable a year from now—that is the real question for Jewish Americans, and for all fair-minded citizens, to contemplate as we move into the first months of the Trump administration.