Thursday, September 14, 2017

Dadbots


While perusing the website of the monthly technology magazine, Wired, the other day, I came across a video by James Vlahos, currently one of their own staff writers but formerly a contributing author at the New York Times, GQ, Popular Science, Popular Mechanics, and National Geographic Adventure. It was an interesting experience, watching that video—which is just slightly over seven and a half minutes long—but also a slightly disorienting one. Part of me was intrigued. Another part of me was envious. And still another part of me was at least slightly appalled. And so, figuring that anything that triggers such a strange set of such conflicting emotions in me will likely be worth presenting to my readers in this space, I decided to write this final time before Rosh Hashanah and to present Vlahos’ work to you.

The video is about his creation of Dadbot, which was his name for the chatbot he created while he father was dying so that he could continue to converse, sort of, with his father after the latter died. To see the video and hear its creator describe his project, click here.

To appreciate this accomplishment, you have to understand the concept of a chatbot. But even if the term will possibly be unfamiliar to most, the concept is all too well known to all of us who have telephones and endlessly receive sales calls on them. Some are just recordings. (I hang up immediately.) Some are actual people. (I wait for the speaker to catch his or her breath, then I politely ask them never, ever, to call me back. And then I hang up.)  But the most sophisticated robo-callers are chatbots—computer programs that are programmed to respond to you as though they were human beings listening to what you have to say and responding appropriately and even colloquially. Occasionally, I’ve been taken in. I suppose we all have. Just the other day, for example, I picked up the phone and a woman’s voice said that she was Jennifer and that she had an important message for me. She paused. I asked, I thought cleverly, if she was a person or a machine. And then, when I stopped speaking, she responded entirely reasonably, “Of course, I’m a person.” Something about her intonation made me suspect that she was a chatbot, however, so I asked her if she could further prove her humanness by reminding me what state Tallahassee is the capital of. She responded by repeating that her name was Jennifer and that she was calling with an important message for me, and with that our brief relationship, such as it was, ended with me hanging up the phone. I’m sure Jenn will eventually give me another try, though, possibly after reviewing her state capitals.

The earliest chatbots were developed in the 1960s, and they have become increasingly more sophisticated since then with every passing decade. But all chatbots, from the least to the most sophisticated, have at least one thing in common: they are essentially elaborate parlor tricks designed to make you feel that you are speaking to a human being, not instances of machines being endowed with the digital brainpower actually to engage in what any of us would normally call a “real” conversation, the kind in which one party speaks and the other party understands what was just said and then responds intelligently but in a non-predetermined way. Almost all chatbots use language-triggers to develop dialogue, for example by listening for the word “father” in the human speaker’s remarks and then responding, “So interesting…but please tell me more about both your parents.” But none actually thinks. Or, to use the term the way we normally use it daily discourse, speaks. Not really!

James Vlahos’ father, John James Vlahos, died just this last February. They were very close, and there are very touching moments on the video where James has to pause because he is simply too choked by emotion to continue. But whereas most of us somehow make our peace with the dead being beyond meaningful communication, Vlahos decided to respond to that thought by creating a chatbot featuring his father’s voice.

Before their father died, Vlahos and his siblings undertook an oral history project in the course of which they recorded a dozen hours of their father’s reminiscences regarding his childhood, his family, his career, his marriage, his children, and his life. They also took voluminous notes, which effort yielded about two hundred pages of extra material. Plus, of course, Vlahos and his siblings knew their father for decades and could easily imagine him responding to specific questions with specific expressions that he like to use and said all the time.  But none of that would have mattered much if Vlahos hadn’t been able to bring his technological training to bear—and specifically his ability to use an artificial intelligence computer program called PullString. (For more about PullString, click here.)  And so, using that specific program to bring together thousands of sentences his father actually uttered and to match them to appropriate word-triggers, he created Dadbot, a chatbot capable of playing the role of his father in an ongoing dialogue left unimpeded by the detail that one of the dialogists was gone from the world. Plus, Vlahos had a large store of stories and favorite songs recorded by his father over the years in various contexts, and those too he was able insert into the program where appropriate.

Is it meaningful or silly, the Dadbot? His father isn’t really there, of course. Or is he? We “are” lots of things in this world, but surely one of those things is what we say, how we speak, the words we choose to express our thoughts, our mannerisms of language and self-expression. Why is what Vlahos did any less “real” than preserving photographs of our late parents or grandparents? Those pictures aren’t our actual grandparents either! But they preserve the way they looked, not entirely unlike the way the Dadbot preserves the way James Vlahos’s father sounded. And although I suppose you could say the same thing about any recording made pre-posthumously by anyone at all—that it preserves the way that person sounded—this is really so much more than just a cleverly edited recording that it seems to bear evaluation on its own terms. I suppose I’m envious more than anything else because I wish my Dad lived on in my phone the way his father lives on in his. After all, saying that his father lives on in his phone isn’t quite as crazy as saying that his father actually lives in his phone! That, of course, really would be impossible.

I have a few recordings of my father’s voice, but I never listen to them. I’m not even sure why not. I would certainly recognize his voice anywhere. I would love to have a final (or, even better, a not final) conversation with him, and surely hearing his voice would trigger all sorts of associations that are probably lying dormant within me just waiting for the appropriate stimulus to elicit them from my memory banks. I do not have any recordings of my mother’s voice, which I regret. (The obvious paradox of me wishing I had recordings that I don’t have and not using listening to the ones I do have will for the moment have to remain unresolved.) I hear my father’s voice all the time, of course. Just I hear it inside my head, where his ghost plants them, not in my ear courtesy of a Dadbot. Is that a profound difference? It is! (Or is it?)

And so we come to our High Holiday season, which we celebrate with our families living and long gone as we all crowd into the sanctuary to participate in the services that, more than anything, awaken in us a sense that the distinction between time past and time future dissolves in the flow of associative memories that our prayers—and particularly the most ancient ones—call up easily in every Jewish breast. Maybe that’s the reason I find the whole concept of a Dadbot so resonant—not because I don’t wish I had one (which I sort of do), but because, in the end, I don’t need one, just as none of us really does…and particularly not at this time of the year when our parents are with us either in body or in spirit, when their parents, grandparents, and great-grandparents are palpably present even in their physical absence, when we pause to notice how comfortable we have grown over the years feeling the air overhead heavy with the spirits of all our ancestors as we gather in our sanctuary on these High Holidays, and particular (of course) during Yizkor. So who needs our parents to live in our phones?

We sometimes lose track of the fact that when technology mimics life, it doesn’t need also to replace it. I remember when my daughter Lucy, then a little girl, was amazed to learn that it’s possible to play solitaire without a computer. I suppose it’s possible my great-grandchildren, please God, will find it surprising that it’s possible to read a book without having to plug your book-reading-device in first. Or that it’s possible to determine if it’s raining outside without using any data at all. Or that you can achieve same-day delivery of purchases—and for free—by taking yourself physically to an actual store and buying something there in person.  Or that you can commune with your late parents without a phone, without any expertise in PullString, and without any actual digital programming skill at all simply by coming to synagogue on Rosh Hashanah, opening a Machzor, and allowing its ancient words to make fall the scales from your eyes and to allow you to see the world of the living and the dead as it truly is.

Thursday, September 7, 2017

Billy Joel and the Yellow Star


A few weeks ago, Billy Joel surprised his audience at Madison Square Garden by returning to the stage at the end of a concert wearing a yellow star specifically tailored to resemble the ones the Nazis forced Jews in occupied Europe to wear. Clearly, the point was to make a statement—a stark, wordless one, but one that would (and did) get the attention not only of his audience at the Garden but of the wide world beyond the arena’s walls as well—about the rising tide of white supremacism, neo-Nazism, and anti-Semitic and racial intolerance in our American republic.  As wordless protests go, it couldn’t have been more well-timed: the nation was still reeling from the sight of white nationalists, neo-Confederates, and undisguised neo-Nazis marching in Charlottesville while carrying semi-automatic weapons, waving Confederate and Nazi flags, and chanting overtly anti-Semitic slogans when Billy Joel donned his star at the Garden not even two weeks later.

The response to Joel’s gesture was mixed. In the non-Jewish media, it was generally lauded as a dramatic non-verbal statement about a serious national issue by a personality who found himself in the right place at the right time to make it. TMZ, the celebrity news website, referenced it as “a bold statement about the times we live in.” Billboard referred to it as “a powerful political statement.” MSN, The Microsoft Network, said Joel’s gesture was “a strong statement against the growing Neo-Nazi and White Nationalism movement.” People Magazine called it a “strong statement” against intolerance.

The response in the Jewish media was far more equivocal. 

Andrew Silow-Carroll, writing on the Jewish Telegraphic Agency website, focused almost exclusively on his fear that Joel’s gesture, no doubt heartfelt and sincere, might accidentally trigger an unfortunate trend: “I don’t think anybody wants the yellow star to become this year’s AIDS ribbon or Livestrong bracelet,” he wrote. “The wearing of the yellow star seems the kind of gesture that can be made once, or sparingly, lest you diminish its shock value or begin to insult the experiences and memory of the people who are purporting to identify with an honor.”  But that dismissive response qualifies as restrained and measured when compared to the response of Stephen Pollard in the Jewish Chronicle, the U.K.-based newspaper of which he is editor, who labelled Joel’s gesture “crass, infantile, ignorant, stupid, and offensive.” And that was just the headline. Later on in the piece, he explains his position in slightly more detail: “[You] do not express your pride in being Jewish, or your revulsion against hate, by donning the Nazi yellow star as a fashion statement of that supposed pride. All you do is insult those survivors who lived through the Shoah, and who did not wear their yellow stars to draw media attention to themselves but because they were forced to do so by the Third Reich.” Nor was Pollard at all impressed when Nev Schulman, an actor and the producer of the popular MTV television show Catfish, showed up at the MTV Movie Awards wearing his own yellow star, a gesture that prompted Pollard to label him a “half-wit” and which only seemed to confirm Silow-Carroll’s fear that the yellow star could yet become a widespread symbol of opposition to intolerance. 

Other Jewish responses varied.  A piece in the Forward earlier this week by the anonymous blogger who writes as Jewish Chick described herself as “flabbergasted, outraged, and frankly puzzled,” by Joel’s and Schulman’s gestures. “For myself,” she wrote, “and [for] many others, [the gesture of donning a yellow star] represents a slap in the face for [sic] those who perished during and [those who] survived the Holocaust, no matter what the intent.” On the other hand, Aryeh Kaltmann, a Chabad rabbi writing on the Algemeiner website, labelled Joel’s gesture as “an inspiring surprise” and explained himself as follows: “By boldly wearing the startling image of the star that the Nazis forced Jews to wear during the Holocaust, Joel was decrying anti-Semitism in particular—and, by implication, racism and other forms of hate.”

I think Rabbi Kaltmann got it right. Yes, it was shocking to see Billy Joel (who has hardly worn his Jewishness on his sleeve in the course of his many years of fame) appearing on stage willingly wearing something that symbolizes the barbarism of Nazi intolerance and anti-Semitism. But isn’t that the point of dramatic gestures in the first place, that they trigger emotions in the people who see them that might otherwise have lain dormant?

I’ve read in many places that there is no apparent historicity to the story I heard a thousand times as a child about how Denmark’s King Christian X chose to express his solidarity with his Jewish subjects after Denmark was invaded by the Germans by donning a yellow star himself. When I was a boy, that story stirred me mightily…and the reason I responded to it so viscerally, now that I think back carefully, is precisely because it was so unexpected, so dramatic, and so intense a gesture for someone outside the Jewish community to make in public on behalf of those on the inside. King Christian wasn’t a Jew, obviously, but he—in the story, at least—was expressing his solidarity with the victims of Nazi anti-Semitism personally and publicly. So why should it not be equally moving to contemplate a pop star—and particularly one whose Jewishness has been so low-key over the years that I myself was slightly surprised the learn that he even was Jewish—by such a person standing up to oppose neo-Nazi anti-Semitism…and particularly when he personally had nothing at all to gain by making such a public statement? That the story about King Christian isn’t true (click here for the details) hardly matters and, indeed, the fact that the story was apparently just a fantasy speaks volumes about how meaningful a gesture it would surely have been had he really made it.

The back history of the Jewish badge goes back a long way. In 1215, for example, the Fourth Lateran Council headed by Pope Innocent III decreed that henceforth Jews in all Christian lands under papal control would be obliged to adopt some specific article of dress that could vary from land to land but that in every place would set them apart from their Christian neighbors. In 1222, the Archbishop of Canterbury, Stephen Langton (who is otherwise remembered for inventing the chapter divisions in the Hebrew Bible that are used today in all Christian editions and most Jewish ones) decreed that English Jews were required to wear a white band across their clothing minimally “two fingers broad and four fingers long.”  In 1227, the Christian Synod of Narbonne in France decreed that Jews in France wear an oval badge; just the next year, James I ordered the Jews of Aragon to wear a similar badge. In 1294, the Jews of Erfurt in Germany were similarly required to wear the Jewish badge, the first mention of such a thing in any German city. You get the idea…one way or the other, the practice spread across Europe, constantly being cancelled and then re-introduced over the course of almost the entire medieval period. And then, of course, after centuries of disuse, the Nazis re-introduced the idea in many of the countries they conquered in the early 1940s as well as in Germany itself.

There is something particularly vicious about the use of the star. The Jews of Germany (or France or anywhere) were not physically distinct from the people among whom they lived. And the sense of fitting in, of being one of the masses, of being able to circulate easily in society without arousing the ire of whatever anti-Semites they might encounter in the course of one’s day’s affairs—that sense of being indistinguishable from the rest of the populace was a key element in the feeling many Jews developed that they were safe and secure in their host nations and in the cities they had come to think of as their hometowns. As a result, pronouncements by those medieval monarchs who considered the fact that their Jewish subjects were not easily recognizable to be a problem in need of addressing took on a particularly ominous ring. Nor did that ominousness dissipate with the passing of centuries, and least of all in Nazi-occupied Europe, where the yellow badge was not just a mark of Jewishness, but more specifically a mark of Jewishness overlaid with a deep sense of creeping ill ease, of jeopardy, of menace.

For these last weeks since Charlottesville, the challenge for us all has been to steer a clear course between over-reaction and under-reaction, between seeing neo-Nazis behind every tree and falling into the trap of not seeing them at all because we so fervently wish for them not to exist. I’ve had to negotiate those straits myself, both when speaking from the bimah and when writing my weekly letter to you all, and even now I find myself unsure about how things truly stand. Surely, there is no incipient political movement gaining ground that is anything like the rising Nazi party in the waning days of the Weimar Republic. There was almost universal bipartisan agreement that the President’s initial comments about Charlottesville were equivocal and unworthy. There were, at the end of the day, about 250 people chanting “Blood and Soil” and “Jews Will Not Replace Us” in the streets of Charlottesville, not 250,000. Our nation has always harbored extremists and haters who abuse their First Amendment rights to defame others, yet the civil rights of citizens remain the cornerstone of our democracy nonetheless. The sense of decency and fairmindedness that is the hallmark of true American patriotism remains in place.  I myself am neither worried nor scared; my sense of my place in our nation is just as it has been for decades and is, I believe, as unshakeable as it is unshaken.

But we also remember the Jews of Germany who made the cataclysmic error of underestimating the haters. They too felt secure, safe, and possessed of inalienable civil rights! Of course, the fact that they were wrong doesn’t mean that we too are! But it means that when a public figure like Billy Joel comes on stage at one of the nation’s premiere concert venues and, in front of scores of thousands of fans, says with a single gesture that he is identifying these days with the Jews of 1940’s Germany—when a man such as he makes a wordless statement such as that, in my opinion at least, we should applaud his candor, his willingness to speak out, and, yes, his bravery. His was a valiant gesture at just the right moment and Billy Joel should be lauded both in Jewish and in non-Jewish circles for having made it.

Friday, September 1, 2017

Being Who We Are and Aren't


What is Jewishness exactly? We talk about it regularly as though it were a heritable genetic trait of some sort, one that—for some reason—is solely passed down from mothers to their children. Indeed, even when people argue the point and try to make a case for patrilineality as a valid determinant of Jewishness, they are merely arguing along the same lines and insisting that “it,” whatever “it” actually is, can be passed along by men to their offspring as well. Of course, the fact that conversion is permitted seriously undermines the genetic argument: if we’re talking about something akin to DNA that you either do or don’t have, how can any behavioral or attitudinal factor override not having it? But, it turns out, that doesn’t mean that there isn’t any a genetic component to membership in the House of Israel…and therein hangs an interesting tale.

I read a remarkable story in the Washington Post last July about an Irish-American woman from Chicago, one Alice Plebuch, who took one of the various “just-for-fun” DNA tests available on the market because she wished to learn more about her father, who had died many years earlier, and about her father’s family. (You can read the article by clicking here. You can also visit the websites of three of the larger companies that offer this kind of service to the public by clicking here, here, and here.) The results, however, were not at all what she expected: about half her DNA results confirmed what she already knew about her descent from people who hailed from various regions within the British Isles, including Ireland, but the other half pointed to a combination of Eastern European Jewish and Middle Eastern ancestry. One of her parents was apparently not as Irish as she thought…but which one? That was what she now felt herself obliged to find out.

There were, of course, lots of possible explanations for the unexpected test results. One set of her grandparents could have been Jews from Eastern Europe who so totally shed their previous identity upon arriving in Ireland that just a generation later there was no trace at all of it, and no recollection on the part of anyone at all that they had ever been anything other than “just” Irish. Alternately, one of her grandmothers could possibly have had an extra-marital affair and then simply allowed her husband to presume that he was the father of the child she subsequently bore. That, however, would have led to a quarter of her DNA being labelled as Jewish, not half. Could both her grandmothers have had affairs with Jewish men? Imagining such a thing about one of her grandmothers was hard enough, but about both felt wholly impossible. There had to be other some other plausible explanation!

Plebuch talked her brother into being tested, plus one cousin on her mother’s side of the family and another on her father’s side. Her test and her brother’s yielded the expected result indicating that their mother and father had to have been the same people. But the tests involving the cousins yielded one interesting piece of data and another that was truly confounding. The interesting information came from a comparison of the two cousins’ results and made it clear that the Jewish component in Alice Plebuch’s DNA came from her father’s side of the family. That was what she suspected anyway, but a far more amazing piece of information than that came from a comparison of her own DNA with that of one of her cousins, the son of her father’s sister, which effort yielded the categorical result that they had no blood relationship at all! In other words, reading her own DNA results against her cousin’s yielded the conclusion that her father and his sister were unrelated by blood.

I won’t describe the rest of the story in detail—although I really do recommend that Washington Post article as riveting reading—but the short version is that, after a lot of very detailed sleuthing, Alice Plebuch was able to conclude categorically that her father and another baby were switched at birth, or shortly after birth, at Fordham Hospital in the Bronx where they were both born on the same day of February in 1913. And she somehow managed to identify that other baby and to find his still-living daughter too, whom she felt honor-bound to inform that her father was an Irish Catholic at birth who was simply raised as a Jew by the Jewish people he came to know as his father and mother, neither of whom had any idea that they had brought home the wrong baby.

It sounds like the plot of a made-for-television movie—and not even that believable a one at that. And there surely are a lot of obvious questions to ask about how such a thing could ever occur in real life and who, if anyone, should be held accountable after all this time. But the question that the story raises that matters to me personally has to do with the nature of identity. The Irish Catholic baby brought home by a Jewish family turned into Philip Benson and was raised as a Jewish boy in a Jewish home, then grew up to become what any of us would call a Jewish man. Was he “really” Jim Collins, as the Jewish baby brought home by Irish Catholic parents and raised in their faith was known to the world? Was Jim Collins, the man Alice Plebuch knew as her father, “really” Philip Benson? Were both their lives essentially lies lived out against backgrounds that neither recognized as false but which were, historically and genetically, wholly untrue? Were they both essentially phantoms, men who were neither who they were or who they weren’t? It’s hard even to say what those questions mean, let alone to answer them cogently. Since there’s no reason to think that, had Alice’s grandparents brought the correct baby home from the hospital, that he would eventually have become would have ended up marrying Alice’s mother, Alice Plebuch’s very existence seems predicated on a mix-up that any normal person, other than her husband and her children and all her friends, would easily label a tragedy. Does that make her existence tragic? It’s sounds vaguely right to say that, but I’m not sure I could look her in the eye while I was saying it.

We all believe, or I think we do, that there are character traits that inhere in the shared genetic heritage of any recognizable group. Such talk often veers into tastelessness bordering on prejudice when we “assign” qualities, and usually negative ones, to people based on their race or ethnicity.  But does that mean that there are no shared traits that the members of groups with a common genetic heritage all share? (And, if that is the case, then why should those shared traits be uniformly positive? Surely negative traits can also be shared!) But what is the precise boundary between identity and shared heritage, between the autonomy of the individual and the shared genetic heritage that inheres in that individual’s DNA? Surely, both concepts impinge upon each other. But in what specific way and to what precise extent—that is a far thornier riddle to solve.

From a Jewish perspective, the issue is even more complicated. The man the world knew as Jim Collins was born to a Jewish mother and so was, according to all Jewish authorities, a Jewish baby. The Talmud has a name for a child who is spirited away from his parents at birth, or shortly after birth, and raised without reference to his “actual” heritage: this is the famous tinok she-nishba of talmudic lore. Nor is this treated as a merely theoretical issue: the Talmud goes into considerable detail with respect to the specific laws that apply to such a Jewish individual raised in total ignorance of his or her Jewishness. Most of those discussions revolve around intricacies of halakhic obligation when a particular infraction is repeated over and over in the course of years or even decades by a Jewish individual who, unaware of his or her Jewishness, has no inkling that some specific deed is forbidden to him or her by the Torah. Such a person is technically a sinner, but our sages understood easily how wrong it would be seriously to attach that label to someone whose sins are completely inadvertent and who lacks even an inkling of his or her real status as a Jewish individual. The debates are interesting. But there is no debate at all about the Jewishness of the tinok she-nishba, just about the specific way the law should apply to such a person.

Was Jim Collins a tinok she-nishba? Labelling him that way would seem to oblige us to consider Philip Benson a non-Jew. When viewed dispassionately, that sounds almost reasonable, particularly since any rabbi could “solve” his predicament easily enough with a trip to the mikveh, a visit to the bet-din, and a few minutes with a mohel. But let’s imagine that the truth about Philip Benson never came out. Would we really consider it a tragedy for a man raised as a Jew from birth, circumcised on the eighth day of his life, provided throughout his childhood and adolescence with a Jewish education, the husband of a Jewish woman and the father of Jewish children—would it truly be a disaster if the truth about his “real” parentage never came out? Part of me thinks it would be. But another part can’t quite embrace that level of ex post facto harshness.

Most of the time, it’s probably wisest just to allow people to be whom they appear to be. Mostly, we already do this. When I walk into the Kotel plaza in Yerushalayim and join a minyan for Minchah, no one asks me if I am really a Jew, much less if I am really a man! I look like a man, so that’s good enough for them. I apparently look like a member of the House of Israel too…and that too is good enough even for the guys who hang out at the wall wearing their giant black hats. (I don’t push it, however, by also self-identifying as a Conservative rabbi.) Ultimately, we are all Jews by self-definition…and that, really, has to be the bottom line. Sometimes, real wisdom lies in stepping away from the fine print and being content just to read what people possessed of normal eyesight can see, and then leaving it at that.

Should I buy one of those DNA test kits and find out where my people really come from? I haven’t decided one way or the other. But if I do…I promise (maybe) to share the results with you in a subsequent letter.

Thursday, August 24, 2017

Eclipses Then and Now


Apparently, the solar eclipse truly was awesome for those lucky enough to be exactly in what I now know to call “the path of totality.” For the rest of us, not so much.  But even without experiencing the awesomeness that was fully visible in Jackson Hole or Nashville, there was still something remarkable—and unexpectedly humbling—about the whole experience. Like so many others, I’m sure, I read all about it in the days leading up to the big event. But of all the articles I came across, the one that had the biggest effect on me was one written, not in the last weeks or days, but a cool eighty-five years ago.

The article, published without a by-line in the New York Times on Sunday, August 14, 1932, pointed out that Americans who wished to experience a full solar eclipse had basically two choices: to clear their calendars for August 31 of that year and look up (ideally after having learned how to look at the sun in eclipse without harming their eyes), or to do whatever it was going to take to guarantee a very long life since the next chance for most Americans to see a total eclipse was going to be…in 2017, towards the end of August. (There was, admittedly, a third possibility too: readers not reasonably expecting to live until 2017 but who thought they might well make it to 1970 were offered the possibility of moving to Florida, the only state from which the eclipse of that year was going to be visible. And there also the possibility of seeing a full solar eclipse in the northernmost regions of the country in 1979, but this was dismissed as at best a long shot in that seeing it was going to require observers “to brave the possibilities of a blizzard and 40 degrees below zero weather.”  So that, plus the iffy March weather in Florida, basically left readers with two viable options: preparing either for an event two weeks off or for another more than three-quarters of a century in the future.)

It was a long time ago, admittedly. But since I was already there, I spent some time perusing the rest of that newspaper from so long ago in which the article appeared. (You can too: click here.)  And what I found was America trying to come to terms with Nazism…and only marginally succeeding.

The German elections of the previous month had ended in legislative chaos—the Nazis won 230 out of 608 seats in the Reichstag, making them the largest party represented but not one possessed of the majority necessary to govern. This led to Hermann Göring becoming president of Germany and to the onset of negotiations regarding Hitler’s place in the government, but the lead article on page one of the paper (“Hitler Demands Office as Dictator; Hindenburg Bars It”) is almost achingly naïve in its sense of things. This is 1932. For those who might be unfamiliar with his background, Hitler is helpfully described as an “Austrian house painter.” And the article itself, by Frederick T. Birchall (regarding whom, see below), explains the chaos that was reigning in Germany in the wake of the recent election in terms that readers today will find, to say the very least, remarkable: “Tonight,” Birchall writes, “whether law and order shall prevail in this republic or whether it will be plunged again into the turbulence that marked its beginning depends very much on the whim of this same Hitler and the strange aggregation of semi-fanatics who surround him and are trying, as there is good reason to believe, to goad him to extremes to which he alone probably would never proceed.”

Towards the end of the article, a lone paragraph addresses Nazi anti-Semitism: “The Association of Germany Jewish citizens,” Birchall reports, “placed before President von Hindenburg today an exhaustive record of what they said were anti-Semitic threats and insults by the National Socialists from the platform and press, together with acts of terrorism committed on Jews. The President replied that he deeply regretted and strongly disapproved of infractions on constitutional and religious rights of German citizens and sent the documents to the Ministry of Interior for examination.”  Can you imagine that? A president who can find the words to condemn Nazi anti-Semitism but without feeling any sort of concomitant need to distance himself totally and absolutely from the Nazis themselves! It all seems so heartfelt, so sincere…and, of course, so tragically futile.

What would Frederick T. Birchall have made of Charlottesville? That is actually more interesting a question than it sounds at first. Birchall was a well-positioned observer—a Brit by origin and American by choice, he began working as an editor at the Times in 1912 and ended up winning a Pulitzer Prize for his reporting from Europe in the 1930s. Seeing the Nazis, who just a few years earlier were dismissively derided by most Germans as thugs and extremists, rising step by step to positions of power must have seemed amazing.  Being present as Germany embraced a political philosophy based on the absolute repudiation of the very liberal values and norms Germany itself had in large part brought to the world, even more so. But watching on as Germans showed themselves again and again willing to ignore the Nazis’ fiercely violent anti-Semitism and racially-motivated fanaticism—or rather to look on their radical racism as something akin to the political equivalent of the bitter aftertaste sick people with their own best interests at heart will gladly endure for the sake of swallowing medicine that promises to restore them to good health—that must truly have seemed unbelievable. Something of this was captured a few years ago in Erik Larson’s In the Garden of the Beasts, a very interesting book that I recommend highly about the first American ambassador to Nazi Germany, William E. Dodd, and his life in Berlin from 1933 to 1937. But Birchall filed hundreds upon hundreds of stories from Germany in the 1930s, all available at the New York Times online archive, and they tell their own story of a nation descending into an abyss of its own making precisely by refusing to repudiate Nazism when it still might have been possible to act decisively and definitively in the nation’s best interests, of a nation weirdly willing to take the danger that inheres in the mindless embrace of extremism and violence seriously and not seriously at the same time.

Did Birchall know about the total eclipse of the sun on the last day of August in 1932? I’m sure he read his own newspaper daily, so how could he not have? He died in 1955, but if he were alive today to comment, would he find it uncanny to see America again grappling with the forces of intolerance, supremacism, bigotry, anti-Semitism, and racial hatred as the skies dim overhead? My guess is that he would have! And although I understand that Shakespeare was entirely right when he wrote that “the fault is not in our stars, but in ourselves,” I imagine Birchall would join me in finding it more unnerving than amusing to see our nation again combining a deep sense of awe at the majesty of the universe with a sense of befuddled amazement at the sight of actual American citizens hoisting aloft actual Nazi flags and shamelessly chanting anti-Semitic slogans as they marched through what just a few days earlier would have been described by anyone as one of America’s most delightful college towns.

Elsewhere in the Times of August 14, 1932, we read about a meeting in Geneva of one hundred delegates representing, between them, twelve million Jews in twenty-five countries to discuss “the situation” affecting Jews throughout the world. I’m sure this was a sincere, hopeful effort. How could they have known what was coming? Should we mock these delegates now for not being able to imagine Treblinka then? Or should we respect them for doing what they could with the information they had? They surely understood the potential for Nazi anti-Semitism to turn even more violent than it already had…but how could they possibly have imagined genocide on the scale about to be unleashed upon the Jewish world? Certainly, they meant to do good. Possibly even they did do good. But, as we all know, whatever good they did manage to accomplish was unable to stem the tide of unimaginable viciousness about to be unleashed against their co-religionists in Europe? Did the Americans among them return home in time to see the sun vanish briefly from the midday sky and find in that experience an ominous portent? Was there anyone among them among them conversant enough with talmudic lore to connect the events under discussion in Geneva with the passage in Tractate Sukkah that declares a solar eclipse to be a siman ra— “a bad omen”— for the whole world…and thus to feel even less hopeful about the future of European Jewry? I suppose there may have been some talmudists among the delegates, but that only prompts me to look eighty-five years into our future and wonder if someone writing in 2102 will look back and wonder how people back in 2017 cannot have found troublingly deep ominousness in an eclipse that made the day dark just ten days or so after Charlottesville?

Thursday, June 22, 2017

Choosing Life

One of the most often-repeated talmudic aphorisms notes that “all Israel is responsible for each other,” a noble sentiment, surely, but one that most of us set into a set of larger and smaller concentric circles of obligation: we feel most of all responsible for our closest family members, then for more distant relatives, then for neighbors, then for the other members of the faith- or ethnic group with which we identify the most strongly, then for the other citizens of our country, and then for all humankind. Some of us go one step further too and include the inanimate world as well in the scheme, promoting the conviction that, as part of collective humanity, we have a kind of corporate responsibility not solely for the other residents of our planet, but for the planet itself. Eventually, in some distant age, the circle will probably widen to an even greater extent to allow for feelings of responsibility directed towards the inhabitants of the other populated planets with whom we will eventually learn that we share our galaxy.

None of the above will strike anyone as at all controversial. And even if some specific people might well order their circles differently—feeling more responsible for neighbors than for distant relatives, for example—the concept itself that responsibility for people other than ourselves inheres in our humanness hardly sounds like an even remotely debatable proposition.  But saying wherein that responsibility lies exactly—and how that sense of responsibility should manifest itself in law—is another question entirely.

The system of common law that undergirds American jurisprudence holds that one commits a crime by doing some specific illegal thing, not to failing to do something...and this principle applies even when the thing not done involves coming to the aid of someone in distress. There are, however, several exceptions to the rule. If you yourself are personally responsible for the other party being in distress, then you are in most cases deemed responsible to come to that person’s aid. Parents have an obligation to rescue their children from peril, as do all who act in loco parentis (like the administration of a child’s school or the staff in a child’s summer camp). Spouses have an obligation to rescue each other from peril. Employers have an obligation to come to the aid of workers imperiled in the workplace. In some places, property owners have a similar obligation of responsibility towards people invited onto that property; in others, that obligation extends even to unwanted visitors like trespassers. Otherwise, there is no specific legal obligation to come to the assistance of others. And this is more or less the way American law has evolved as well: other than with respect to the specific exceptions mentioned above and in Vermont, people are generally deemed to have no legal obligation to assist others in distress, much less to risk their own wellbeing to do so. Indeed, although all fifty states, plus the District of Columbia, have so-called Good Samaritan laws on the books that protect individuals who come to the aid of others from subsequent prosecution in the event of mishap, or some version of such laws, Vermont is the sole state to have a law on the books that actually requires citizens to step forward to help when they see others in peril or distress.

These last several weeks, I have been following with great interest the case in Taunton, Massachusetts, that culminated in the conviction of involuntary manslaughter of Michelle Carter, 20, in a case that involved a suicide for which she was not present and which her lawyers argued she therefore could not have prevented, but for which the court deemed her in some sense responsible nonetheless.

The story is a sad one from every angle. It revolves around two troubled teenagers who met while vacationing in Florida and subsequently began an intense twenty-first-century-style relationship that was primarily conducted digitally through emails and text messages. (After returning home to Massachusetts, the two only met in person on a few rare occasions.) But digital though their relationship may primarily have been, it was clearly intense. She was suffering from an eating disorder; he suffered from suicidal tendencies and had actually attempted suicide before, which fact he disclosed to her. At first, she was supportive and encouraged him to seek counseling and not to harm himself. But at a certain point, she became convinced—and we are talking about a teenager here not a mature adult, let alone a trained therapist—she became convinced that suicide was the right solution to her friend Conrad’s problems. And so she encouraged him, as any friend would, to do the right thing. “You can’t think about it,” she texted him on the day of his death. “You have to do it. You said you were gonna do it. Like I don’t get why you aren’t.”

Convinced that she was right, Conrad drove his truck to the secluded end of a K-Mart shopping plaza and prepared to asphyxiate himself with the fumes from his truck’s exhaust. And then we get to the crucial moment, at least in the judge’s opinion. Suddenly afraid and apparently wishing not to die, or at least willing to rethink his commitment to dying, Conrad got out of the truck and phoned his friend Michelle. Her response was as clear as it was definitive: “Get back in,” she told him. And he did, dying soon after that.

Was she responsible for his death? You could say that she did not force his hand in any meaningful way, that she merely told him to do something that he could have then chosen not to do. Or you could say that she had his life in her hands at that specific moment and that she chose to do nothing—not to phone his parents, for example, or 911—and thus was at least in some sense complicit in his death. “The time is right and you’re ready…just do it, babe,” she had texted him earlier in the day. That, in the judge’s opinion, was just talk. But when Conrad’s life was hanging in the balance—when he was standing outside the cab of his truck and talking on the phone as the cab filled with poison gas—and she could have exerted herself to save his life but chose instead to encourage him to get back in—that, the judge ruled, constituted involuntarily manslaughter in the Commonwealth of Massachusetts. And, as a result, Michelle Carter is facing up to twenty years in prison.

Involuntarily manslaughter, defined as behavior manifesting extreme indifference to human life that leads to the death of another, seemed to the judge to suit the details of this case. But what if Michelle truly believed that Conrad would be better off dead, that the dead rest in peace or reside in a state of ongoing bliss in heaven? What if she felt it was his right to die on his own terms and when he wished? What if she felt that he was right in his estimation that he would be better off dead than alive? Should she be held responsible for his death if she felt she truly believed any of the above? As someone who has spoken in public countless times about the dead resting in peace and their souls being resident in paradise beneath the protective wings of God’s enduring presence, there is something deeply unsettling for me personally even to ask that question aloud! Surely, I could answer, the metaphoric lyricism we bring to our efforts to deprive death of its sting aren’t meant to be taken literally, and certainly not to the point at which the taking of an unhappy person’s life feels like a justifiable decision! But what if someone did take them literally and then, not actually acting, chose not to act because of them—should that be considered criminal behavior?

I am neither a lawyer nor a legal scholar, so I won’t offer an opinion about the verdict in the Carter case per se, other than to note that the verdict is controversial and may ultimately be reversed. But the burning question that smolders at the core of the matter—what it ultimately means to be responsible for another—is not something anyone who claims to be a moral person can be at peace being unable to answer. The sense of the interconnectedness of all living things that we reference so blithely in daily discourse can serve as the platform on which all who would enter the discussion may stand. And, surely, the responsibility towards others that develops naturally from our sense of society as a network of closer and more distant relationships is undeniably real. Still, it feels oddly difficult—both morally and apparently legally—to say precisely what it means for one individual to bear responsibility for another.


The most-often repeated commandment in the Torah requires the faithful to be kind to the stranger, to the “other,” to the person who is not yourself…which is all people. Theologically, being solicitous of the wellbeing of others is a way of acknowledging the image of God stamped on all humankind. Whether criminalizing the willful decision to look away from that divine image is a good idea, or a legally sound one, is a decision for jurists, not rabbis. But the notion that all behavior that shows disrespect, disregard, or contempt for others—and thus denies the principle that all human life is of inestimable value regardless of any individual’s circumstances—is inconsonant with the ethical values that should undergird society is something we all can and should affirm. When the Torah commands that the faithful Israelite “choose life” over death, it is specifically commanding that the faithful ever be ready to step into the breach to save a life in peril and thus to affirm our common createdness in God and the responsibility towards each other that derives directly from that belief.

Thursday, June 15, 2017

Presidents and Emperors

Since I’ve been writing lately about the way I related to various events of 1967, I thought this week I’d write about yet one more: the performance of MacBird! I attended with, of all people, my mother. For readers too old or too young to remember back that far, MacBird! was a play by Barbara Garson starring Stacy Keach (at the very beginning of his career) and Rue McClanahan (long before she became the sexy one on The Golden Girls) that ran for almost a year at the Village Gate in 1967 and 1968, and which basically accused Lyndon Johnson of complicity in the assassination of John F. Kennedy. Layering the actual details of the Kennedy assassination and LBJ’s subsequent assumption of the presidency over the plot of Macbeth (with some side-dollops of Richard III and Hamlet), the show in its day was considered too radical for a real Broadway house and was relegated to the West Village, then (as now) the Manhattan-theater-scene equivalent of Siberia.  My mother, already slightly radicalized and by then a card-carrying member of N.O.W., was curious enough to want to see the show. I wanted to see it too…and I apparently wanted to see it badly enough to suffer the ignominy of going along with my mother. (And, for readers who were never teenaged boys, let me assure you that that we are talking about serious ignominy here.) Besides, I told myself, who would actually see me walking down Bleecker Street with my mom?

It was June, the same June on the first day of which the Sgt. Pepper album was released— a major cultural watershed-moment in my own life, as explained in this space a week or two ago—and during the first weeks of which Israel won the Six Day War. It was also the month of the Monterey Pop festival, the precursor to Woodstock that catapulted both Jimi Hendrix and The Who to real fame in America and brought them both, particularly Jimi, to my personal attention. It was, to say the least, an interesting month, that month of my fourteenth birthday. And, as if all the above weren’t enough, it was also the month I went with my mother to see MacBird!.

No one, not then and surely not now, actually thought or thinks that Lyndon Baines Johnson might possibly have played a role in the assassination of John Kennedy. Nor did anyone imagine (admittedly impossibly) that Johnson’s subsequent rise to real power was best understood as some sort of mystically-conceived prequel to House of Cards, the Netflix series that is precisely about the ascension to the presidency of an unprincipled, corrupt demagogue, the character of whose wife truly does feel as though it’s been modelled at least on part on the character of Lady Macbeth in Shakespeare’s play. But it mattered little that the point wasn’t actually to indict the sitting president of his predecessor’s murder, but merely to suggest the ultimate corruption of the political process…and the way that the fate of the nation had somehow come to rest in the hands of someone whose primary focus was not on the welfare of the nation he was charged with leading, but with the furtherance of his own personal political agenda. It was, as is all biting satire, overstated. But it caught the attention of the public, seemed somehow to capture the spirit of the time, and had a respectable 11-month off-Broadway run followed by productions in Los Angeles, San Francisco, and elsewhere.

Johnson, remembered now primarily for his “Great Society” legislative package and for his “War on Poverty,” was in 1967 primarily perceived by America’s radicalized youth as the bogeyman of the Vietnam War, as the man primarily responsible for the tens of thousands of American casualties—more than 22,000 American servicemen and women had died in Vietnam by the evening I saw MacBird! with my mother—in a war regarding the legitimacy and reasonableness of which the American people were, to say the very least, strongly divided. It wasn’t the fairest assessment. LBJ inherited Vietnam from Kennedy, who—at least in a sense—inherited it from Dwight Eisenhower. (The first American servicemen to lose their lives in Vietnam died in 1959.) And Johnson was, in a real sense, playing a zero-sum game by trying to fight a war in a distant land that had the inarguably noble goal of saving an ally from being overrun by Communist forces eager to reunite Vietnam as a single entity under the totalitarian leadership of its ruling cadre and, at the same time, not having the popular support at home to do the job successfully and effectively. Instead, we attempted to shore up the troops of the unpopular non-communist regime without understanding just how little support its leaders had among their own people. It was, therefore, a loser’s game. And, as happens when people play loser’s games, we lost. But that was still years in the future when I was making my way from the subway to the theater with my mother in June of 1967 and praying I didn’t run into anyone I knew from school on a theater date with my mom.

I was brought back to that whole experience just this week as I read about the turmoil the Public Theater’s production of Shakespeare’s Julius Caesar has engendered, turmoil serious enough to prompt two major funders, the Bank of America and Delta Airlines, to withdraw their support for the production.  Of course, all this controversy will paradoxically make it impossible to get tickets to see the show, but that, of course, was hardly the goal...which was to signal the former sponsors’ lack of interest in having their corporate names attached to a biting piece of overtly political theater that is openly and sharply disrespectful of the current President and First Lady, and which they feared could possibly be taken as calling for the assassination of the former.

Gregg Henry plays Caesar as Trump, depicting him as a self-absorbed, preening tyrant who bathes in a golden bathtub that matches his shock of golden hair. His wife Calpurnia, played by Tina Benko, dresses extremely well and speaks with a distinctly Slovenian accept. You get the idea. Any student of Shakespeare knows that Julius Caesar is far more about Brutus than it is about its own title character, somewhat in the way The Merchant of Venice is far more about Shylock than Antonio, the actual merchant mentioned in the title. (Brutus has at least four times as many lines as Caesar, and the psychological tension—the exquisite psychological tension—that gives the play its relentless, unsettling energy derives from Brutus’s efforts to negotiate his way through a maze of conflicting obligations relating to comradeship, patriotism, honor, and duty.)

Nor is the notion of “updating” Julius Caesar to suggest its enduring relevancy anything new: as recently as 2012, the Guthrie Theater in Minneapolis featured Bjorn DuPaty dressed up and made up to look eerily like Barack Obama in the title role in a production that appears not to have offended any major corporate sponsors at all. Of course, the concept there was to warn the public about the vulnerability of our first black President, not to encourage his murder! So here we have the same play, the same lines, the same plot—and even the same update concept of presenting Caesar as our sitting president—and yet the Public Theater’s Shakespeare in the Park production has provoked twin tidal waves of emotion, one responding as though the production were openly to be promoting the murder of Donald Trump and the other as though the principle of free speech itself were somehow to depend on Delta Airlines renouncing its right to choose to which cultural events it wishes to lend its name and where it wishes to spend its money. Both sides are just a bit overstated.

The function of art in society is to irritate and to provoke. But to imagine that the specific thing the Public is trying to provoke with this production is the murder of President Trump is really to misunderstand the play.  The key to the play, both as I remember understanding it in eleventh grade when it was explained to us by Mr. Bergman and as I understand it today, is to show how, although the assassination of Caesar was undertaken by people who surely felt themselves to be acting in their nation’s best interest, Caesar’s murder a true catastrophe for Rome…and, at that, one from which the Roman Republic never recovered. Caesar was assassinated in 44 BCE. Civil war ensued. Within a few years, Caesar’s adopted heir, Octavian, emerged as emperor of the newly-invented Roman Empire and democracy was gone from Roman soil for millennia. By acting violently to preserve democracy, the conspirators managed to destroy it instead.

The enduring brilliance of Shakespeare’s play lies in the questions it manages obliquely to ask. How far can the citizens of a republic legitimately go to preserve their nation by removing a leader working at cross-purposes with what they perceive to be the nation’s best interests? Who among the citizenry have the right to self-select as the nation’s saviors…and at what point does it matter that the path to salvation lies in violence?  Does the fact that no assassins can say with certainty what the consequences of their lawlessness will be mean, ipso facto, that all instances of extra-judicial violence are morally wrong…or merely ill-advised? Students of the Bible will think of Pinchas, valorized in the Torah precisely for having been so repulsed by the decadent, vulgar behavior of a fellow Israelite that he took it upon himself personally to serve as that individual’s judge, jury, and executioner. Students of history who feel deeply regretful about the failure of the famous plot to murder Hitler in the summer of 1944 will surely not feel that it is always wrong to act unilaterally to defeat a brutal tyrant. The simplest of assertions—that violence is always wrong, and that citizens may never act on their own violently to solve their nation’s problems—becomes far more complex in the discussing.


To the extent that the Public Theater’s production of Julius Caesar will usher its audience into the complexities of that discussion, it should be hailed as a legitimate piece of provocative theater. To the extent it reminds all who view the play just how devastating the consequences of even the most well-intentioned act can be, it will serve not as a spur to violence but, just to the contrary, as an argument against violence and lawlessness. To the extent that the Public’s production promotes the view of its artistic director, Oskar Eustis, that Shakespeare’s ultimate point is that “those who attempt to defend democracy by undemocratic methods [will ultimately] pay a terrible price and destroy their republic,” it should be hailed by all as a civics lesson for us all.

Thursday, June 8, 2017

Hope for the U.N., Possibly

Like most Americans—60% according to a recent Gallup poll—I think the United Nations is doing a poor job living up to its self-assigned task to serve as the one international forum in which all the nations of the world are welcome peacefully to work out their disputes. I suppose different Americans must come to this negative impression from different directions, but, at least for me, the determinative factor will always be the incredible bias the body has shown towards Israel for the last half century— a kind of almost visceral prejudice that has on many occasions crossed the line from “mere” hostility to the policies of this or that Israeli government to overt anti-Semitism.

Nor am I alone in my sentiments. In a remarkable show of non-partisan unity, the entire Senate—including all one hundred U.S. senators—signed a letter to U.N. Secretary General António Gutteres last week in which they asked him formally to address what they called the United Nations’ “entrenched bias” against Israel. Nor was the letter particularly subtle: by pausing to remind the Secretary General that the United States is, and by far, the largest single contributor to the U.N. budget—in 2016, the U.S. paid out an almost unbelievable $3.024 billion to keep the U.N. running, a sum that exceeds the contributions of 185 of its member states combined—the senators sent a clear message that that kind of almost unimaginable largesse cannot be expected to continue if the U.N. fails to treat all its member states, Israel most definitely included, fairly and equitably.  They didn’t need to issue an actual threat either—just mentioning the budget was, I’m sure, more than enough.

The letter, written by Senators Marco Rubio and Christopher A. Coons (a Republican from Florida and a Democrat from Delaware, respectively), also mentioned with great enthusiasm and approval the work of Nikki Haley as the U.S. Ambassador to the U.N.  And she deserved her shout-out too: it’s hard to remember the last time Israel had a defender as unwilling to mince words as Ambassador Haley. The Trump administration has had trouble, and continues to have apparently serious trouble, filling any number of crucial diplomatic posts. But the President chose well when he selected Nikki Haley to represent us in Turtle Bay. Americans should all be proud to have a person of her eloquence and candor in place in what must be one of the world’s most trying diplomatic postings.

Ambassador Haley, for example, made it crystal clear just last Tuesday that the U.N. Human Rights Council—a council of buffoons whose sole interest in the world appears to lie in decrying Israel’s every perceived misstep while blithely looking the other way when other states trample on even their citizens’ most basic rights—when she, speaking with her usual forthright directness, specified that the U.S. might simply withdraw from the council unless it abolishes its infamous Agenda Item 7, which guarantees that there will never be a meeting of the council in which Israel is not singled out for censure. Such a move would hardly immunize Israel against legitimate criticism. But it would, at the very least, put Israel on the same footing as other member states—the basic definition of being treated impartially and objectively in any legitimate forum. And it would also mean that the Middle East’s one true democracy will no longer endlessly be condemned with knee-jerk resolutions full of fury but signifying nothing, while states like Iran, Syria, and North Korea—all states in which the basic human rights of the citizenry count for nothing or almost for nothing—are ignored. (Resolutions condemning Israel at the Human Rights Council outnumber similar resolutions regarding all other countries combined.) Such a disparity would be almost funny if it weren’t tragic, but it’s part and parcel of what the U.N. does and, by extension, is. Therefore, Ambassador Halley was in my opinion entirely correct to indicate that continued hostility toward Israel on that level could conceivably trigger a U.S. withdrawal. She was certainly speaking for me personally when she said clear that “[The Human Rights Council’s] relentless, pathological campaign against a country that actually has a strong human rights record makes a mockery not of Israel, but of the Council itself.”

Yet there may be subtle signs that things are changing. Last month, Secretary General Gutteres took the extraordinary step of personally rejecting a U.N. report that used the language of South African apartheid to describe the plight of the Palestinians on the West Bank by saying clearly that it had been published without his approval. Nor does the Secretary General appear to be afraid to speak out in public. Just last April, for example, he appeared personally at a plenary assembly of the World Jewish Congress and addressed world-wide anti-Semitism and his own organization’s systemic anti-Israel bias in the same speech. (He was, for the record, the first U.N. Secretary General ever to visit an international forum of Jewish leaders.) Addressing the first issue, he pledged personally to be “on the front lines in the fight against anti-Semitism,” which specific kind of racist hatred he condemned unequivocally as “absolutely unacceptable.” And he also pledged that the U.N. would be in the forefront of a world-wide campaign to eradicate anti-Semitism from, in his own words, “the face of the earth.”

That much was impressive enough.  But then, almost unexpectedly, he went on to commit himself to working towards a reform of U.N. policies regarding Israel because, again to quote him precisely, “Israel needs to be treated as any other state.” And then he went even further, stating that he believes that Israel has an unequivocal right to exist, that Israel has an equally non-negotiable right to live in peace and security with its neighbors, and that “the modern form of anti-Semitism is the denial of the existence of the State of Israel.” (He presumably meant to reference the right of Israel to exist, not its actual existence—even its most implacable foes concede that there is such a place even if they wish things were otherwise.)

So there’s that. And then there was the almost unbelievable news last May that Danny Danon, Israel’s ambassador to the U.N., was elected by 109 nations to became the first Israeli to chair a permanent U.N. committee, the General Assembly’s Sixth (Legal) Committee. And now, on the heels of that unprecedented achievement, Danon has been elected vice president of the U.N. General Assembly, his term to begin in September and to last for one year. It is true that he is not the first Israeli to serve as vice-president. (That honor goes to former Ambassador Ron Prosor in 2012.) But even so…given the level of vituperative animus against Israel that characterizes so much of what the United Nations does, it was remarkable to learn that an Israeli was elected to any position of authority at all. It isn’t much—there are, for the record, 21 vice presidents of the General Assembly—but it’s surely something to celebrate for those of us who, despite everything, continue to harbor some hope that the U.N. could yet live up to its founders’ vision and become a force for good in the world.

And that sense of faint hope inspired me to return to an essay by Ambassador Danon himself that was published on the Politico website earlier this year in which he argued that the time has come for Israel to be granted a seat on the Security Council. (To see the Politico article, click here.) It’s an important article, one I earmarked to return to and then somehow never quite did…but now that I have reread it, I would like to suggest it to you as something very worth your time and consideration.

The ambassador begins by pointing out how Prime Minister Netanyahu’s announcement that Israel was poised to compete for a non-permanent seat on the Security Council was overshadowed, even overwhelmed, by the vote by that same body last December to question the historicity of the Jewish claim to Jerusalem. Ignoring not centuries but millennia of history, and mocking the work of a world of disinterested historians and archeologists, the Security Council voted on December 23 to recognize the Western Wall not as a Jewish holy site inextricably bound up both with the history and the destiny of the Jewish people, but as a Muslim shrine illegally occupied by Zionist usurpers intent on imposing their fantasy-based worldview on a world that should know better. (To reread my response to a similar UNESCO-based resolution earlier last fall, one so one-sided and biased against Israel that UNESCO’s director general, Irina Bokova herself felt the need to distance herself from it, click here.)

Nonetheless, Danon argues, the time has clearly come for the U.N., if it truly wishes to shed some of its shameful reputation, to welcome Israel onto the Security Council.  To be so elected, Israel will need the support of two-thirds of the General Assembly. But if it surely won’t be easy, it also shouldn’t be considered an impossibility. Israel has paid more into the U.N. budget over the years than the other 65 countries invited to sit on the Security Council as non-permanent members combined. And Israel has a clear role to play in encouraging the Security Council to enforce its own resolution 1701, which forbids the entry into Lebanon of any foreign armies or arms but which has mostly been ignored as Iran has poured arms into Lebanon to arm Hezbollah, now considered to have upwards of 150,000 rockets aimed at Israeli civilian centers.  Most of all, inviting Israel onto the Security Council would signal in a meaningful way that the decades of discrimination against Israel during which the U.N. has squandered the considerable moral capital it once had and sullied its reputation among all fair-minded people would finally be over.

As all my readers know, I could hardly think less of the United Nations. But I didn’t always feel that way. When I was a child, the U.N. was often held up as an example of the way that the world had turned a corner away from violence and bloodshed as the primary means of settling disputes and embraced the cause of mutual respect among nations and the peaceful resolution of conflict. One of my mother’s prized possessions, which I still have somewhere, was a letter bearing the first United Nations stamp issued and postmarked in New York on October 24, 1951. She, and so many of her and my dad’s generation, felt that the U.N. was the best hope for a world in which the horrors of the Second World War would never be replicated.  That sounds almost laughable now…but, who knows, maybe the U.N. could somehow regain its moral stature and thus also its potential. Electing Israel to the Security Council would be an unmistakable signal that the organization has turned a corner.