Friday, May 25, 2018

Guns and Cars


Yes, of course, when I looked at the pictures of the innocents gunned down in the high school in Santa Fe, Texas, last Friday, I felt some combination of anger and deep sadness. What kind of person could look at portraits of murdered children and not feel both those emotions welling up from deep within? And yet it’s also true that the incident itself, considered independently of the victims—the actual scenario of a young person getting a gun somewhere, entering a school (in this case his own high school), and opening fire on whoever has the misfortune to be standing in his line of fire—amazed me in precisely the opposite way: by failing to stir up the level of outrage that even I myself think any normal person should bring to his or her contemplation of an event like last week’s bloodbath at Santa Fe High. It’s just become so…so what? So routine, so almost ordinary, so weirdly and eerily banal? The sad truth is that after all these incidents it’s not even that easy for me to remember clearly which shooter attacked which school.
As a result, I found myself understanding easily when I listened to that video clip featuring Paige Curry, a seventeen-year-old student at Santa Fe. “It’s been happening everywhere,” Paige said. And then she added a thought that would have once been incomprehensible other than in a horror movie as the cellos start thrumming in the background. “I’ve always kind of felt,” she said, “like it eventually was going to happen here too.” And then, just to sharpen her point, she added the almost obvious: “I wasn’t surprised,” she said. “I was just scared.”

I get it. I’m sure I’d be scared too if I were present in the same building as an unrestrained shooter. But would I be surprised? I think I personally would be. But, of course, I’m not a high school student, much less one in Texas, throughout whose entire lifetime these incidents have served as a kind of terrifying, if almost ordinary, background. (Today’s high school seniors were born after the Columbine massacre of April 20, 1999, not before.) I was once a high school student, of course. And there were indeed school shootings across the land during my years at Forest Hills High. But what was absent in my day was the sense of randomness that the shooting incidents of these last years seem almost invariably to feature. There were, to be precise, exactly one dozen documented school shootings during the years I was in high school, seven of which took place in high schools or junior highs. The rest took place in universities or colleges, but the salient detail is that none was random: some, like the famous Kent State incident of 1970, took place in the context of political demonstrations; others were tragic, unintentional accidents; and still others, at least half, were targeted assassinations, usually of teachers or principals by disgruntled students. In other words, in none did a young person simply appear in school with a gun and just start shooting.
The earliest school shooting in the United States actually preceded the founding of the nation itself. It took place in 1764 in Greencastle, Pennsylvania, in the context of the now-long-since-forgotten conflict called Pontiac’s War, in which native tribes banded together to resist British policies towards them. And it was thus, at least in their own minds, as an act of war that four Delaware Indians entered that town’s school building on July 26 of that year and shot to death the school’s principal, one Enoch Brown. Whether Greencastle can count as our country’s first brush with murder at school, or whether the killing of John Davis, the University of Virginia law professor shot by one of his students on November 12, 1840, deserves to be considered the first American school shooting, seems to me at least debatable. (I believe the Greencastle incident is the only school shooting, even now, in American history that took place in the context of an actual war. But it seems odd to consider it an American event, given that the United States did not yet even exist.) But the more profound question is not which incident most reasonably deserves to be labelled our first school shooting, but whether we can stem this tide of senseless violence before it becomes even more endemic, even more a part of our national culture, even more inextricably woven into the fabric of our American ethos…and as such something that in the end simply cannot in any practical way be eradicated.

To my way of thinking, this is specifically not a Second Amendment issue and we have done ourselves no service by appearing unable to frame it any other way. Indeed, approaching the question from that vantage point—i.e., by wondering if Americans should or shouldn’t be allowed to bear arms or how that right should or shouldn’t be curtailed with respect to one or another subgroup within the citizenry—seems to me precisely the least productive way to engage with this issue. Instead, this should be considered a safety issue—and in the context of school shootings, a children’s safety issue—and the question framed, not in terms of the rights of citizens (or specific citizens) to own guns at all or specific kinds of guns, but in terms of the basic right of all citizens, most definitely including children, to be safe from harm as they go about conducting their daily affairs.
We’ve managed this in other areas, after all. In a truly remarkable essay published last November, Nicholas D. Kristof made the striking point that, through a combination of innovation, legislation, and increased awareness on the part of the public, we have managed to reduce the likelihood of an American dying in an automobile accident by an unbelievable 95% since 1921. (To see Kristof’s essay, click here.) Even more to the point is that we have done so not by prohibiting the use of cars, not by making cars increasingly less powerful with each model year, not by continually raising the age at which young people can get driver’s licenses, not by requiring background checks before permitting a dealer to sell a car to anyone at all, and not by requiring people to acquire government-issued permits to purchase motorized vehicles. Instead, we allowed what we know of cars—and, no less crucially, what we know of the people who drive them—to inspire innovation after innovation intended to diminish the likelihood of an American dying in a car accident.

We all know how this has been accomplished. Seatbelts were introduced in 1950 and their use eventually made mandatory in almost every state. Federal safety standards were first imposed on automobile manufacturers in 1968. The national 55 m.p.h. speed limit was imposed on most American highways in 1974. Car safety ratings, giving consumers the opportunity to purchase vehicles based on the degree to which they were considered safe to operate by unbiased experts and not merely the degree to which they were touted as such by their manufacturers, were introduced in 1993. Front-seat airbags became mandatory in 1999. We introduced mandatory reporting of defects by car manufacturers in 2000. And the result? In 1946, there were 9.35 deaths per 100 million miles driven in the United States. In 2016, there were 1.18 deaths per 100 million miles driven (a decline of roughly 87% over those seventy years alone). That is a truly amazing statistic, one all Americans should bear in mind as they search for a way to make our schools safe and protect our children. It surely can be done. We just need to figure out how.
As Kilauea continues to erupt in Hawaii, there has apparently been a resurgence of interest in Madame Pele, the traditional Hawaiian goddess of destruction imagined to govern that fiery mountain and to control its lava flow. I doubt most Hawaiians take these beliefs too literally, although there are apparently those who take them very seriously indeed. (Click here to read more.) Nor is the idea of a god or goddess of destruction unfamiliar to me—the Israelites themselves used regularly to flirt with the idea of bringing some version of Mot, the Canaanite god of death and destruction, into the Israelite cult as a kind of sub-deity deemed responsible for destruction and death in the world. The prophets inveighed against that kind of potential deviation from strict monotheism, but I can certainly understand the appeal of explaining away at least some of the terrible things that happen in the world by blaming them on perverse deities intent on bringing mayhem to the world. But when it comes to the scourge of gun violence in our land (and particularly the version directed at children or teenagers in school), it feels ridiculous to blame the situation on malevolent gun gods or on our national ethos, or to describe it as the inevitable consequence of our right to bear arms.

It’s easy to be cynical. I’ve lost track of how many times I’ve heard people say in the last little while that there simply is no solution, that if Sandy Hook wasn’t enough to rouse Americans to action then nothing ever will be. I suppose there’s something to that. But the dimensions of the problem need to rouse us to action anyway: if you include suicides, there have been more gun deaths in our nation’s history (about 1.4 million) than deaths in all the wars in which our country has participated since the Revolution itself (about 1.3 million, as Shelter Rockers who come to Yizkor all know). In most years, more nursery-school-aged children die from gunfire than police officers die in the line of duty. We have created this situation and I simply can’t imagine that we can’t also solve it.

Thursday, May 10, 2018

Torture and the American Soul

Gina Haspel, President Trump’s nominee to head the CIA now that former head Mike Pompeo has become Secretary of State, is facing stiff opposition on Capitol Hill primarily over two specific issues: the question of the use of torture to extract information from prisoners at a secret CIA “black site” detention facility in Thailand that Haspel supervised in 2002 and her role in the 2005 destruction of almost 100 videotapes of interrogation sessions, some of which are thought to have captured scenes of actual prisoners and detainees being tortured. Speaking indirectly to both charges, Haspel on Wednesday told the Senate Intelligence Committee that she would not reinstitute the brutal interrogation techniques that were in use in Thailand and elsewhere at the time she was in charge of the site there. Haspel did not, however, condemn torture as an absolute wrong, thus suggesting that there may well be circumstances under which even the most violent, pitiless, and ruthless techniques for eliciting information from detainees could be justified.

Most of the authors I’ve been reading lately who oppose the use of torture to extract information from prisoners fall largely into one of two categories.
Some understand the use of torture regardless of circumstances to constitute what legal philosophers call a malum per se, something that is morally wrong “in and of itself,” in which moral wrongness inheres by its very nature, as opposed to a malum prohibitum, an act that is wrong only because it is prohibited by law. (The law requiring people to drive on the right side of the road would be a good example of the latter—a perfectly reasonable law that forbids behavior no one would describe as intrinsically evil.)

Still others, perhaps less philosophically inclined, oppose torture on the practical grounds that they feel that it rarely, if ever, yields truly useful information because it merely brings the individual being tortured to the point at which he or she will say anything at all to gain relief and because information so acquired is therefore highly unlikely actually to be accurate. As an example, people in this camp point to the torture-obtained admission by Khalid Sheikh Mohammed, the Pakistani national named in the 9/11 Commission Report as “the principal architect of the 9/11 attacks,” that he had recruited black Muslims in Montana to carry out future terror attacks, a confession that he later recanted and which apparently had no truth to it at all.
To help refine my own thinking on the matter, I’ve had recourse in the last few days to two important works: Torture: A Collection, an extremely interesting and rich collection of essays published by Oxford University Press in 2004 and edited by Sanford Levinson, a professor at the University of Texas Law School; and a very long and detailed essay by Rabbi J. David Bleich called “Torture and the Ticking Bomb” published in 2006 in Tradition, the quarterly journal of the Rabbinical Council of America, the largest organization of Modern Orthodox rabbis. (Used copies of the Levinson book are available for purchase online for less than $2.50; to see Rabbi Bleich’s essay, click here.) In the Oxford volume, I was particularly taken with the essays by Miriam Gur Aryeh and Alan Dershowitz. But I’d like to focus primarily today on Rabbi Bleich’s argumentation.

After a long and very interesting survey of modern and pre-modern approaches to the topic, he turns to a specific question of Jewish law, the one concerning the ticking bomb mentioned in the title of his essay. He begins by noting that the Torah’s commandment at Leviticus 19:16 to the effect that one may “not stand idly by the blood of another” has been interpreted since ancient times to mean that there is a legal, not merely a moral, obligation to come to the aid of someone whose life is in danger. And then he poses the question about torture against the background of that concept by proposing a situation in which terrorists have placed a weapon of mass destruction, say a “dirty” bomb or even a more sophisticated nuclear device, in a public place where it will take the lives of thousands if it explodes. And let’s imagine further that one of the terrorists, one who is considered at least likely to know the specific location of the bomb, is apprehended, but refuses to reveal what he knows. Is there a moral limit to the amount of force that may be applied to extract that information from such a prisoner?
There are lots of ways to approach the question. Is it a matter solely of numbers? In other words, the example above imagined thousands of lives on the line. But what if it were tens or hundreds of thousands? What if it were millions? And that is where the distinction between a malum per se and a malum prohibitum comes into play. Is the prohibition of torture a line that by its very nature may never be crossed? Or is it just a bad thing that rarely produces good results and that the law therefore rightly forbids…but which could be morally justified under certain specific circumstances? An old Latin maxim teaches fiat justitia et pereat mundus (“let justice prevail even if the world be destroyed”). Those words have a noble ring to them…but the acid test is not whether you would learnedly cite them in a law school application essay but whether you, who are on record as abhorring the use of torture, would dare say them aloud to someone whose children are in the city where the bomb has been planted and where it will probably, or even just possibly, explode if its location is not discovered in time.

Rabbi Bleich begins his analysis by introducing the concept of the rodeif, the “pursuer.” According to Jewish law, when someone’s life is in danger because that individual is being pursued by someone who appears intent on killing him or her, it is deemed permissible to save the individual being pursued even if the sole means available to do so requires taking the life of the pursuer. This is how the mandate not idly to stand by the blood of another mentioned above is applied in our sources: there is an absolute obligation to safeguard life and, in a situation in which one individual is attempting to kill another, it is not merely permitted but required to do what it takes to make sure that the pursued party survives even if doing so costs the pursuer his or her life.
In the situation described above, the one in which the apprehended individual has information about the location of the bomb, is that individual a rodeif? What if there is no evidence that this specific individual did anything at all—and certainly not that he armed the bomb or knows how to disarm it—but merely might know where it is? Is that enough to make him into a rodeif whose survival may be risked for the sake of saving others? That, Rabbi Bleich maintains, is the question at the heart of the matter. The prisoners the CIA waterboarded to make talk were not at that moment trying to kill anyone. But they had information, or possibly had information, that could have saved the lives of innocents. Does that justify doing what it takes to make them tell what they know? (And, no, you don’t get to have a different opinion depending on whether the innocents in question are your own children or other people’s.)

What comes through in Rabbi Bleich’s analysis again and again is how complicated this all is. When the prisoner is being tortured, there is often no way to know in advance if he possesses any useful information at all. Nor is it possible to know in advance if the information successfully elicited will have any worth. Also in the mix is the understanding of our Torah that the prisoner too has a sacred obligation to save human life. So getting such a prisoner to provide information that could conceivably save thousands is somehow both an assault on his or her physical being and a way of assisting him or her to behave ethically.
For most of us, it also depends on how the question is phrased. When asked if torturing prisoners to see what they might know is morally defensible, for example, I think most of us would easily answer in the negative. But when asked if there should be limits placed on the CIA agents attempting to elicit information that could potentially save the lives of, say, a thousand school children, I think most of us would answer quite differently. Nor does this inconsistency have to do solely with the number of potential lives saved. When the question references physical pain, as in electrical shocks or simulated drowning, some of us would condone torture and others would oppose its use. But if the question were to be rephrased to reference sexual assault—for example, if we were to be asked to approve of a female prisoner being repeatedly and brutally raped to elicit information from her, even of the kind that could save innocents—my guess is that most of us would refuse categorically to condone the practice.

Finally, Rabbi Bleich mentions the concept of hora∙at sha∙ah, the temporary suspension of the law. Americans will think back to the Alien and Sedition Acts of 1798, to Lincoln’s Civil War suspension of habeas corpus, and to the detention without trial of Japanese-Americans during the Second World War. But this notion that even the most basic laws may be suspended in times of great national peril is part of Jewish tradition as well and extends under certain circumstances even to the most basic prohibitions. So, in the end, the question really is whether the war against our nation’s enemies, particularly violent extremists like Al-Qaeda and the Islamic State, is enough to justify the suspension of our natural disinclination to condone torture as a “regular” means of eliciting information from our nation’s foes. Some will stick to the malum per se argument and say, with the old maxim, fiat justitia et pereat mundus. But others will think back to 9/11 and recall that our tradition also teaches that saving even a single life is the moral equivalent of saving the entire world.

Thursday, May 3, 2018

The House Chaplain


It came as quite a surprise to me the other day when I read that the Rev. Patrick J. Conroy, a Roman Catholic priest, was fired from his position as Chaplain to the House of Representatives by the Speaker of the House, Paul Ryan. Why exactly Father Conroy was fired has not been made entirely clear, although the Speaker did insist—and I know how odd this is to say—that religion wasn’t really a factor here and that he, the Speaker, himself a Roman Catholic, was not prompted to act out of any reluctance to have a Catholic priest as “his” chaplain. (Father Conroy was originally appointed in 2011 by Roman Catholic then-Speaker John A. Boehner.) Another suggestion, widely cited, was that the priest crossed a line—at least as far as the Speaker of the House was concerned—when, during the recent debate about the overhaul of the income tax system, he prayed in public that our lawmakers act to “guarantee that there are not winners and losers under the new tax laws, but [rather] benefits balanced and shared by all Americans.” But it seems to me at best unlikely that such an innocuous prayer could have been the straw that broke the camel’s back. (What should the chaplain have prayed for? That our lawmakers create a new tax code that will only benefit some specific Americans but not the country as a whole?) And it is true that Speaker Ryan took issue with that prayer, admonishing the chaplain for failing to stay above the political fray and apparently for also forgetting that he was in place solely to pray publicly for things so little contentious that no one could possibly care what the prayer said anyway. So maybe that was the reason!
The House itself responded along party lines almost to a person: of the 148 representatives to sign a letter to Speaker Ryan insisting that he reveal the precise reason or reasons that he dismissed Father Conroy, only one signatory, North Carolina representative Walter B. Jones, was a Republican. Nancy Pelosi, the Democratic leader in the House, raised the interesting question of whether the Speaker actually had the authority to fire the House Chaplain in the first place. And she also revealed that Speaker Ryan had offered her a specific reason for wishing to be rid of the House Chaplain, one unrelated to that prayer about the tax code: because he had given an interview to The National Journal in which he spoke openly about sexual and workplace harassment issues, and in which he raised the possibility that the entire Congress was in the grip of a spiritual crisis the specific nature of which he did not identify.

I have no specific opinion about Father Conroy’s qualifications, having neither ever met the man nor heard him speak. But it isn’t the specific details regarding his dismissal that intrigue me, although I do have to admit that I’m curious how this will all play itself out. No, naïve citizen that I am, I was surprised to learn that the House of Representatives even has a chaplain hired (and paid a salary) to serve as its spiritual leader. Who knew?

Obviously, the ideal would be to have a House chaplain who has no specific allegiance to any specific religion. But, given that a chaplain is by definition an ordained individual trained in some specific faith tradition, seeking a chaplain who merely represented “religion” but not any specific one of the world’s religions would be something like trying to hire an orator to deliver an address in “language” but not in any specific one of the world’s languages. And yet, given the allegedly iron-clad wall erected by the founders between church and state, isn’t that exactly what we all should want, a chaplain whose allegiance is to “religion” itself, but not to any specific one of them? But even wishing for that specific (impossible) thing presumes that Congress wants or needs a chaplain at all.

The first thing to know is that this all goes back a very long way. After being prodded to action by Samuel Adams, the First Continental Congress appointed one Jacob Duché, an Anglican priest, as its chaplain in September 1774. There wasn’t even lip-service paid to the concept of nonpartisanship: Father Duché led the Congress in Anglican worship and delivered extemporaneous prayers rooted in his Anglican tradition. Later, after the Constitution called into being a bicameral legislature, Congress resolved that there should be two chaplains, one for the House and one for the Senate. And, indeed, starting in 1789, chaplains were called upon to open both houses of Congress with a prayer. As the years pressed on, chaplains from different Protestant denominations were appointed to both houses of the legislature. This seemed fair to some, but not to all—James Madison, for one, was opposed to the whole practice on the grounds that, since it would be unthinkable to invite a Catholic priest or a Quaker to serve as chaplain, it was not in keeping with the spirit of the Constitution to invite solely Protestant clergy to participate. (Note that Madison was not proposing that the chaplaincy actually be opened up to Catholics or Quakers, just opining that because it was unimaginable to do so the best course forward would be to cancel the whole concept and have no one at all in the position.) I can’t even begin to imagine what people who considered the appointment of a Catholic priest to be inconceivable would have had to say about the notion of a rabbi serving as House chaplain, let alone an imam or a Native American shaman. Or rather, I can. There have, at any rate, been fifty-two different individuals who have served as House Chaplain. All were men. All were Christian. All but one were Protestants. Probably, I should hold onto my day job even if the position is currently vacant.
The Senate chaplaincy has a similar history. The Senate, meeting for the first time in New York in the spring of 1789, selected Reverend Samuel Provoost, the Episcopal Bishop of New York, to serve as its chaplain. This set a pattern and, when the Senate moved to Philadelphia the following year, that city’s Episcopal bishop was appointed chaplain. To date, there have been in total sixty-two chaplains of the Senate. All have been men. All have been Christians, all but one Protestants. The current chaplain of the Senate is Rear Admiral Barry C. Black (Ret.), a Protestant minister and former Chief of Navy Chaplains.
Over the years, there have been regular challenges to the practice. (For a detailed account of those challenges, click here to see an essay on the topic by Christopher C. Lund published in the William and Mary Bill of Rights Journal 17:4, which also offers a truly fascinating survey of our nation’s ever-evolving attitudes towards religion.) But none, including one suit that got all the way to the Supreme Court, has been successful. I certainly do not expect any to be successful in the future. But, contrary to what readers might expect, I do not wish that the chaplaincy were more inclusive and that, say, rabbis such as myself were invited to participate along with the clergy of all faiths represented in our American mosaic. If we’re going to have a chaplain serving Congress, I suppose he or she should be chosen from the full spectrum of religions to which Americans adhere. But far better, and far more principled in my opinion, would be the decision to dispense with the chaplaincy entirely.

It is never in the best interests of Jewish Americans—or, for that matter, of any Americans—for the wall between church and state to be breached, and that is so even when the breach in question appears to be relatively benign and inconsequential. Why should anyone care, after all, if someone stands up when the Senate opens to recite some prayer that the members of the Senate either do or don’t listen to, but which none is obliged to take to heart or even to consider in passing? When considered in that specific light, the issue appears hardly to matter at all. Even I think that!
But when considered in terms of the larger picture, it does indeed matter. Several years ago, I explained in one of my weekly letters why I found the White House seder introduced during the Obama years so irritating. (To revisit those remarks, click here.) I also find the White House Christmas tree inappropriate to the point of almost being vaguely threatening. And I certainly do not, in some peculiar calculus of impropriety, find comfort in the willingness of the last few presidents to host White House Chanukah Menorah lighting ceremonies.


In my ideal version of America, the wall between church and state would truly be impermeable. The President has private quarters in the White House—that is where he or she should celebrate the festivals of his or her faith and where the symbols of those festivals should be displayed and enjoyed, just the same as in anyone’s private home. The Congress, on the other hand, should assert its impartiality by showing no favoritism to any faith at all.  And to imagine that the fact that all 114 of our congressional chaplains have been Christian men is not overtly suggestive of precisely the kind of favoritism (and gender bias) our national ethos supposedly derides, discourages, and disdains—that seems to me to be, at best, a fantasy rooted in our desire to be something that we have not yet quite become.

I have no real idea why Speaker Ryan dismissed the House Chaplain, but I wish he would move forward in the wake of that dismissal and suggest that the position itself be abolished along with its parallel in the Senate. Or, at the very least, that it should be filled by someone able to deliver an opening prayer in “language” itself…but not in any specific one.


Thursday, April 26, 2018

Dr. Sims' Legacy


Regular readers will know that there are certain specific themes that surface again and again in my writing. The nature of heroism is one of them. And the specific relationship of the Jewish people to the Land of Israel, and particularly to Jerusalem, is another. But the single topic to which I believe I have returned most often over these last eleven years of weekly letters is the question of the reasonableness of expecting an individual to transcend the beliefs that are considered ordinary and uncontroversial in his or her society, and in so doing to see the world as it truly is and to act accordingly. It sounds as though it should be the simplest thing in the world to look out at the world through your own eyes…and then to process and respond to what you are actually seeing and not what you’ve been told, in some cases even since childhood, that you see. In reality, however, it is not only a challenge, but—at least for some—among life’s most daunting, difficult tasks.
In that context, I have wondered how reasonable or rational it would have been for the world to expect a German child, every single one of whose authority figures at school, in church, and even at home was either a Nazi or a Nazi sympathizer, to condemn Nazism—and particularly Nazi anti-Semitism—as something depraved and debased, and to risk everything to save, say, a Jewish child facing deportation to the camps. I’ve written about slavery in antebellum America in that vein as well, and particularly about the way so many clergypeople, Christians and Jews alike, seemed almost weirdly incapable of transcending what everybody knew about race to see slavery as the brutal affront to human dignity that it seems to us so clearly to have been. And I’ve also written about the strange inability of so many Jews for so long—and we are talking here not about centuries but about millennia of persecution and endless exile—to see that they didn’t actually have to live in exile at all, that they could just as easily have made their way to Eretz Yisrael and settled there, that they could have claimed their rightful patrimony not by praying that it come to them via miracle or as a chapter in the coming redemption of the world, but actually by making it so through the labor of their hands and the sweat of their brow.

Clearly, it isn’t that easy to look around at the world, to take note of details that everybody knows to be true…and to reject them as absurdities without caring how that act of rejection might conceivably impact on you personally and on your place in society. Nor do you win any points for knowing better than people in past centuries with respect to propositions that the whole world has long since rejected! The challenge is to look at the actual world in which you live, to take note of its givens and its most (not least) normative beliefs and opinions, and then to have the moral courage to reject as false the propositions that seem to you personally not to be in harmony with what you actually see of the world when you look out at it through your own eyes and allow the matrices of your own working brain intelligently and morally to process what you see.
So these were the thoughts that came to mind as I saw the pictures in the paper the other day of workers removing the statue of J. Marion Sims from its pedestal on Fifth Avenue just outside of Central Park. I mentioned him in a letter last October when I was writing about the issue of whether statuary honoring Confederate war heroes should be removed or left in place. (Click here to revisit that discussion.) But I didn’t elaborate much on his situation, which is why I’d like to return to it today.

J. Marion Sims was, by all accounts, one of the great physicians of his day. Indeed, he is often acclaimed as the father of modern gynecology and as the personal innovator of any number of surgical procedures that have helped countless women, including the repair of both vesicovaginal and rectovaginal fistulas, twin problems that ruined the lives of countless women before he proposed a successful way to intervene surgically to fix them both. And if that—and all his many other achievements—were all there were to his story, his would be an uncontroversial legacy of medical greatness and his place in the pantheon of surgical and medical innovators would be uncontested.
The problem, however, is that, between 1845 and 1849 in Montgomery, Alabama, Sims developed his surgical technique to repair the kind of fistulas mentioned above by conducting experiments on slaves who were delivered to him by their owners to see if he could fix their problems. Three of them are known to us by name, Anarcha, Betsy, and Lucy, and it was on them that he performed the bulk of the experiments that led him to the discovery of surgical techniques still in use today to assist women. And that leads to the question I would like to consider today: how shall we, sitting on the moral high ground more than a century and a half after the passage of the Thirteenth Amendment, judge a southerner in antebellum Alabama who didn’t see slavery for what it was?

It feels as though reasonable arguments could be made in both directions. On the one hand, these women actually were suffering from the problem Sims set himself to solving and, as noted, each was cured as a result of his efforts. On the other hand, Sims had no way to know if his experimentation would be successful and was, in effect, using these women as human guinea pigs to see how they would respond to the various ideas he had about how to repair the kinds of fistulas that so plagued women in his day. It was a long, unpleasant process for the women involved. Anarcha, for example, underwent thirteen different surgeries before her recto- and vesicovaginal fistulas were declared permanently and successfully repaired.
Also, none of the women was anesthetized before her surgery. (They were, however, given opium to address their pain after the procedures were completed.) To be fair, though, I should note that the 1840s were the very years in which the use of anesthetic drugs before surgery was first being introduced, and it was apparently not uncommon for doctors trained in an earlier era to resist their use. Indeed, when Sims later moved to New York to found the Women’s Hospital, he declined to use anesthesia on his (white, non-slave) patients too when he set out to repair their fistulas surgically.

Sims’ own testimony regarding the crucial issue of consent is worth citing. Writing in 1855 in the New York Medical Gazette and Journal of Health, he described the arrangement as follows:


For this purpose [of therapeutic surgical experimentation] I was fortunate in having three young healthy colored girls given to me by their owners in Alabama, I agreeing to perform no operation without the full consent of the patients, and never to perform any that would, in my judgment, jeopardize life, or produce greater mischief on the injured organs—the owners agreeing to let me keep them (at my own expense) till I was thoroughly convinced whether the affection could be cured or not.
And so the water becomes considerably muddier. He claimed he had their consent. But what could that mean when the women involved were the property of their white owners, who, he openly writes, “gave” the women to him to see if he could find a way to cure them? Was he lying about their giving consent? What could that consent even have meant? What he really said to them, no one knows. What would have happened if any of them had declined to proceed, ditto. But what we do know is that these were slaves, the human property of their white masters, and that they were given by those masters to J. Marion Sims for medical-surgical experimentation.

Was Dr. Sims a precursor of Dr. Mengele? That seems exaggerated—Dr. Mengele conducted incredibly cruel, pseudo-scientific experiments on innocents at Auschwitz and elsewhere, and was probably insane. Dr. Sims, on the other hand, was a serious physician not at all unreasonably revered by many as the father of gynecology and as the individual still regarded, even today, as the one most responsible for solving some of the most terrible medical problems specifically faced by women in his day. Clearly, Sims was unable to see slavery as the morally depraved institution it surely was and took advantage of black women “given” to him by their owners to further his work. And so we are left with a relatively simple question: does the fact that the doctor’s experiments were successful and led to tremendous advances in the care of women obviate the fact that the experimentation that led to those advances was done on un-anesthetized slaves who were apparently called upon to hold each other down during the experimental surgical procedures performed on them?
The City of New York has decided that it does not. And just a few days ago, the statue—in place across from the New York Academy of Medicine since 1934 and for two decades before that in Bryant Park—was duly removed and will be set up near the doctor’s grave in Green-Wood Cemetery in Brooklyn.

It’s a difficult issue to resolve. All agree that the man did great good and that his work led to countless women being rescued from horrific ailments. All also agree that the path the doctor followed to that great good was morally flawed, and more than just slightly. And so we come back to the question I opened by asking: how reasonable is it, exactly, to condemn people ex post facto for not seeing the world as we see it now, for not having the insight to transcend the things everybody just knows about the world and its institutions, for not being able to see what millions in their day also could not clearly see?

The world has changed considerably since I was a child. Attitudes towards gay people, towards the place of women in society, towards the place of racial minority groups in American culture, even towards animals in the zoo—all of these have undergone sea changes since I was a child learning about the world from my teachers and, because my parents taught me to respect those teachers, believing what I was told. Should I be condemned for not having had the maturity as a nine-year-old to see things more clearly than my own teachers, than my own parents? I bristle at the thought. But I also wonder what Anarcha, Betsy, and Lucy would say if the fate of Dr. Sims’ legacy were placed in their hands to determine. Ultimately, the decision really should be theirs to make. But given the impossibility of resolving the issue that way, I think the City of New York probably acted correctly in moving the statue to a position near Dr. Sims’ grave and taking it from its pre-eminent position on New York’s grandest avenue facing one of its pre-eminent medical institutions.

Thursday, April 12, 2018

Yom Hashoah 2018


The first Yom Hashoah was observed on December 28, 1949, a date chosen by the Israeli rabbinate not because it bore any connection to any specific Holocaust-related event, but because it corresponded to the tenth day of the Hebrew month of Tevet, a minor fast day that already existed and which, it was felt, could reasonably be co-opted to do double-duty both as a marker in the annual cycle of days connected to the destruction of Jerusalem in the 6th century BCE and also as a national day of mourning for the k’doshim who died as martyrs during the Second World War. The intended symbolism was clear enough—that if we survived the Babylonians and we survived the Germans, we can survive any onslaught directed against us—and was surely intended to inspire hope for the future of the State of Israel. But as Israel became a powerful nation defended both by a mighty fighting force and a formidable arsenal, the need to co-opt the Shoah as a symbol of hope for a national future for the State was destined to fade.
And so, although Yom Hashoah was observed on the tenth of Tevet again the following year, in 1951 the Knesset voted formally to establish the twenty-seventh of Nisan as Yom Hashoah instead. That date too bore no connection to any single event of the Holocaust (although it does fall within the weeks of the Warsaw Ghetto Uprising), but it was considered a meaningful choice nonetheless because it fell squarely between Pesach, our annual festival of freedom, and Yom Ha-atzmaut, the day on which Israeli independence was declared in 1948. The message embedded in this date too was clear enough: that the establishment of Israel as a sovereign state was the only rational response to the Shoah, just as the exodus from Egypt itself had once been the only possible solution to the enslavement of the Israelites. That, in my opinion, is a profound thought. But is it enough to motivate diasporan types such as ourselves to embrace Yom Hashoah not specifically as an Israeli holiday, but as a Jewish one as well?

It’s not like we don’t have alternate dates to consider. International Holocaust Remembrance Day, for example, was established by the United Nations in 2005 and scheduled for January 27, the day the Red Army liberated Auschwitz. It was meant as a memorial day not only for the six million Jews murdered by the Nazis, but also for all the other victims of Nazi terror—including particularly the quarter-million mentally ill individuals euthanized by the German authorities, as well as the 200,000 Roma, the 9,000 gay men, and the uncountable other innocents murdered by the Germans and their willing henchmen. Surely, no sensitive Jewish soul finds anything objectionable in preserving the memory of all the victims of the Nazis, not solely the Jewish ones. But there too the choice of date is meant to teach a simple message: that just as the world acting in concert was able to defeat fascism even in its most brutally powerful version, so should the world’s nations, were they only similarly to act in concert, be able to defeat the forces of darkness that threaten the peaceful future of the world’s peoples today. That too strikes me as a profound thought, and my general inclination to hold the United Nations in contempt does not really spoil the cogency of the concept itself. But my question here nonetheless remains the same: is that notion powerful enough to justify an annual Shoah memorial day, not as a way of embracing the possibility of peaceful co-existence among nations working together to create a world of free people, but as a way of responding personally, as Jewish individuals, to the annihilation of European Jewry and of acknowledging the specific place the Shoah occupies, and possibly always will hold, in the consciousness of Jews the world over?
Maybe the concept itself is flawed. Memorial days are by their very nature inclusive, but is it really possible to include millions upon millions of people in a single gesture of recollective grief? I’ve just finished rereading Amir Gutfreund’s magnificent novel, Our Holocaust, which I found even more stunning the second time ’round. The book is remarkable in a dozen different ways, but the key element that it stresses over and over is how each single individual murdered by the Nazis was an entire universe—a whole world of culture, personality, and potential. Under normal circumstances, that is, of course, precisely how we do respond to the murder of a single child or of an adult: as a tragedy of indescribable proportions precisely because of the inestimable value of any human life.

When seventeen students and staff members were shot down in Parkland on February 14, Americans responded as one with a kind of national paroxysm of grief that has yet totally to abate. That felt and feels entirely normal. But during the summer of 1944, as many as 12,000 Hungarian Jews were murdered daily at Auschwitz. How can the same language be used to describe the murder of little Etan Patz, of the Parkland seventeen, and of the 437,000 Hungarian Jews murdered at Auschwitz in the summer of 1944? And 437,000 is less than a twelfth of the total number of Jews killed during the Shoah. No wonder there is something overwhelming even about the concept of having a memorial day to honor their memory—as though loss on that level even could be conceptualized, let alone conceptualized successfully enough for a single day of mourning to pay anything more than cursory, formal honor to their memory!
One tentative solution comes from, of all places, Germany itself.

There’s no easy way to translate the German word Stolperstein. The verb stolpern means “to stumble” and is, in fact, a distant cognate of the English word. A Stein is a stone. A Stolperstein, therefore, is a stone in the road that you stumble over, that you come across and pause for a moment to look at. But these Stolpersteine are not just inconveniently placed paving stones, but rather part of a remarkable effort first undertaken in 1992 by the artist Gunter Demnig, who had the idea to memorialize the Nazis’ victims one by one by placing a marker in the street at each individual’s last known address.
In the years since then, more than 67,000 such Stolpersteine have been set into the pavement all across Europe in more than 1,200 different municipalities. They do not solely mark the last known address of Jewish people, however—they honor the memory of all the Nazis’ victims, including Allied soldiers murdered by their German captors in flagrant violation of every conceivable standard of decency in wartime. In Germany, each inscription begins with the words hier wohnte, the German for “here lived,” which are then followed by the name of the individual being memorialized in his or her own home setting. When the number of victims connected with a certain address was simply too great for individual stones bearing the name of each (as when Demnig’s team wanted to memorialize the 1,160 mentally ill individuals deported to their deaths by the Nazis from the train station in Stralsund, a picturesque town on the Baltic Sea), they came up with the companion notion of a Stolperschwelle, a “stumbling threshold” set into the ground beneath the train station’s front entrance through which the unfortunates were made to march on their way to the trains that took them to their deaths in Poland.


It’s all too much to fathom. I live in a normal world, one in which the trial of a nanny accused of murdering two children has been written up in the paper on a daily basis for weeks. People’s interest in that trial hardly needs justification because the murder of children is among the most horrific crimes imaginable. And yet…this is the same world, this one in which the nanny is on trial for the murder of two—this is the same world in which twenty-five hundred children in the Kovno ghetto were sent to Auschwitz on two consecutive days in March 1944.

Maybe Gunter Demnig has it right—the numbers are just too great to contemplate, let alone actually to conceptualize, and the only reasonable way to mourn is on a victim-by-victim basis: one man, one child, one woman, one address, one year of birth and another of death, one fate, one loss.
When Sam Solasz, the father of our fellow Shelter Rocker Mark Solasz, spoke at Shelter Rock on Thursday evening, he was one man telling one story. It was gruesome, to be sure. And although he personally survived, he was able to make clear how unlikely his personal survival was…and how atypical his fate when compared to the Jews of his hometown and the other members of his family. His was one story…and that was perhaps why it was fathomable: one man, one set of details, one story of survival. That the dead aren’t here to tell their stories leaves us in an obvious quandary, but the idea in both cases is the same: the burden Yom Hashoah lays at our feet is not to nod mutely at some unfathomable number, but to find the courage to accept that each individual victim of the Nazis was the loss of an entire world, of an entire universe. It feels almost as though the burden is too heavy for any of us to shoulder alone. But that is the whole point of coming together as a community on Yom Hashoah, of course: to bear what none of us could bear alone, to shoulder a burden none of us could carry alone, to mourn in a way that none of us could manage on our own as individuals.

May they all rest in peace, those with graves and those with none, those with surviving family members and those with none, those with Stolpersteine marking the homes they were forced to leave on their journeys to oblivion and those whose homes are as yet unlabeled…or forgotten and unknown.

Friday, March 9, 2018

Possible/Impossible


I was struck by several different things when I read the obituary the other day of Roger Bannister, the first person recorded to have run a mile in under four minutes.

Bannister, who died in Oxford, England, at age 88 last Saturday, achieved world-wide fame for his feat despite the fact that he wasn’t necessarily the first person ever to run a four-minute mile, there having been human life on earth for about 200,000 years but the stop-watch only having been invented in 1821. That leaves roughly 199,800 years during which no one knows how fast anyone ran and races were won merely by running faster than the other people in the race, without anyone knowing anyone’s actual time. Nonetheless, it was considered in its day—and is still considered—a remarkable accomplishment, the doing of something that it was widely thought simply could not be done.

It wasn’t under four by much: his time was 3 minutes and 59.4 seconds. Nor did his record stand for long: the next person to run a mile in less than four minutes, an Australian runner named John Landy, replicated Bannister’s feat just a few weeks later and even managed to shave 1.4 seconds off Bannister’s time. Still, Bannister’s accomplishment was not the momentary blip in the record books it could have been: by the end of the 20th century, the International Association of Athletics Federations had certified the breaking of the record for the fastest mile no fewer than 32 times, culminating in the 3-minute, 43.13-second mile run in 1999 by a Moroccan runner named Hicham El Guerrouj. Of course, not every runner who runs the mile in less than four minutes breaks the standing record. And, indeed, since Bannister set his record on May 6, 1954, well over a thousand runners have been certified to have run a mile in less than four minutes.
Bannister’s subsequent story is also quite interesting. Realizing, I suppose, that there wasn’t actually any way to earn a living as a competitive runner, but also knowing himself well enough to understand that he wished to pursue a career in medicine rather than in the world of professional or amateur athletics, Bannister went on to attend medical school and from there to become a distinguished neurologist. In 2004, on the fiftieth anniversary of his accomplishment, Bannister was asked by an interviewer if he considered being the first to break the four-minute mile to have been his life’s crowning achievement. Bannister’s response, modest and thoughtful, was that he considered his four decades of medical practice to be the great achievement of his life, particularly when the various new neurological procedures he personally introduced were taken into account. In a world that seems so often to value celebrity over mere accomplishment, it sounds at first like a surprising answer. But why should it be? When you think about it carefully, pathetic indeed would be the individual who devotes an entire life to the care of the sick and the development of innovative techniques to cure them, yet who considers all that good to be outweighed by having one single time run a mile really, really quickly.
I write about him today, though, neither specifically because of his death last week nor because of the record he broke per se, but rather because of what the whole incident says about the possibility of impossibility. Or, rather, about the whole concept of impossibility itself.
We could begin by asking where the notion that the four-minute mile was an impossibility came from. It obviously wasn’t true—well over a thousand people have replicated Bannister’s famous achievement since that blustery, damp day in May 1954 at Oxford’s Iffley Road track when he earned his place in the record books—and there obviously can’t have been any actual data to back up such a wholly arbitrary assumption about human ability. Yet it was thought—and, as far as I can see, universally—that no human being could run that fast. Everybody just knew it. Just in the same way that everybody once knew that there was no way to sail west from Europe and end up in India. Or, in a slightly different key, that America would never elect a black president. Or that it would be physically impossible for human beings to travel to the moon and return safely. Or that cars could never drive themselves.
All of those are examples of things that everybody just knew…until somebody decided not just to know it and instead to proceed as though the allegedly impossible was just something no one had figured out yet how precisely to pull off. Taking this thought to its natural conclusion, the great science-fiction author Robert A. Heinlein once wrote that, until it is done, “everything is theoretically impossible. One could write a history of science in reverse by assembling the solemn pronouncements of highest authority about what could not be done and could never happen.” That more or less sums up what I think too.
As an interesting exercise in the possibility of impossibility, I’ve assembled a list of my three favorite things that everybody just (magically, somehow) knows are impossibilities. 
At the top of my list is the notion that peace between Palestinians and Israelis is simply impossible because the Palestinians, having failed to embrace partition in 1947, won’t ever give up their claim to every inch of Mandatory Palestine, which basically makes it impossible for Palestine and Israel both to exist. The Palestinian leadership is not especially flexible, that surely is true. Yet the world is filled with examples of nations that chose compromise over endless struggle, with countries (including our own, the U.K., Mexico, France, Greece, Hungary, Ireland, Germany, Poland, Ukraine, Japan, and many more) that simply decided to live in peace with their neighbors rather than to hold on endlessly to land claims that there was no reasonable expectation ever to see satisfied. The Palestinians have made such a fetish of their knee-jerk rejectionism over the years that it just feels like an impossibility to imagine them behaving differently. But if the Germans can move past the sense that East Pomerania (now part of Poland) and Alsace (now part of France) should be part of Germany, then the Palestinians can move past their irredentist claims as well. (Have you forgotten what irredentism is and why it’s an important term for students of Middle Eastern politics to understand? Click here!) The world just needs to find a way to nudge them forward in a way that feels constructive rather than degrading…and then the impossible will suddenly feel entirely possible.
Moving along to the Jewish world, my second example of something everybody just knows is that it will be impossible for non-fundamentalist religion to survive in the long run, that the adherents of the liberal versions of all faiths—including Judaism, Christianity, and Islam—are doomed by the very tolerance and reasonability they vaunt as primary spiritual values to lose the battle against assimilationism and, eventually, to lose their sense of purpose and of self. It surely is true that the more people are taught to view those outside their own group with suspicion and hostility (both hallmarks of fundamentalism), the more challenging it will be for members to feel justified in leaving the group. But it is also true that the virtues promoted by non-fundamentalist religion—open-mindedness, rationalism, and respect for alternate points of view—can exert a siren call on the human spirit as well, as evidenced by the millions of people who, despite all the predictions of doom, actually do belong to such faith communities. The further decline of non-fundamentalist religion in the West is not inevitable. And neither is it impossible to imagine a world in which it is the fundamentalists who perennially lose their people and in which the versions of religion that promote absolute spiritual and intellectual integrity grow almost without having to self-promote at all, let alone actually to proselytize door-to-door.
And my third example of something widely known to be impossible is an American one: the belief that it is simply impossible to imagine an American political landscape that features politicians reaching across the aisle to create policies and laws that benefit the nation as a whole through the strengthening of its core values and the legislative expression of those values. The common wisdom, as everyone knows, is that that kind of willing cooperation, desirable though it may sound, is simply not something that could ever be an actual feature of our legislators' work in Washington, that the whole Congress is so riven by factionalism and interparty dislike and mistrust that cooperation on the level necessary for our legislators actually to work together for the people, and not solely against each other, is simply an impossibility. And yet…why should that be the case? A plurality of our legislators are lawyers (about 43%), but all have other ways to earn a living yet have chosen to devote some or, in some cases, all of their professional lives to the service of our country. Surely at least some of them—maybe even most—could make more money elsewhere! The notion that they are all agenda-driven, that nothing matters to any of them more than pushing his or her personal set of initiatives without respect for the public weal or the nation's best interests—that seems, at the very least, to be only how things mostly seem, not how they inevitably have to be. Also worth noting in this regard is that almost 28% of the bills that pass in the House and in the Senate do so unanimously and without opposition. That points to a different reality than the one we've trained ourselves to expect from these people: if Congress is narrowly divided along party lines, with a slight edge for Republicans in the Senate and a slightly larger one in the House, how can more than a quarter of the bills brought to a vote be passed unanimously? Clearly, these people can work together when properly motivated! So that is not an impossibility either, just something we've been trained to think of that way!
And that concludes my list of possible impossibilities. None of my readers would mistake me for a natural optimist, but contemplating Roger Bannister in life and death buoys me slightly by making me remember that, in the end, most things deemed impossible are merely things that no one has managed to do just yet. May he rest in peace!

Thursday, February 15, 2018

Dreamers


I haven't been much of a fan of House Minority Leader Pelosi since she first rudely turned her back on Prime Minister Netanyahu when he came to address Congress about the Iran deal in the spring of 2015 and then tetchily exited the chamber before he had even left the podium. But I do have to say that I was impressed by the marathon oration she delivered in the House last week on behalf of the so-called Dreamers, a speech in which she read personal accounts written by young people facing deportation if no way out of our nation's immigration quagmire is found, quoted the Bible at length, and attempted to cast the issue as a moral one rather than a political one, let alone a legal matter best assigned to our nation's criminal justice system for handling. Man, she went on! The speech lasted more than eight hours, possibly the longest speech ever delivered in the House of Representatives and definitely the longest since 1909, when a representative from Missouri spoke for more than five hours about some now-long-forgotten issue related to tariffs and taxes. I was impressed, and not solely by her apparently iron-clad bladder (although by that too): I was also impressed that she was able to remain standing in four-inch high heels for the length of her entire speech. And, yes, also by her rhetoric, behind which were lurking the various issues that I'd like to write about this week, foremost among them the issue of the Dreamers themselves, which term has become the almost universally used name for individuals brought to this country illegally as children and now facing deportation as illegal aliens unless Congress can find some sort of solution to what has become one of the thorniest and most intractable issues facing the nation.

As everybody surely knows by now (even without listening to Minority Leader Pelosi’s speech), President Obama announced in June of 2012 that his administration was going to stop deporting undocumented immigrants who meet the criteria set forth in the Development, Relief, and Education for Alien Minors Act, a legislative proposal first introduced in the Senate by Senator Dick Durbin (D-Illinois) and Senator Orrin Hatch (R-Utah) in 2001, but which has never been enacted into law. The criteria are few and simple: to qualify under the act, a young person would have to have been under age 16 when arriving here, to have lived here for more than five years before the enactment of the bill into law, to have been between 12 and 35 at the time of the bill’s enactment, to be of good moral character (which the act leaves undefined but which appears mostly to mean that the individual has never been arrested and/or charged with too serious a crime), and to be enrolled in school if not already a high school graduate or in possession of a GED.

So that was then. But now that program, called Deferred Action for Childhood Arrivals, is coming to an end. Last September, President Trump announced that the Obama-era program was going to be wound down and instructed the Department of Homeland Security to stop processing new or renewal applications. The President then challenged Congress to deal with the issue by passing legislation that would incorporate a program for dealing with the Dreamers, which sympathetic name the President himself uses all the time in public discourse. That, as everybody knows, hasn’t happened. And so the first of those who were eligible to stay under the DACA program are facing deportation as early as next month.

The issue is a complicated one for all of us, but particularly for people like me.

On the one hand, I myself am descended from immigrants who came here not to escape war or to save their lives per se (although it can't be considered irrelevant that they would almost surely have been murdered with the rest of their Jewish neighbors had they stayed home in Nowy Dwór), but merely (merely!) to seek a better life in a free land characterized by almost unlimited opportunity for all. So to pretend not to hear the siren call of everything that truly makes America great—our prosperity, our values, our rigorous dedication to the promotion and preservation of human rights, our lack of a national religion that by its nature condemns the faithful of all other religious groups to outsider status, our representative democracy, and the impartiality of our justice system—to declare myself simply unable to grasp why anyone would want to leave his or her homeland and settle here instead, would be to deny the reality of my own family's story. How could any patriotic citizen not understand why others would want to live here?

On the other hand, however, I am not only descended from immigrants but also married to one. And to bring Joan here—and to procure first a Green Card and then full citizenship for her—that was no simple undertaking. There were, as most readers probably don't know from first-hand experience, a gazillion hoops to jump through: countless forms to fill out, affidavits to attest to, oaths to take, interviews to schedule and then successfully to complete, and years upon years of waiting until it was finally Joan's turn to appear in Citizenship Court to take her oath of allegiance to our nation and then proudly and enthusiastically to register on the spot to vote. We followed all the rules…so at the same time that I am influenced by my forebears' history, I am also influenced by my own family's experience.

Does it matter to me that it cost us a fortune in lawyers' fees to make this all happen correctly and in as timely a manner as possible? The truth is that that detail works on me in both directions at once: it makes me irritated with people who opt to save their money (and we are talking about a lot of money here) by just skipping the whole procedural nightmare and choosing instead to live here illegally, but it also makes me sympathetic to people who simply cannot afford to hire an immigration lawyer to smooth their path forward and who therefore have no choice but to attempt to negotiate this truly impenetrable thicket of confusing rules, picayune details, and nearly incomprehensible forms on their own and, for the most part, in a second or third language they may not even speak perfectly fluently. Yes, you are theoretically allowed just to fill out the forms on your own without a lawyer's guidance. But saying that is the equivalent of saying that you are allowed to fill out your income tax forms on your own (and with your own soon-to-be-non-deductible pencil) and mail them in without any help or advice from an accountant…or at least from an on-line tax-bot programmed to review your work and point out all the errors you made.

The endless harping on the moral character of the Dreamers strikes me as hugely irrelevant: one of the glories of our republic is supposed to be the blindness of a justice system that treats everybody fairly and equally, specifically not allowing extraneous details to influence the decision of a judge or jury with respect to an accused person’s guilt.

Far more relevant, in my opinion, is the fact that the initial illegal act in question here—a non-citizen coming to this country without permission and settling here illegally—was by definition undertaken by DACA-eligible young adults as children in their parents’ care. Some were babies, but even those who were toddlers or older children can hardly be made to suffer forever because of their parents’ bad deeds. Indeed, the real reason the Dreamers’ plight is so popular to talk about is precisely because it is so much easier to imagine their dilemma being resolved than their parents’, not to mention the other 11 million or so undocumented foreign citizens living illegally in this country, none of whom can claim that they were brought here by someone else and all of whom made the conscious decision to see if they could get away with breaking the laws that govern immigration to our country.

For the Dreamers, though, I have a solution to propose.

The rule—the entirely sensible rule—is generally that citizens of other countries have to apply to come here as would-be immigrants while still residing in their countries of origin. But what if that specific rule were to be relaxed in this one instance? What if we were to require Dreamers to acquire passports from whatever countries they actually are citizens of, and then to apply to "emigrate" from their home countries but without insisting that they actually set up residence there? That would require bending the rules a bit. But it would lead to two good things: requiring the Dreamers themselves to own up to the fact that they are citizens of the countries that they actually are citizens of (it's a sign of how strange this whole situation is that that sounds like a radical idea, even to me) and requiring that our government exert itself to solve a problem tearing our nation apart merely by bending a procedural rule slightly. Just for the record, Joan applied for her Green Card both times while resident in the U.S., something permitted to her because on both occasions—when we first married and then when we returned here after sixteen years abroad—she was here fully legally. So it's not as though it never happens. It's just not supposed to happen to people who aren't here legally. That's the detail I am proposing we relax in this one instance.

After that, the process should be the same one that applies whenever anyone applies to come here as an immigrant. For persons deemed worthy, the path should open up to acquiring first a Green Card and then, eventually, citizenship. Persons not approved for immigration should be helped, financially if necessary, to return home…and that should be the rule even if they don't think of that country as home at all. What can't go on forever is this ever-burgeoning number of illegals: we need to find a way either to make the undocumented among us into citizens or to help them find their way back to their home countries. Hoping the problem will go away if we ignore it long enough is not a rational way forward for our country.