Thursday, September 24, 2015

Murders Without Murderers

Most of my readers surely know by now of my predilection for murder mysteries, although for some reason mostly of the off-shore variety by authors like P.D. James, Georges Simenon, Arnaldur Indridason, Jo Nesbø, Colin Cotterill, Janwillem van de Wetering, Batya Gur, C.J. Sansom, and many others. But today’s letter is about murder not abroad but at home. And not the kind of murder generally featured in the books of the above-named greats either.

If I were writing a dictionary, I think I would define murder as the doing of some specific thing that leads directly to the death of another person. Clearly, pointing a gun at someone’s chest and shooting a bullet through his or her heart would qualify. So would feeding someone poison or pushing them off the roof of a tall building. But other variations on the theme make the concept feel murkier to me. In law, intentionality—that is, the question of whether the defendant intended to take the dead person’s life—is a big piece of the puzzle. But even that is a complicated concept for non-lawyers like myself to negotiate: what, for example, if the defendant can clearly be shown not to have intended to kill the specific person who died but could or should have known that his or her actions were inevitably or almost inevitably going to lead to the death of someone? Since most—okay, all—of my legal training comes from reading the authors mentioned above (plus the occasional John Grisham novel) and watching Law and Order on television, I’m not entirely sure how qualified I am even to have an opinion! Yet, even to a non-lawyer like myself, certain questions feel obvious. What should the precise definition of “almost” in the expression “almost inevitably going to lead to the death of someone” be? How long can or should the chain of responsibility actually be before it becomes ridiculous to consider someone even involved in, let alone legally responsible for, the death of another? How should the concept of awareness be folded into the batter? In other words, how aware of the potential consequences of one’s acts should one need to be to qualify as a murderer if those acts lead to the deaths of others? What if one had no awareness at all? Is that an excuse? Should it be? And, if it should be, then just how aware exactly does one need to be about the potential consequences of one’s actions?

My first story has to do with one Stewart Parnell, 61, formerly the CEO of the now-defunct Georgia-based Peanut Corporation of America and now about to spend the next twenty-eight years of his life, supposing he lives that long, as an inmate in a federal prison. The specific crimes with which he was charged are of the kind relating to interstate commerce that non-lawyers will find complicated to unravel, but the short version is that the defendant was convicted of knowingly shipping tons of salmonella-tainted peanut paste (the main ingredient in peanut butter) over state lines to major customers like Kellogg’s, accompanied by phony lab reports indicating that salmonella screenings had been duly conducted and that the results were negative. Kellogg’s and the other companies then used the peanut paste to manufacture various food products (and not solely peanut butter) that they sold under their own brand names. Yet email records uncovered by federal investigators revealed unequivocally that Parnell and others in his organization knowingly shipped to customers foodstuffs confirmed by lab tests to contain salmonella. (Parnell’s brother Michael was also convicted and sentenced to twenty years in prison.) The results were, literally, sickening: more than seven hundred people in forty-six different states were poisoned and nine died after eating foods made from peanut paste that originated in Parnell’s plant. Nonetheless, Parnell was specifically not tried as a murderer. Indeed, U.S. District Court Judge W. Louis Sands noted in his remarks both that the theoretical maximum number of years of incarceration to which Parnell could have been sentenced was more than eight centuries’ worth and that that was without his being specifically accused or convicted of murder. Indeed, the judge’s words were explicit: “This is not a murder case,” he said plainly. And, indeed, it wasn’t: the judge was speaking as a jurist, not an ethicist, and was merely commenting on the nature of the charges laid against Parnell, not extemporizing about the nature of his ultimate responsibility for the consequences of his actions. Nonetheless, let’s leave those words ringing in our ears as we move on to consider my next story.

This one is slightly more far-fetched, yet it too features people being killed without there being any actual murderers to prosecute. At a certain point, General Motors became aware of a huge defect in at least some of the cars it was selling to the public: a defect in the ignition switch that could cut power to a moving car and thereby prevent its airbags from deploying in a crash. Aware of the seriousness of the problem, but apparently hoping it would somehow just go away on its own, GM continued selling cars with these defective systems for ten years after becoming aware of the problem. The results were horrific: 120 deaths and another 1,385 individuals seriously hurt in accidents that were the direct result of their GM car failing to operate as promised. As a result of a deal announced just a few days ago by U.S. Attorney Preet Bharara, General Motors has agreed to pay a fine of $900 million…despite its official position on the matter being that it admits to no actual guilt. Settlements with individual parties are in the $575 million range. And on top of that is the cost, to be carried solely by GM, of fixing more than two and a half million vehicles sold under false pretenses when the company knew perfectly well that at least some of them would not perform as advertised in the event of an accident.

A fine approaching a billion dollars seems almost unimaginable to me. It is not the biggest fine ever paid out—it’s less than the $1.2 billion Toyota paid out last year to atone for its sloth in dealing with the unintended acceleration issue that plagued some of its vehicles, for example—but it surely is an impressive one. Yet, despite the fact that we are talking about specific actions that led directly to ten dozen deaths, no one is apparently going to be charged with anyone’s murder in this case either. Indeed, the New York Times reported the other day that the plan now is not to press any charges at all against any individual employees of General Motors. Let me quote from a September 16 article by Ben Protess and Danielle Ivory: “After more than a yearlong inquiry into the defect,” they wrote, “…federal prosecutors in Manhattan and the Federal Bureau of Investigation struggled to pin criminal wrongdoing on any one GM employee. They concluded instead that the problems stemmed from a collective failure by the automaker.”

As I keep reminding my readers, I’m not a lawyer. Maybe even that’s my problem, the reason I can’t quite understand how specific decisions can lead to more than a hundred innocent people dying, many in the prime of their lives, without anyone at all being indictable for their murder. (Isn’t that what it means when the government decides not to prosecute anyone for murder when someone is killed, that it cannot say that any specific person is personally culpable enough to be convicted in a court of law? Nor do I fully understand how a corporation can be responsible for its action without the actual people who made the decisions that led to those actions being similarly responsible. Corporations are just legal constructs, after all; only the people who work for them can actually make decisions. Maybe it’s not such a bad thing that I’m not a lawyer.) But I do find myself wondering how this story fits in with Stewart Parnell’s: he too did something that led to the deaths of innocents, yet was not charged with murder. I get it, obviously. Too many in-between steps. Too twisted the path from original deed to end-result. Too many intermediaries. Too little like putting a gun to someone’s chest and pulling the trigger. A line so curvy as by its nature not to lead anywhere in a straightforward enough way to make reasonable the hope for a conviction.

Other examples will sound even more far-fetched. Daraprim, a drug I hadn’t heard of until last week, is used in the treatment of toxoplasmosis, a parasitic infection that can cause life-threatening problems for babies born to women who become infected during pregnancy as well as for AIDS patients and others with compromised immune systems. The reason Daraprim was in the news last week is that its manufacturer, Turing Pharmaceuticals, raised the price of a single pill from $13.50 to $750, thus bringing the cost of a year’s worth of pills into the hundreds-of-thousands-of-dollars range. (At $750 a pill, even a regimen of just one pill a day comes to more than $270,000 a year.) I should mention that the outcry was so intense that Turing has agreed to lower the price, although without saying by how much. But if someone were obliged by the prohibitive cost to give up the drug and then actually did die as a result…would anyone be culpable? Certainly not the CEO of a drug company that manufactures a pill that someone could have taken but couldn’t afford to purchase! Wouldn’t that be like suing restaurant owners because someone in the neighborhood died of malnutrition after they declined to give their food away for free? But if someone does die as the result of a specific action undertaken by someone else…how can no one at all be responsible?


These are philosophical questions, not legal ones sensu stricto. Yet I find them particularly unnerving to consider at this time of year because I myself spend so much time trying to feel not guilty about the consequences of my own actions as long as they are far enough down the pike from where I am standing for me not to feel really responsible…even though it was my own action, my own decision, that led eventually to the consequence I am trying so desperately not to feel responsible for or guilty of. I understand the difference between moral responsibility and legal culpability, and I also understand that being ethically responsible for misery that you yourself didn’t cause but could have alleviated—an intrinsically Jewish notion that takes this line of reasoning even further afield of the criminal justice system—is not the kind of value the courts should or even could enforce. But as citizens of the world, we need to hold ourselves to a higher standard…and not to suppose ourselves free of guilt merely because we are so far along the chain of responsibility from the consequences of our actions that no district attorney would dream of indicting us for any specific crime. The legal system has its own rules and its own goals to pursue. But as we move into the third third of our season of judgment (the one that follows Yom Kippur and ends only with Hoshanah Rabbah, the last intermediate day of Sukkot), we need to look past the question of whether we could possibly be tried in court for some tragedy or another that has befallen someone somewhere and instead ask ourselves what it means truly and honestly to be responsible for the world and for the people with whom we share our planet. The earthly courts have their own rules, obviously. But that is specifically not the court in which we stand before God in judgment during these holiday weeks…and that thought bears consideration too as we move forward towards Sukkot.

Thursday, September 17, 2015

Go to Azazel!

I had a remarkable experience in the Judean Desert this past summer, one I resolved on the spot to keep to myself (mostly) and finally to write to you all about on this Friday between Rosh Hashanah and Yom Kippur. We’re often in Israel, but we don’t usually have too many guests come to visit us. This year, though, was the exception. Our daughter and son-in-law came to spend a week with us. We spent time with four different Shelter Rock couples traveling in Israel. And we had Joan’s brother and brother-in-law in and out over the course of three weeks in July. It was lovely having everybody come to call! But it was with Jonathan and Bela that we had the experience I want to write about today.

The Judean Desert is a desert (what else?) east of Jerusalem that descends all the way to the Dead Sea in the south. Although I generally like exploring new places on my own, we were strongly advised to engage the services of a guide with a jeep to visit this place, and that is what we ended up doing. (He was great too—recommended to us by Shelter Rockers Sandy and Richard Cohen—and I will be happy to pass his contact information along to anyone contemplating a similar tiyyul in the desert.) And so, on one warm day in July, we set off to find our guide (and his jeep) in a gas station near Ma’aleh Adumim. And there he was when we got there, waiting for us just where he said he’d be. We were off to a good start!

I’d love to tell you all about what we saw in the four hours we spent with him in the desert. But there was one moment that was so exceptional that I want instead to write to you this week only about that one specific experience. The desert is one of those places that looks like there’s nothing there until someone shows you how much you are missing while you’re too busy deciding that there’s nothing in front of you but a huge amount of empty space. (That’s why you need a well-trained guide, ideally one who speaks geological, botanical, and zoological English well.) And we had moments like that over and over as we moved along, occasionally encountering some sheep or some camels—both the property of mostly invisible Bedouin tribesmen—but more generally encountering nobody at all. When people talk about the emptiness of the desert, this sense of complete aloneness in an immeasurably vast expanse of uncharted wilderness is surely what they mean. (The implications of all that emptiness for the two-state solution are a different matter, one I’d like to write about some other time.)

And then, suddenly, our guide stopped the jeep and we all got out. I looked around. Nothing. Just blue sky all above—I’ve been to Montana, but this struck me as even bigger big-sky country—and endless wilderness in every direction. We were, however, standing on the edge of a very high cliff. Our guide waited for us to take it all in and then, after pausing for maximal effect, he announced our location. This, he said, was Azazel.

That stopped me in my tracks. Azazel? Really? When Israelis want to tell each other to go to hell, they use Hebrew words that literally mean “go to Azazel.”  But who thought there really was such a place? It was a bit like discovering, after a lifetime of hearing my father talk about his parents’ shtetl in Poland, that there actually was such a place, that it exists in real time and space, that you could actually go there if you wanted and spend a few days looking around.  (I wrote about the experience of discovering our shtetl’s web-presence in a letter to you all about six years ago; if you’d like to reread what I wrote then, click here.)  But this was even more amazing: on some level I suppose I understood that that shtetl must really have once existed and was merely shocked to learn that it was still there and that its team played in the Polish Football Federation, whereas this was Azazel, which I never actually thought existed at all.

Or did I? Azazel is mentioned in the Bible four times, all in the sixteenth chapter of Leviticus. The text there, mysterious in a dozen different ways, is about Yom Kippur and particularly about the great, complex ritual that was ordained to be carried out on that day in the Tabernacle in the desert and then, eventually, in the great Temple in Jerusalem. The ritual has so many parts that it’s hard for me, even after all these years, to keep it all straight without going back to consult the book again and again. But the part of the ritual that is the most stirring, but also the most challenging emotionally, surely has to do with the two goats. The High Priest is commanded to take two goats and to bring them right up to the entrance to the sanctuary. There, he is to cast lots in some unspecified way so that one of the goats ends up designated as the one “For God” and the other, as the one “For Azazel.” (Just as an aside, the English word “scapegoat” comes to us directly from a misunderstanding of that last expression. In 1530, when William Tyndale published the first translation of the Torah into English, he based himself on earlier translations into Latin and Greek and presumed that Azazel was a Hebrew word meant to denote the “goat that was to escape,” for which animal he coined his own neologism: “scapegoat.” I suppose “escapegoat” must have sounded clumsy to his ear, just as it does to mine.)

And then the ritual becomes even more challenging. The goat designated “for God” is sacrificed by the High Priest himself as a sin offering. But the other goat, the one left living, is not killed at all (or at least not yet) but is rather sent off into the desert to bear the sins of the people, apparently the ones that could not be undone with something as simple as a sin offering, to Azazel, a spot in the desert a long way off from Jerusalem. For me personally, it would have been more than enough just for the goat to be sent off into the desert to survive as best it could. And that is, more or less, what Scripture appears to wish as well: that the goat be taken to the place called Azazel and that from there “the goat be sent [even further] off into the wilderness.” In actual fact, the goat was pitched off the cliff and, as the Mishnah (in my opinion just a bit too enthusiastically) reports, never actually got all the way to the bottom while still alive. So that’s Azazel—a metaphor of some sort (for us, not so much for the goat) for the relinquishment of sin meant to inspire worshipers to feel cleansed of wrongdoing, thus able to face Judge God unburdened by the fear of punishment for sins perhaps inadvertently committed but now at least ritually undone.

And that’s where I was standing. Not in a book or in a dream-temple, but on an actual cliff in the actual Judean Desert, precisely as far from Jerusalem as tradition says (and logic dictates) the “slow-paced man” whom Scripture ordains to accompany the goat to Azazel might plausibly have traveled in the time allotted. I looked around. It was just us up there. I detected no traces of prior visitors, let alone of ancient slow-paced Temple employees. A bit timidly, I edged towards the cliff. It looked pretty far down to me! The guide, now gilding the lily just a bit—in my experience, this is a feature of Israeli tour guides in general—solemnly informed us that archeologists have combed the terrain at the bottom of the cliff and found…the bones of no animals at all other than goats. That seemed a bit too much to swallow—the last time anyone accompanied a goat to Azazel was in the first century CE, almost two thousand years ago. And the bones are still there? I don’t think so!

But the cliff really is there. Officially called Mount Azazel by the Israelis and Jabel Muntar (literally, “Mount of the Watchman”) by the local Arabs, it is, at 1,720 feet above sea level, clearly the highest peak in the desert. And there we were…wondering what it could possibly have been like in ancient times when the slow-paced man finally arrived at the peak and granted efficacy to the High Priest’s prayer by sending the goat off the cliff into the air. In my heart, I’d like to think that that was just the official version, that when he got there all alone and no one was looking he just sent the goat off to fend for itself as best it could. But, really, who knows…and for me, two thousand years after the last goat went sailing off the cliff or didn’t, the real issue at hand was the notion that sin can even be eradicated through prayer and rituals involving the transference of those sins to animals.

Of course, the ancients didn’t really think that, any more than we really think that throwing breadcrumbs into some stream somewhere somehow cleanses us of sin. To me, all of these ancient and modern rituals have one single truth at their core: not that wrongdoing can be magically erased absent the kind of true repentance on the part of the wrongdoer that could conceivably trigger the forgiveness of God, but that human beings—for all we find it almost hypnotically pleasant to imagine otherwise—can change, can let go of their baser quirks and disreputable ways merely by summoning up the resolve to grow into a newer and finer iteration of themselves. That just as the goat can leave the Temple behind and meet its fate on its own, so can we all leave behind the rituals that attend Yom Kippur in particular and meet our destiny on our own terms, alone in our own wilderness and unencumbered by the need endlessly to self-justify. To stand in judgment before God is the central idea around which Yom Kippur revolves…but to do that one needs neither immense learning nor any level of facility with ritual at all. To face Judge God, one needs to be possessed of the things that we all find the most complicated to acquire: uncompromised spiritual integrity, a deep sense of personal probity that actually makes it as impossible to lie to ourselves as it is to lie to an all-knowing God, and a will to grow personally into a finer version of oneself that is specifically not undertaken to garner the approval, or even the respect, of the world.


The goat leaves the Temple, finds its way to Mount Azazel…and brings the sins of Israel to the wilderness where they can dissipate and do no harm like the florets of a dandelion in the morning breeze. The challenge in considering that ritual from the vantage point of the ages is to find it in us to do the same thing…only, in our day, without the goat. Probably, that’s all for the best. (It’s certainly better for the goat!) But, I can promise you, a tiyyul to the Judean Desert and a few moments atop Mount Azazel is excellent for the soul…and a stimulating way, even in July, to begin to prepare for the Days of Awe now upon us. I recommend it highly!

Thursday, September 10, 2015

Rosh Hashanah 5776

As we all feel Rosh Hashanah barreling down on us (or is it we who are barreling down on it?), I find myself particularly drawn to two stories in the news and, even more than to their interesting detail, to the lesson embedded in both of them for all of us as we approach the Days of Awe. Both involve people—in these specific cases, two women—who are trying to remain faithful to their own principles in a world that seems to want only for them to abandon those principles, or at least temporarily to set them aside.

The first, and as of now by far the better known, is Kim Davis, the county clerk of Rowan County, Kentucky, who spent five days in jail last week because she refused to issue marriage licenses to two gay couples who applied for them. (To give her her due, she also denied licenses to several heterosexual couples, thus, I suppose, hoping to avoid charges of discrimination by serving no one at all…equally.) All this led to a court order instructing her to issue licenses to all couples who had been denied service, but she refused to obey and instead instructed her own attorneys to file an emergency application with the Supreme Court that, had it been granted, would have stayed the lower court’s ruling until she could pursue an appeal. Unimpressed, the Supreme Court declined to act, but Clerk Davis continued to refuse to issue the licenses, now claiming herself not to be under the jurisdiction of the Supreme Court after all, but to be acting instead under “God’s authority.” On September 3, she was found to be in contempt of court and was sent to jail. And there she remained for five days until Judge David Bunning of the United States District Court for the Eastern District of Kentucky ordered her released on the condition that she not impede the issuance of same-sex marriage licenses by others in her office.

Whether she will comply with this eminently reasonable compromise remains to be seen. (Since she can comply by not doing anything at all other than allowing other county employees to do their job, it seems unlikely she can claim that she is being forced by the government or by the courts to betray her conscience or to go against her religious principles.) The Attorney General of Kentucky, Jack Conway, now has the unenviable task of deciding whether or not to pursue a charge of official misconduct in the matter.

The other woman in the news is Charee Stanley, a flight attendant who works for ExpressJet, an Atlanta-based regional airline. And she too is in the news because she wishes not to be obliged by her employers to behave contrary to her religious principles. A recent convert to Islam, Stanley discovered only after her conversion that her new faith prohibits her not only from drinking alcohol but also from serving it to others. And, with that, she announced that she would no longer serve alcoholic drinks to ExpressJet passengers. For a while, the obvious compromise—that she serve the non-alcoholic drinks and that a different attendant serve the alcoholic ones—worked well. But then there were complaints from her co-workers that they were, in effect, being obliged to do her work, whereupon she was placed on administrative leave. To that development, Flight Attendant Stanley responded by lodging a charge of discrimination with the Equal Employment Opportunity Commission, the federal agency charged with enforcing laws prohibiting workplace discrimination, which will now decide whether there is probable cause to believe that discrimination occurred.

The cases are not exactly parallel, but both concern employees who wish not to perform one of the tasks associated with their jobs because they are opposed on principle to doing so. Leaving aside the obvious difficulty in applying the requirement of Title VII of the Civil Rights Act of 1964 that employers reasonably accommodate their employees’ religious beliefs “unless it would be an undue hardship on the employer's operation of its business,” the situation feels almost intractable: surely employers can reasonably expect their employees to do their jobs, but it feels just as wrong to imagine that employees in a free nation should be obliged to betray their own moral or religious principles or lose their jobs. Nor does it help particularly that the word “undue” in that last expression is also open to widely varying interpretation.

These cases will eventually be resolved, but I write today not specifically to declare myself one way or the other in their regard—although both seem easily resolvable if the parties involved are open to reasonable compromise—but to suggest that they could form an interesting backdrop to the work of the High Holiday season almost upon us.

Most of us find comfort in the fact that we aren’t murderers or thieves who need to fear God in the way actual criminals naturally fear being tried in courts of law. We find that line of thinking calming, particularly when we arrive at Unetaneh Tokef and the cantor sings of all humankind passing before God like sheep beneath a shepherd’s staff as our divine Judge evaluates us one by one, passing judgment, and then either inscribing us or not inscribing us for good in the great Book of Life. But the great confession that we finally recite on Yom Kippur—the Al Chet litany of sin and wrongdoing—doesn’t actually mention murder or theft…or any of the kind of crimes for which one could actually end up arrested by the police and tried in a court of law. All of the sins listed—forty-four in all—are ethical missteps that lead, not to bloodshed or violence, but to a betrayal of our own principles. Nor are the principles listed ones foisted upon us from without; they are rather the ones we ourselves endlessly insist we hold dear, that we truly cherish. It is those missteps, those brief, almost unnoticeable instances of stepping away from a self-proclaimed value or of betraying a moral principle, that are on the list…because each of them, in its own way, leads us away from the principled, virtuous life we all insist we want as our own.

I suppose that even Kim Davis knew that she’d eventually have to find some way to live with the law, that she was going to have to compromise. I imagine that Charee Stanley knows that too, that she will have to find a way to maintain her principles and do her job. I imagine all my readers have strong feelings about both cases. But it’s so easy to know how other people should behave and so difficult to know personally when to hold ‘em and when to fold ‘em, when to insist on behaving according to pre-accepted moral principles and when reasonably and rationally to step away from them long enough to accommodate…a friend, a co-worker, an employer, a parent, a child, whomever.

I face this almost every day of my professional life, this unwanted, unexpected, highly unpleasant obligation suddenly to decide whether to draw the line and stand firm or whether to step back, whether to be less strident for the sake of a greater good, to be less strict—with myself or with others—so as to achieve something that will not be achievable at all without a bit of moral flexibility on my part. Does that expression, “moral flexibility,” even mean anything? Or is it just a flowery way to justify behavior I know in my heart is incorrect yet which some given situation seems nevertheless to require? Even the notion of the greater good is a slippery fish to try to hold in your bare hands for too long: the phrase usually implies some larger and desirable goal that will be achieved by abandoning some self-imposed rule or principle that at the moment is serving merely to impede this much more important good thing. But can virtue come from the abandonment of virtue? Can it ever be right to be wrong? Are we being reasonable and good-hearted by compromising even on values we have maintained for a lifetime for the sake of achieving some momentarily desirable end? Or are we merely making ourselves feel less bad about behaving poorly by telling ourselves that there are no absolutes in ethics, no moral stone so fixed in place that it can never appropriately be moved to the side just for a moment?


These are difficult, stress-inducing questions for all of us to ponder as the holidays approach. Like everybody, I like to think of myself as a fine person possessed of the finest moral qualities. But when Elul is upon us and I begin the slow effort painfully to review my year and honestly, even if just within the chambers of my own heart, to ask myself if I lived up to the standards I so regularly and so proudly trumpet as my own…the ease with which I use phrases like “the greater good” and “moral flexibility” makes me feel unsettled, even slightly queasy. I don’t put these questions down on paper because I wish to answer them in public with respect to myself, but merely to demonstrate that they can be asked. And they can be answered too…but only by people willing to set down the attractive portraits of themselves behind which they generally hide and step forthrightly and honestly into the light. It isn’t easy. It certainly isn’t pleasant. But it’s the key to a meaningful chag and I wish us all the fortitude to make ourselves truly ready for the season almost upon us in just that way. And, of course, I also wish you all a shanah tovah u-m’tukah, a happy and healthy new year for us all and our families, and a year of shalom al yisrael…a year of peace for the House of Israel in all the lands of our dispersion and in Israel.

Thursday, September 3, 2015

Watching the Watchman - An Elul Meditation

Am I the only American my age who didn’t read To Kill a Mockingbird in high school? I certainly could have read it—the book had already been in print for seven years when I began tenth grade and hasn’t ever really stopped selling: to this day, it has sold an almost unbelievable thirty million copies. But I somehow didn’t read it then and, as the years passed, I continued not to have read it…until just a few years ago when Joan patiently explained to me that admitting to not having read To Kill a Mockingbird was not much worse than admitting to never having read Heidi or The Lord of the Rings. Remembering the “three strikes you’re out” rule all too well from my single season in Little League as an eleven-year-old, I knew I had to act quickly. So Mockingbird it was. (It really wasn’t much of a choice.) I downloaded a copy, set to reading…and was completely enthralled. After being disappointed so many different times by books that I was told I simply had to read, here for once was something that I actually did have to read: a story that was inspiring, riveting, clever…and extremely well written. The characters were nuanced, balanced, and believable. The story—unbelievable in a certain sense, but unfolded so artfully that in the context of the book it hardly feels that way at all—was uplifting in the best sense of the word. (And I write as someone who generally finds fiction widely touted as “uplifting” mawkish to the point of being off-putting and anything but inspiring.) I loved the book. I’m sure all of my readers who read the book when they were teenagers will agree that it more than deserved the Pulitzer it won in 1961, the year after its initial publication, as will all who have read it since.


The part of the book I loved the most, of course, was the depiction of Atticus Finch. Everybody thinks of him as looking like Gregory Peck, the handsome actor who played him in the 1962 movie and won an Oscar for his efforts. I suppose even I think he looked the part, but it is hardly Atticus’ dashing good looks that make him the hero of the book: it is his simple dedication to the cause of justice that leads him, a white lawyer in Alabama in 1936, to defend a black man accused of rape not because he feels sorry for the man or because he is personally eager symbolically to strike a blow against the endemic racism of his time and place, but simply because he believes the man to be innocent of the charges brought against him and wishes personally to ensure that he is defended in court vigorously and competently. He clearly knows that he is going to lose, that all black men accused of violence against white women lose when tried in court in his time and place. He knows that, but somehow feels unable to desist from mounting a compelling, more-than-competent defense. When he loses and his client is convicted, he begins calmly to plan Tom’s appeal. When Tom, in despair and knowing all too well that the appeal too will fail, takes his own life (or rather provokes his own murder in a sequence as horrifying as it is riveting), Atticus takes himself into the black part of town to bring the grim tidings personally to Tom’s widow, Helen.

Far from being a plaster saint (as the New Yorker critic Thomas Mallon called him in print a few years ago), Atticus seemed entirely real to me. I spent eleven years in and out of Waycross, Georgia, where I had a student pulpit for seven of my eight years at JTS, and for all the years after that until I took my first full-time pulpit in 1986. I was hardly an expert, but I did spend a lot of time in a very southern part of the South…and, although Mockingbird was set in the 1930s, decades before I was born, the feel of the book rang true to me, as did the ambiguous, complicated relationship between black people and white people that I myself witnessed and personally experienced during those formative years as I figured out the ins and outs of serving a tiny Jewish community as its only, albeit very part-time, rabbi.

Until this year, To Kill a Mockingbird was Harper Lee’s only book. She was (and is) widely acknowledged as one of twentieth-century America’s finest authors, mentionable in the same breath as William Faulkner (nineteen novels, 125 short stories, twenty screenplays, and six collections of poetry), John Steinbeck (sixteen novels, five collections of short stories, and six non-fiction books), and Ernest Hemingway (ten novels, ten short story collections, and five works of non-fiction).  And now, in 2015, at age eighty-nine, Harper Lee has suddenly become the author of a second published novel, Go Set a Watchman.

The whole story of how the book came into print is complicated and not specifically what I want to write about here. But the short version is that Watchman, although set twenty years after Mockingbird, was actually Lee’s first book. Widely described as a first draft of To Kill a Mockingbird, the book is not in any sense a draft of Lee’s famous book: it merely concerns the same characters, or some of them, twenty years after the story that made them famous. So it’s hard to know what to call it—it is a sequel (in the sense that it tells the story of what happened later on to the people depicted in the first book) but also a kind of prequel (in that it was written first, presumably before even the author herself knew what would eventually become the backstory to the book that made her famous). The reviews, or at least the ones I personally read, were not particularly kind, taking a kind of critics’ perverse pleasure in noting the book’s flaws and in observing that it would never have become a bestseller if the other book, the later one, had not paved the way with its own stupendous success. (And stupendous is hardly an exaggerated term to use in this context: To Kill a Mockingbird still sells about a million copies a year.) But what the critics pounced on most mercilessly was the depiction of the older Atticus Finch, now revealed (or, I should say, prevealed) to be neither a lawyer-saint nor a true hero, but a man of his time and place, a racist whose view of black people was negative in the extreme, paternalistic, and base.

I read Go Set a Watchman this summer in Jerusalem. I can see why the publisher to whom Lee sent the manuscript sent her back to her study to reset her story in an earlier day and to turn a depressing tale about racism in Alabama in the 1950s into an uplifting (that word again!), deeply satisfying tale of moral courage in the face of almost universal adversity. You could practically hear the crowing behind the prose in some of the reviews I came across: you see, he wasn’t such a great man…in fact, he wasn’t great at all. He was a man of his time and place, a man whose defense of poor Tom Robinson was the aberration, not the constant, the deviation from the norm rather than the norm itself. And, indeed, life in Maycomb, the town in which the story is set, is depicted in a particularly unappealing way throughout the book. Even the black people, who in Mockingbird are mostly shown to be as noble as they are oppressed, are in Watchman mostly depicted unappealingly…including the saintly Calpurnia, the Finches’ maid, who has turned into a bitter, angry older woman whose deep affection for the narrator when she was a child seems not to have outlasted her long years of employment in the Finch household.

As we make our way through the month of Elul, the month that precedes the High Holiday season and during which Jewish people are bidden to devote time to introspection, to self-analysis, and to the stress-inducing work of looking in the mirror without flinching or turning away even from the least appealing part of what they see therein reflected, I’d like to suggest a different way to read both books.

The great debate the publication of Watchman has ignited is regularly framed as a question of which Atticus, the almost-fifty-year-old of Mockingbird or the almost-seventy-year-old of Watchman, is the “real” Atticus, the depiction of the man as he truly was and not just how, for a long moment, he somehow appeared to be. Nor does the debate itself seem too serious: almost universally, the assumption seems to be that the older man, the one possessed of racist sentiments and a deeply prejudicial worldview, is the “real” man, his earlier iteration a kind of aberrant blip that made him briefly seem other than he truly was. But, of course, Atticus isn’t a real man at all—this is a work of fiction, after all—just a literary character depicted at two different moments in a life that never happened other than in the pages of two novels. There are, therefore, no other incidents in his life: just these two moments artfully and intriguingly set forth for us to compare and to weigh one against the other. He isn’t really either man; he is both and neither: two sides of a coin that exists only within the literary constructs of the author’s imagination.

Taken seriously as a literary character, then, Atticus is two different things: a fine, decent man who rises to greatness when a door somehow opens through which only someone possessed of the finest moral virtue would have the courage to step, and a man of his time and place who, like most of us, exists as one person among many and takes for granted what everybody in Maycomb believes to reflect, not a meanspirited and racist worldview, but simply how things are. In other words, by taking the books as snapshots of a man at two different moments in his otherwise non-existent life, a portrait is drawn that should be very familiar to all of us.

Just as in art, perspective is attained in fiction only when an author has the skill to draw two portraits in convincing relationship to each other. And so we are left with a kind of literary portrait that, because Harper Lee is a very talented writer, suggests the kind of perspective that can be simultaneously unsettling and stirring. And now that I can view Atticus with some perspective, I feel closer to the man than ever before. He is me, in a real sense…not because I harbor secret racist sentiments, but because I too am a child of the world in which I live, of the society in which I labor and function. Mostly, that’s what everybody is. But occasionally, through some unexpected juxtaposition of circumstances and impetuses, through some combination of unseen forces…we can (and occasionally do) rise to greatness. Atticus’ defense of Tom Robinson was not an aberration in the sense that the author told us a lie about who he was and what he could have done. (Authors of fiction can’t lie, of course—whatever they say happened is exactly what did happen; that’s the whole premise of fiction.) But it was an aberration in the best sense of the word, an example of a regular person rising unexpectedly to greatness and doing, even if just that one time, something remarkable, something noble and good, something worthy of the great praise and respect his depiction in Mockingbird correctly earned him in the minds of millions.

As we pass through Elul, we should take that to heart. We are all children of our time and place. We all believe what everybody believes, just as we all take for granted what the world around us tells us is true. We are thus all enslaved to the givens of the universe in which we thrive…but we are also all capable of greatness, of stepping away—even if just for a moment—from the norm, from the expected, from the predictable. We are all capable of shucking off the shackles that mostly hold us successfully in place and setting them aside as we rise, unbidden, to greatness. That, I think, is how to read Atticus in light of the new perspective offered by Harper Lee’s new novel: as a call to readers to notice that, for all we live in prisons fashioned of the ideas society imposes on us, there actually is no lock on the jailhouse door, that we all really do possess the power to step into the light…and to behave nobly and decently, even greatly, even if most of us turn back into mice when midnight strikes. That is the message of both books read at once, and it is a fine and very welcome set of ideas to take along with us as we prepare to enter the season of judgment and repentance that begins in just a few short weeks. I recommend both books highly and suggest that they would make excellent Elul reading for people eager to gain perspective on their own lives as the season of judgment draws near.