Thursday, December 20, 2018

At Year's End


I’ll be away next week, so this will be my last letter of 2018 to all of you, my faithful readers for all these many years. (For the record, this is my 442nd letter since I began writing them in 2007. So those of you who have been reading along for all these years probably know me better than I sometimes think I know myself.) As regular readers know, I often like to riff off of recently published essays elsewhere. And so, for this final letter of a soon-to-be-gone secular year, I’d like to respond to an essay that appeared in the Sunday Review section of last Sunday’s New York Times. The piece, by a Laura Turner (whom the Times identifies only vaguely as “a writer”), sets forth the triple thesis that, to quote the author, you can’t go to church on your phone, internet church is not real church, and you cannot be part of an actual community by putting yourself solely in the company of virtual people.
It’s not that hard to see how we’ve come to a point at which such an essay sounds like a measured, thoughtful response to a phenomenon of modern life and not like the premise of a wacky science fiction novel.

I myself am a good example. I hardly ever go to the post office, because I conduct more or less all my correspondence by email. I rarely go to the bank because I pay all my bills electronically and deposit any actual paper checks I receive through my phone. (I can, however, also remember when that last part really would have sounded like science fiction.) Even though there are any number of large department stores within a fifteen-minute drive of my home, I make most of my purchases online. All of these would once have been opportunities for interaction with other human beings, with neighbors and strangers, with people I might over the years have come to think of as friends…or at least as familiar faces. There was some degree of pleasure in those encounters, even in the fleeting ones, but each has now been replaced by a streamlined process that permits me to accomplish wholly on my own what would once have required at least some actual contact with other people. Efficiency is a good thing, obviously. But there is a point of diminishing returns as well: it’s a lonely world we’ve constructed for ourselves, this ultra-efficient digital age in which it feels necessary to justify taking the time to do even something as inconsequential as buying a pair of socks in person when you can accomplish the exact same thing—and probably for less money—without even standing up from your computer, let alone getting in your car and driving somewhere.
So Laura Turner’s essay is addressing what really is just the next step in an already ongoing progression of societal innovations that purport to improve life by making some specific task doable without lifting your hands from the keyboard: the invention of the virtual congregation able to provide the succor and support of affiliation without imposing the tedious necessity on the would-be congregant of actually leaving home and sallying forth into the world, much less having to encounter actual people on a spiritual journey that, at least in a certain rarified sense, must be taken alone anyway.

But there’s alone all by yourself and there’s alone in the company of likeminded others. And Jewish legal tradition—which certainly acknowledges the intensely personal nature of prayer—is also very clear about the importance of communal prayer being undertaken within the bosom of an actual community. Indeed, our traditional texts go so far as to discuss—and entirely seriously—whether a minyan of ten can be duly constituted of nine people well inside a room and a tenth merely standing in the doorway, or of nine people in a room and a tenth standing outside the building but looking in through an open window. As usual in cases like this, the exceptions prove the rule: our classical texts permit the three needed to recite the formal introduction to the Grace after Meals merely to be within eyeshot of each other and specifically not to have to be in the same room, but that is because, in the end, there is not imagined to be any enhancement for the individual in the spiritual experience of being part of the requisite three (called a zimmun). When we move up from that basic introductory liturgy to the slightly expanded version recited only in the presence of ten, however, mere visual contact is specifically deemed not enough and the ten have actually to be in the same room, in the same space.
There is some rational wiggle-room provided in the sources. Once the quorum is achieved, then even individuals who may not be counted in the quorum because they are not physically present may be permitted to respond when the liturgy requires a response. (For a detailed exposition of the whole issue—and the full range of halakhic options and possibilities—written by my colleague and my friend of forty-plus years, Rabbi Avram Israel Reisner of Baltimore, click here.) And there is a whole interesting Pesach-related dimension to the discussion because the Torah’s original framework for requiring that worship take place in a single “place” has to do, it turns out, not with prayer at all, but with the ingestion of the paschal lamb on the eve of Passover.

That Pesach angle is more pertinent than it might at first seem in that the whole concept seems to be rooted in the interesting notion that, because the festival exists specifically to commemorate the Israelites’ passage from slavery to freedom, the formal ingestion of the paschal sacrifice can only take place in the context of community because true freedom can only be attained in the context of community, of people occupying the same space, breathing the same air, making room for each other in their personal ambits and responding not to each other’s voice or image but to each other’s actual presence. If a tree falls in a forest where there is no one to hear it crash to the earth, there is no real sound produced. Whether that’s really true, who knows? (My physics teacher in eleventh grade said it was and she seemed pretty sure she was right.) But what’s indubitably true is the corollary idea relating to community: if we grow freely to the fullest flower of our personhood without the presence of others to note our growth, to evaluate it, and to respond to it, there is no real growth at all…only the self-generated illusion of spiritual, emotional, or intellectual progress.
But that notion—that true personal freedom does not derive from the unfettered ability to act as one pleases but rather from the cohesive sense of belonging that comes from participation in the group, in a minyan, in a community of supportive friends, in society itself—that notion feels slightly out of step with our American ideal of the self-reliant individual who stands on his or her own two feet, who specifically does not need input from others to know where he or she stands, and whose rugged individualism is the human face of the Constitution’s ideal of the natural right of people to exist according to their own lights and without reference to the opinions or prejudices of others. Ralph Waldo Emerson, the greatest of all American essayists and nineteenth-century thinkers, wrote about this stirringly and very convincingly in his essay “Self-Reliance,” which I have read many, many times over the years. And so I, filled with the greatest respect for Emerson and his philosophy of life, could easily find myself on the horns of a serious dilemma, part of me standing with Emerson and seeing community as, at best, the platform upon which individuals stand as they grow towards freedom, and the other part of me understanding community not as the platform on which free individuals stand but as the context that grants them their freedom. But that dilemma does not really exist.

For millennia, the opposite of freedom was slavery. That was Emerson’s world, and his essay reflects that fact. (“Self-Reliance” was written in 1841.) But that was then, and today the opposite of freedom for most moderns is loneliness, not chattel slavery. And that is why Laura Turner’s op-ed piece was entirely correct: because no one, not even the super-digitized hipster who wouldn’t dream of setting foot in a bank or a post office, becomes less lonely other than by being in the presence of other flesh-and-blood people. Looking at images on a screen cannot provide the succor of presence any more than looking at pictures of food can provide nourishment. And there is no greater barrier to spiritual progress than the sense of aloneness that leads to alienation from society, disaffection with one’s neighbors, and creeping estrangement from God. I know enough of Emerson to risk saying that I think he would agree wholeheartedly if he knew enough of our world to form an opinion.
As 2018 draws to a close, I think of the way my community comes together when someone suffers a loss and how remarkable the shiva experience is precisely because of the level of intra-communal support we are able to provide because so many are so eager to help. I think of my own brief hospitalization a few weeks ago and how many people—more than I could keep track of—texted me and sent me emails and left voice mails, each trying to provide from afar some of that sense of communal support that each understood to be requisite to the healing process. And I think of what it means to so many of us to be growing older in the context of a rich, caring community of friends ready to reach out and shore us up when we start to list to one side or the other, or when we suddenly feel ourselves to be on shaky ground, or when we momentarily lose our self-confidence and fall prey to the fears we mostly manage to keep at bay. It’s a blessing, being part of any community…but my special blessing is to be part of ours. The Marlboro Man may exemplify the individualist who can ride off on his own into the next chapter of his life because he serves as his own universe of discourse, as his own arbiter of taste and style, as his own judge and his own jury. But for me, life is with people…and it is involvement with others that sets me free, that makes me free…and that enables me to live free in a world of people trying to go to church on their phones and uncertain why it never feels quite right.

Thursday, December 13, 2018

The DNA of Shared Experience


The modern science of genetics has made it clear how parents can pass aspects of their own specific heritage along to their offspring. And, indeed, the reason no one finds it at all startling to assert that specific physical traits can be transmitted from generation to generation is precisely because we can see easily enough how a child’s hair color or eye color often matches that of one or both of his or her parents. From there, it’s not that much of a leap to considering non-physical attributes—say, a predisposition to excel at athletics or at music—in that same vein. And, indeed, we also all know instances of children appearing naturally to be good at some skill at which one of their parents excels (or at which both do). But can that notion be extended to include specific experiences parents may have had as well? At least at first blush that sounds like a stretch: the notion that something can happen to me and that that experience can somehow end up encoded in my DNA if it only crosses some theoretic line of genetic responsivity—that feels hard to imagine. But learning about the science of epigenetics has altered my thinking in that regard, and altered it powerfully. What I’ve learned is what I want to write about this week—and also about its implications for my understanding of the nature of Jewishness itself.
I was prompted to start taking the possibility of the transmissibility of experience seriously by a study published in the Proceedings of the National Academy of Sciences just this fall. Written by Dora L. Costa, a professor of economics at UCLA, and by Noelle Yetter and Heather DeSommer of the National Bureau of Economic Research in Cambridge, Mass., the study focused on data from Civil War days and concluded that the sons of Union Army soldiers who suffered severe trauma in the course of their time as prisoners of war in Confederate prison camps were significantly more likely to die without reaching old age than the sons of Union soldiers who were not captured or incarcerated by Confederate forces. Since all the sons in the study were born after the end of the war, the study suggests that they must have—or at least could have—somehow inherited their fathers’ traumata and suffered from their aftereffects. There were even subtle gradations of experience to consider: the sons of men who were imprisoned in 1863 and 1864, when conditions for prisoners of the Confederacy were especially brutal and inhumane, seem to have been even more likely to die early than the sons of Union soldiers taken captive earlier on in the war. (To see the original study, click here. For an excellent analysis of the study published in The Atlantic last October that most readers will find far more accessible, click here.)

The phenomenon has been demonstrated to exist in the animal kingdom as well. A few years ago, for example, scientists were able to demonstrate that when mice that had been trained to associate the smell of cherry blossoms with the pain of electric shocks were bred to produce offspring, both the next generation and the generation after that responded anxiously and fearfully to the smell of cherry blossoms in a way wholly unlike that of mice whose parents or grandparents hadn’t been trained to associate that scent with that level of pain. The study, written by Emory University School of Medicine professors Brian G. Dias and Kerry J. Ressler and published in the journal Nature Neuroscience, concluded that the mice had somehow inherited a response built into their parents’ history of experience. (For a very interesting account of this experiment published in the Washington Post a few years ago, click here.)
And then there came the study of the Dutch famine victims. In the winter of 1944-1945, to punish the Dutch for having attempted to assist the Allied advance into Europe by shutting down the railway links that were being used by the Germans to bring troops to the front line, the Nazis blocked food supplies from coming into Holland so severely that more than 20,000 people died of starvation by the time the war ended the following spring. (For more information, the best source is Henri A. van der Zee’s The Hunger Winter: Occupied Holland 1944-1945, published in 1998 by the University of Nebraska Press.) This would just be one more horrific story of German savagery during the war, but it led to some interesting scientific studies, one of which was published jointly by seven scholars led by Professor L.H. Lumey of Columbia University in the International Journal of Epidemiology in 2007 and which appeared to demonstrate convincingly that the wartime experience of famine affected not only the poor souls who had to live through that dreadful winter in the Netherlands, but also the children born to them after the fact: as a group, the children of people who lived through the famine and survived to become parents later on experienced higher rates of obesity, diabetes, and schizophrenia than their fellow citizens. They were also noticeably heavier than Dutch people born to parents who did not live through the Hunger Winter. And they died younger than other Dutch people did on average—the study found that people born to Hunger Winter parents were still experiencing a full 10% higher rate of mortality even sixty-eight years after the famine ended. How exactly this all works, or might work, is beyond me. (Click here for the study itself and here for a far more easily understandable summary of its results published in the New York Times last winter.) But what seems easy to grasp is the basic principle: the trauma suffered by people living in occupied Holland while the Germans were actively trying to starve the civilian population into submission was so severe that children born to those people ended up with the experience somehow encoded in their own DNA even though they themselves did not experience the famine at all.

This accords well with an essay by Olga Khazan published last summer in The Atlantic in which the author demonstrates very convincingly that victims of intense racial discrimination seem as a class to experience a process called methylation on the specific genes that are connected with bipolar disorder, asthma, and schizophrenia, and that this specific genetic change too can be passed along to subsequent generations. What methylation is exactly is also largely beyond me, though the gist, as far as I can tell, is that small chemical tags attach themselves to the DNA and can switch genes on or off without altering the underlying sequence. The simplest explanation I could find was on the www.news-medical.net website (click here), but it was still far too sophisticated for a mere liberal arts major like myself to fathom. The basic principle, though, is clear enough: the experience of intense discrimination can apparently imprint itself on your DNA in a way that makes it possible for children born to you even long after the fact to have to deal with traumatic experiences they didn’t personally experience because those experiences have somehow ended up encoded in their DNA.
And that brings me to the Jewish angle in all of this. Just two years ago, Dr. Rachel Yehuda of Mount Sinai Hospital here in New York discovered evidence of this methylation process affecting the gene associated with stress not only in the DNA of Shoah survivors, but in the DNA of their descendants as well. The study, published in Biological Psychiatry in 2016 and not meant for any but specialist readers, was not universally praised—mostly because only survivors and their children were included in the study, not the survivors’ grandchildren or great-grandchildren. (For an example of a hostile response published in the U.K. in 2015, click here.) But those critics are basing themselves, as scientists surely should, on the specific way Dr. Yehuda used the empirical data that was available to her. I, on the other hand, not being encumbered by an actual background in science, find her work wholly convincing and more than easy to believe. Indeed, I have spent my whole life wondering whence Jews like myself—who can reasonably say that not a single day has passed since adolescence in the course of which some thought or image related to the Shoah has not surfaced, invited or uninvited, in consciousness—derive this obsessive involvement in the Holocaust. Both my parents were born in this country. Therefore, neither was personally a survivor. But both were adults when the war ended and the details regarding the camps and the mass executions became known; the trauma the Jewish community experienced over the months that it took for the true story to become known—and the compounded trauma of slowly coming to terms with the degree to which the Allied forces chose consciously not to interfere with the daily transfer of Jews to the death camps—that trauma, it now seems to me, is at the core of my own worldview, of my own sense of who I am and what the world is about.

Nor is this just about the Shoah in my mind. Ancestry.com recently updated my DNA profile and declared me genetically to be 100% Ashkenazic Jewish. (I had previously been hovering between 96% and 97%.) Am I also carrying around the epigenetic markers associated with the First Crusade and the butchery and devastation the Crusaders brought to the defenseless Jews of the Rhineland? I suppose that will only seem an obscure question to readers who don’t know me personally.

Thursday, December 6, 2018

Chanukah 2018

For most North Americans, Chanukah is a sort of “us vs. them” affair: the foe wanted to obliterate us (or, depending on who’s telling the story, our faith or our culture or our way of worship) but the Jews of that time were unexpectedly, even perhaps miraculously, able to resist the enemy’s dastardly plans and to chase the minions of the evil king back to wherever it was they came from before they could bring their despicable plan to fruition. Doesn’t that sound about right?

Like all (or at least most) oversimplifications, this one is not entirely incorrect. There really was a King Antiochus on the throne of the Seleucid Empire—the Greek-speaking kingdom with its capital at Antioch in today’s Syria that ruled over the Land of Israel in the second century BCE—and he did promote the eradication of traditional Jewish norms of worship even in as sacred a space as the Jerusalem Temple in order to make them more universal and less ethnically distinct. There was every reason to expect the ragtag group of guerrilla warriors who gathered around the Maccabees—who seem to have come out of nowhere to do battle with Antiochus’s legions—to go down to defeat, yet they were successful and managed against all odds to expel the king’s armies from what was in those days, after all, a province of his own empire and—even more unimaginably—to wrest some version of autonomy from the central government and thus to install a kind of self-rule that lasted for almost a century. And if the darker part of the story—the one we generally ignore, featuring large numbers of Jewish people more than eager to make Jewish ways less particularistic and more in step with the great cultural tide of the day (called Hellenism, literally “Greekishism,” because of its origins in the culture of classical Greece) and very happy to have the king’s support in their effort to reform the Jerusalem cult and make it more appealing to themselves and to outsiders looking in—if that part is generally ignored, that’s probably all for the best. Who wants an ambiguous yontif anyway? Much better to stick with the Hebrew School version and not to stir the pot unnecessarily! We don’t have enough to deal with as it is?
This week, therefore, I would like not to talk about the well-known part of the Chanukah story and its key players at all. (Shelter Rockers will hear me speak about that part of things in shul on Shabbat anyway.) Instead, I’d like to start the story in medias res and begin to say why Chanukah really does still matter by introducing a personality that almost no readers will ever have heard of, one Judah Aristobulus.

And here he is, at least as Guillaume Rouillé, the inventor of the paperback, imagined him in sixteenth-century Lyons. But who was he really? And why do I want to start my peculiar, start-in-the-middle version of the Chanukah story with him of all people?
Everybody has heard of Judah the Maccabee and most know that he had several brothers as well as a famous father. But what exactly happened to them all—that is the part no one knows. And the more’s the pity, because the most profound part of the story is precisely its least well-known part.

Jerusalem was taken in the year 164 BCE, but the fighting continued for years and, indeed, Judah himself died in battle in 160 BCE and was replaced as commander-in-chief of the Jewish army by his brother Jonathan, who in time came also to serve as High Priest. Jonathan was as much a politician as a general or a priest, however…and he made a fair number of enemies by attempting to transform an autonomous Judah within the larger Seleucid empire into a truly independent state by signing treaties with any number of foreign countries. He lasted for almost two decades, but was finally assassinated by someone who apparently found his politics intolerable and was succeeded by his brother Simon, the last of the original Maccabee brothers. The internal politics of the day are interesting enough, but what fascinates me in particular is the way that the Maccabees, who started out only wishing to prevent the Seleucid emperor from disrupting traditional Jewish life, became more and more intoxicated with the power they saw themselves able to seize. Judah was a kind of a general. Jonathan was a general and High Priest. And Simon convened a national synod that formally recognized him as Commander-in-Chief, High Priest, and National Leader. Most important of all, he negotiated a treaty with the Roman Senate that cut the Seleucids out of the action entirely and acknowledged solely the Maccabees as the legitimate rulers of their land.
The story only gets bloodier. Simon was murdered in 134 BCE by his son-in-law, a fellow named Ptolemy, and thus became the first Maccabee to be succeeded not by a brother but by his own son, a man known to history as John Hyrcanus. In his day, the war with the Seleucids flared up again. The details are very confusing, but the basic story is simply that the Seleucids took back all of Israel except for Jerusalem itself, then abandoned it all when Antiochus VII died in 129 BCE. Indeed, as the Seleucid empire slowly fell apart, John Hyrcanus embarked on a military campaign to seize what he could of the adjacent world. And he was successful too, conquering a dizzying number of neighboring states; in the course of at least one campaign, the one against the Idumeans (the latter-day Edomites), he forced an entire nation to convert to Judaism. Most important of all, he cemented the nation’s relationship with Rome, agreeing to work only in the best interests of the Roman Republic in exchange for its agreement to recognize Judah as a fully independent state. He established relations with Egypt and Athens too, thus making Judah into a real player on the international scene. And then he died in 104 BCE, one of the very few Maccabees to die of natural causes.

His eldest son was Judah Aristobulus. The original plan was for Judah to become High Priest and for his mother to become the political leader of the nation. Judah Aristobulus (also sometimes called Aristobulus I) found that irritating, however, so he imprisoned his mother and allowed her to starve to death in jail. Then, for good measure, he also imprisoned all his own siblings but one. (He had that one killed eventually too.) And it was this Judah Aristobulus who, not content with just being High Priest, commander-in-chief, and political leader, also named himself king.
It didn’t last. He himself didn’t last—he was sickly to start with and then, after one single year on the throne of Israel, he too died and was replaced by one of his younger brothers, known to the Jews as King Yannai and to the rest of the world as Alexander Jannaeus.

It’s easy to get confused by the details. I’ve read the part of Josephus’s Antiquities of the Jews that covers the Maccabean years—the only sustained, detailed narrative covering the entire period—a dozen times. It couldn’t be easier to get lost in the forest amidst so many different trees—and the fact that there are so many different people with the same names only makes it more confusing. But when you step back and look at the larger picture, you see something remarkable…and deeply relevant to our modern world.
The Maccabees—known to history more regularly as the Hasmoneans—started out with the highest and finest of motivations. They had an emperor ruling over them who held their national culture in disdain, so the Maccabees rose up and somehow won a measure of autonomy for their people that most definitely included the right to run their own cult and to pursue their own spiritual agenda. But the power they won on the battlefield corrupted them from within, leading them not only to stop acting in the nation’s best interests but also to cross a truly sacred line when Judah Aristobulus finally broke with the very religious tradition his family had come to prominence to protect by declaring himself king.

He wasn’t from the tribe of Judah. (The Maccabees were priests, so of the tribe of Levi.) He wasn’t descended from David. He had no legitimate or even illegitimate claim to the throne. But he took it anyway…and that act of self-aggrandizing sacrilege set the stage within just a few short decades for a massively bloody civil war undertaken by two of his nephews who were vying for the crown, which disaster opened the door to the Romans, who saw in it an opportunity to occupy Judah and make it part of their empire, which they did in 63 BCE. The next time Jews managed to declare an independent Jewish state in the Land of Israel was in 1948 CE, a cool 2,010 years later.
It is never a good thing when a nation’s leaders see in public service not a way to contribute to the welfare of the nation but an avenue for self-aggrandizement, self-enrichment, and self-promotion. The Maccabean descendants became wealthy and powerful. They hobnobbed with the delegates from the world’s most important nations, including the world’s sole superpower at the time, Rome. They reduced even something as innately sacred as the office of High Priest to a mere stepping stone capable of leading to still greater authority. As they became more and more entangled in their own intra-familial struggles, they relied increasingly on generals who themselves had a wide variety of personal agendas to pursue. And then they crossed the line and, in an act of spiritual madness, made themselves the kings of Israel despite the fact that they had no justifiable claim to the crown.

Public service is a burden and a privilege. Our greatest political leaders have always been people who saw that clearly and who allowed themselves to be saddled with the millstone of public office out of a sense of personal honor and deep patriotism. We have had American leaders like that—Abraham Lincoln, I believe, was such a man—and our nation is the richer and better for their service. But the larger story of Chanukah—the one we never tell in Hebrew School—has its own deeply monitory lesson to teach: that greatness in governing is a function always of personal character…and never of mere opportunity.