Thursday, January 30, 2014

The Life of Objects

Earlier this year, I read Susanna Moore’s novel, The Life of Objects. It’s an interesting book—spare in its prose, sharp in its focus, devastating in its author’s willingness to describe the horror and terror faced (for once, at least in my reading history) by “regular” Germans living under Nazi rule. (The ones in the book, fabulously wealthy collectors of unimaginably valuable art, are second-level victims of the Nazis’ brutish barbarism, although surely not in the sense that we would normally use the word: they survive, mostly, and their “stuff” mostly survives too.) At first, I found myself offended by the title, suggestive (as it is clearly meant to be) of the preposterous notion—as insulting to “real” survivors as it is demeaning to the dead—that these gorgeous things that the Metzenburgs own were “victims” of the Nazis too, the latter depicted here not solely as the murderers of countless innocents but also as boors who had no respect for objets d’art, not for their value and certainly not for their beauty. That’s what the author has to say about the Nazis? That they had bad taste?

But then I found myself reflecting on the book in the weeks after I read it. (I really do recommend it to you, not least of all because of the portrait drawn of its protagonist, an innocent Irish girl who signs on as a domestic servant in the Metzenburgs’ palatial home and at their country estate, mostly for the sake of escaping the doldrums of her humdrum life in a town on the west coast of Ireland.) But what stays with me after all these months is not that portrait, or at least not that portrait per se, but rather the author’s more basic premise: that objects, that things, have lives of their own. And that they are not merely the props in the plays that are our lives, but—in some real, non-Disney-esque sense—players themselves that come on stage and then disappear not because they must or can, but because they do. The notion that young Beatrice grows through her admittedly horrific wartime experience through her association not only with her employers but also with their things…and that those things are not mere trinkets but family members in their own right (and, believe me, I know how odd that must read to people reading this who haven’t actually read the book)—that is the idea that’s stayed with me.

Shelter Rockers all know that I have my own odd relationship to the world of things. I have returned many times in my preaching to the question of what happens to the things of the world once the people holding onto them let go either for a moment or for good. Like everybody, I had four great-grandmothers and eight great-great-grandmothers. Those dozen women must all have had wedding bands, but the rings have all vanished. Surely, no one would sell a mother’s wedding band for a few dollars! Were my great- and great-great-grandmas buried with their rings on their fingers? But even if they were buried wearing them, which I doubt, then where are their k’tubbot, their wedding certificates? I have my mother’s parents’ k’tubbah framed nicely and hanging in our dining room. Our own k’tubbah, which I made myself in one of my rare forays into the world of graphic arts, is hanging in our living room. But where are my other ancestors’ k’tubbot? I had four grandparents, eight great-grandparents, and sixteen great-great-grandparents: two, four, and eight couples, respectively. That would yield a total of fourteen k’tubbot, of which all but one are gone. (I don’t have my own parents’ either, which I also find semi-amazing.) Where did they all go? How can it be that, of all those households, not a stick of furniture, not a book, not a framed piece of needlepoint, not a ring or a bracelet or a string of pearls, not a k’tubbah or a prayerbook or a tallis…how can it be that all of it is simply gone from the world? And then there is the horrifying corollary of that thought: will some one of my descendants in the 23rd century say the same of Joan and me, that we must have had a k’tubbah, that we must have had wedding bands, that we must have had a whole houseful of stuff? You see where I’m going.
And it’s not the most encouraging place to go in terms of the fantasy I both cherish and foster: that I personally will make a mark, that I will always be remembered…at least by my own descendants…and that my books will remain a cherished part of our family’s heritage even long after I personally have drifted off to the World of Truth, or wherever.

Last week, we had a flood in our basement. It was one of those bitterly cold days. Everything was frozen solid, including, alas, the pipes in the wall behind our washing machine, which burst when the water inside them froze and expanded. By the time I realized what had happened, there were at least six inches of very cold water on the floor of the basement. Not having any good idea what to do, I called Andy Canle, who manages our custodial staff and who is enormously handy and capable with respect to all physical things…including burst pipes. He came over, knew exactly how to turn the water off, and got the water, at least, to stop rising. Then he went to get a pump and to phone the plumber, and I myself was left at home alone to take whatever steps I could to save what I could. There was a lot of stuff! Some of it was on the floor, but the water rose up into the lower shelves of all those many cupboards in the basement in which are stored the stuff of our lives that isn’t in current use.

The Pesach dishes did fine. (We wash them each year before using them anyway.) But the rest was a grab-bag. The TV was ruined. (No loss there—why did we even have a television set in the basement?) The piano—the piano of my childhood that used to live in my parents’ living room—came away unscathed. The fold-out couch will be fine, I think, once it dries out completely. But the rest of the rest, the more ephemeral stuff…Joan’s teaching files from back when people xeroxed stuff and put it in manila folders for later reference, hundreds of floppy disks (both the 5.25 inch and the 3.5 inch variety) from more than a decade of my own research, thousands of pages of galleys for Siddur Tzur Yisrael that I never quite got around to dealing with, files with mortgage documents from our houses in British Columbia and California, cartons of papers relating to novels I published years ago or hoped to publish one day, even some papers and bills from our first New York period (the one that followed our marriage in 1980 and lasted until we left for Israel in 1983)…all of it floating on a sea of ice-cold water and all but daring me to throw it all out.

I passed the test. Out it all went. I don’t even own a computer with a floppy drive, so who needs those disks? (Whatever data was worth saving has been securely—I hope—stored in the cloud for years now.)  In the unlikely event that Joan takes up Hebrew School teaching again, whatever teaching materials she could possibly need are all on-line. No one—not even my most intensely investigative biographer (hah!)—will be interested in mortgage documents relating to the purchase of 10240 Sandiford Drive or 10462 Kozier Drive. And so…out it all went in one sodden mess. And somewhere in those many black garbage bags soon to be sitting on our curb lay the answer to my question: this is what happens to people’s stuff. Life happens. Floods happen. Burst pipes happen. The world keeps turning, but everything else comes and eventually goes. Heraclitus, it turns out, was right—it’s all in flux, the world is an endlessly flowing river, it’s not just time that passes by never to return but the things of this world as well. Including k’tubbot. Including floppy disks. Including wedding bands. (That still seems unlikely to me…but then where are all those rings?) Definitely including old mortgage documents. And definitely definitely including the notes for unwritten or partially written novels.

The Torah mentions almost en passant that shortly after the manna began to fall from heaven, Moses told Aaron to take up a measure of manna and to put it on display “before God,” so that subsequent generations would be able to see it and develop appropriate feelings of gratitude towards the God who sustained their forebears in the wilderness. And so, the Torah also reports, Aaron did, putting a container of manna right before the Tablets of the Law in the Holy of Holies. Other than the stone tablets themselves, how could there be a more sacred relic? And yet it somehow disappeared. As did the tablets themselves and the wooden ark that contained them. As did the flask of oil used to anoint the priests of Israel. As did Aaron’s rod, the one that sprouted almond blossoms to prove that it was he and his descendants whom God chose to be the priests of Israel and which Scripture specifically says was also kept in the Holy of Holies before the Tablets of the Law. The holiest of things…and all gone into the mist of time past. (The Talmud says clearly that the ark itself was hidden away by King Josiah…but where exactly it went and why it was never retrieved—those are questions that no one has ever answered.)

If relics of the greatest intrinsic sanctity could somehow disappear, then anything can. As I was wading in my bare feet in our basement (not the smartest idea given how cold the water was, but what else could I do?) and watching the things of our life simply float past me on their way to becoming landfill somewhere, I felt distressed and very unhappy. Later on, though, I felt better…and now it seems to me that this is how it all really does work. It’s not amazing that most things disappear. That I have my grandparents’ k’tubbah hanging in our dining room, that’s what’s amazing! That Joan has my mother’s wedding band. That my mother’s gorgeous diamond-and-sapphire earrings somehow turned into my daughter’s wedding ring. It’s not amazing that it all doesn’t last. It’s amazing that anything at all does.

I didn’t enjoy the whole experience, but maybe I did manage to learn something from it. I have no need to repeat it. But one thing my mother did actually leave behind is the lesson that this whole incident reminded me to remember: that no experience that teaches you something important should be regretted…even if the experience itself isn’t so pleasant. Being smarter, Mom used to say, is better than being comfortable. Learning a lesson that you will retain permanently is better than avoiding a few minutes of discomfort. It’s not that easy a point to embrace while you’re up past your ankles in frigid water. But eventually, later on…once your feet thaw out and you feel unexpectedly at peace with the world and the ephemeral nature of its things…that’s when my mother’s point seems fully cogent. Knowing how the world works is better than having dry feet. At least in the long run!

Thursday, January 23, 2014

The Last Hanger-On

It’s always interesting when a news story that has clearly been written to elicit one set of emotions in its readers ends up drawing me along in an entirely different direction from the one the reporter writing the article clearly intended. Did the reporter see something in the story at hand that I myself missed? Or is the opposite the case, and did the reporter simply miss some part of the story (or, more likely, some part of the back story) that seems crucial, or at least pertinent, to me in terms of what the larger story means or should mean? I suppose in different cases the answers to those questions will be different. But I know what I think with respect to the news story many of you may have noticed about the death last Thursday in Tokyo of Hiroo Onoda.

Onoda, whose fifteen minutes of fame came and went forty years ago, was the Japanese soldier who remained at his post in what he assumed was still the Japanese-occupied Philippines for twenty-nine years after the war ended. And there, presumably, would he have stayed for years into the future had he not been located by an enterprising Japanese student who set out to find him in 1974. (He had been declared officially “dead” in 1959, but this student, Norio Suzuki, felt the records relating to his death were so filled with inconsistencies and unlikely suppositions that he set out to see if he could locate either the man or his grave. He found the former and brought him back to Japan.) Onoda received a hero’s welcome when he arrived home, which is not that surprising, but also received a full pardon from then-Philippine president Ferdinand Marcos, which really is quite surprising given that, in the course of his three decades in the jungle, Onoda had killed or participated in the killing of about thirty Filipino villagers who made the mistake of coming too close to his hiding place and whom Onoda or one of his colleagues took for enemy soldiers or agents. There had originally been four of them. One surrendered in 1950. Two others were shot and killed by Filipino police officers who were searching for them and who returned fire when they were fired upon, one in 1954 and one in 1972. Eventually, there was only Onoda. And he hung on for another two years until he was finally located by the student and brought back to Japan. He was fifty-two years old then and ninety-one when he died last week.

That motif—of the Japanese soldier who stays at his post for decades, either not having heard or not having believed that the war was over—is famous. There were others. Shoichi Yokoi, for example, remained hidden in the jungles of Guam for twenty-seven years rather than surrender to American forces. He was finally captured—not by American soldiers or the Guam police, but by two American hunters who surprised him while he was setting a fish trap in a river near his hiding place—and returned to Japan in 1972, at which time he made many of his co-citizens uncomfortable by speaking openly about his personal sense of shame at having returned home alive when so many of his fellow soldiers died attempting to prevent the American liberation of Guam. (The Japanese invaded Guam on December 8, 1941, the day after the attack on Pearl Harbor, and held control of the island until American forces prevailed at the Battle of Guam in July, 1944.) And he spoke without any hesitation about the fact that he was sustained during his decades of solitude not by the hope of seeing his family or his homeland again, but by his unwavering sense of duty to serve the Emperor of Japan, a concept that by 1972 seemed beyond alien to most of the co-citizens who lined the highway to give him a hero’s welcome upon his return home. It was, in fact, the story of his return from Guam that prompted the research that led to the search that eventually led to the discovery of Hiroo Onoda in the Philippines. Eventually, one final hold-out, a man named Teruo Nakamura, a Taiwanese who enlisted as a volunteer in the Japanese Army, was discovered on the Indonesian island of Morotai in 1974 by the Indonesian Air Force. He skipped, probably wisely, the whole “last-surviving hold-out to return to Japan” thing and instead chose to be repatriated to Taiwan, where he lived quietly until his death from lung cancer a few years later in 1979.

I grew up with these stories. Since the Professor too died last week—I mean, of course, Russell Johnson, the actor who played the Professor—it seems reasonable to start by remembering the famous episode of Gilligan’s Island aired in 1965 in the course of which the castaways come into contact with a Japanese soldier still at his post on some mini-submarine that washes ashore on “their” island. (The plot line sounds a bit strained in retrospect. But I was only twelve in 1965 and it was cogent enough for me!) But both before and after Gilligan, that motif of the Japanese soldier who hasn’t heard that the war was over was a staple of the American entertainment industry. Whole movies were built around it. (I’m thinking primarily of The Last Flight of Noah’s Ark with Elliott Gould and Geneviève Bujold, but there were others.) Clearly, we were supposed to laugh. The notion, after all, that a soldier would still be obeying his last orders decades after the war was over was intended to be funny. (For the record, Onoda did find some of the leaflets dropped by American forces over the Philippines announcing that Japan had surrendered and the war was over, but he took them for propaganda and refused to believe that they were true. Hardy-har-har!) The whole concept of duty taking precedence over one’s personal wish to return home or to be reunited with one’s family—that one would keep one’s word no matter what and decline to abandon one’s post until ordered to do so by a superior officer, and not by some spurious leaflet dropped from the sky by the enemy—that was the part we were supposed to find amusing. It was funny on Ensign O’Toole!

But maybe it’s not actually that funny. We live in an age of conditional loyalties, an age in which people take promises as expressions of hope rather than iron-clad obligation. Even the pledge of fidelity to a spouse is considered by most to be more than elastic enough to stretch around the occasional act of infidelity without necessarily breaking. The promise of faithfulness to an employer, to a mentor, to a friend, to a sibling…all these are deemed today by most to constitute desirable but optional virtues rather than truly unbreakable bonds. When the congregation hears the Kol Nidre solemnly intoned aloud on the eve of Yom Kippur, it’s the rare congregant who truly feels devastated by the realization that, yet again, he or she has failed to live up to commitments undertaken freely…and just as freely abandoned when the toast seemed more thickly buttered on the side of non-compliance. We mean it when we give our word…but we also don’t mean it, not in the way it could or should mean something to people whose word truly is their bond. And then, when the news features someone who took an oath to serve and then spent decades on his own doing just that, we find ourselves more amused than impressed. He stuck it out for how long without abandoning his pledge to obey his orders? What a fool! At least the people on Gilligan’s Island had no choice….

When contemplating these hold-out soldiers, it would be easy—even satisfying—to focus on the horrors perpetrated by the Japanese during the course of the war: on the Rape of Nanking, on the Bataan Death March, on the more than two thousand killed at Pearl Harbor, on the Manila or the Kalagong Massacres, or on the countless thousands of women chosen for indescribable degradation as “comfort women” for the use and abuse of the Emperor’s troops. Viewing these soldiers hanging on in the jungle through mental images of Pearl Harbor or Nanking yields the sense of them as single cells of a malignant cancer that was almost entirely eradicated through chemotherapy and yet which somehow managed to remain hidden in some lonely crevice of tissue until they were finally located by some enterprising oncologist who knew where to look. But there’s also the possibility of considering these hold-outs in terms of their unwavering dedication to duty, of their sworn obligation to serve their country until formally relieved of that obligation, of their willingness to subordinate their own dreams to the single goal of keeping faith with a commitment they accepted and never felt free to step away from…even once it became clear that that commitment was going to entail not years but decades of their lives.

My sense is that very few Americans think of service to the nation as a sacred calling. We have no compulsory military service, so those who do serve are by definition volunteers. There is, therefore, a voluntary sort of feel to the whole enterprise of serving in the military, and that creates a strange sort of backdrop against which to read the story of Hiroo Onoda, a man who took an oath to serve his country and then kept it. Pledging loyalty to one’s nation is, most would say, a virtue.  We even mean it when we say that, as we regularly do. But that we think it grist for the comedy mill when someone pays the big price for remaining true to that pledge does not speak well for us. Really, not very well at all!

Thursday, January 16, 2014

Ariel Sharon

I find myself unexpectedly affected by the death earlier this week of Ariel Sharon, the eleventh prime minister of the State of Israel and one of its greatest military strategists.

It’s true that Sharon has been gone from the Israeli political scene, and from public life itself, since suffering a massive stroke in January 2006, so it’s not as though his death will alter any part of the day-to-day scene in Israel in any material way. But, nevertheless, his death doesn’t feel inconsequential to me at all. Just to the contrary: his passing feels like a turning point to me, despite his absence from the public arena over these last years. Sharon was an old-school leader of his people, one who led by example and who had—and in spades—the courage of his convictions, but also one who was able to grow intellectually and politically in the course of his years in power, and who had the inner strength to allow that growth to alter his opinions and his policies as he grew older. In fact, it was precisely that capacity for inner growth that set Sharon apart from those who have followed him in office; he was a great man not because of his stubbornness, although he was by all accounts a very stubborn man, but because of the elasticity of his intellect and his ability to develop intellectually and morally, to morph forward into ever-more-refined versions of himself, and to see things differently as the light shifted.

The outlines of his life are well known. Arik, as he was universally known in Israel and by many abroad, was born in 1928 in what was then British Palestine. His parents, Shmuel and Vera Scheinerman, were immigrants from Russia who had settled in Kfar Malal, a village in central Israel named for Moshe Leib Lilienblum, one of the earliest Zionist philosophers and theoreticians. He was still a teenager during the War of Independence, but he participated in the Battle for Jerusalem and ended up as a platoon commander. Ben Gurion himself bestowed the name “Sharon” on him, partially because of its assonance with Scheinerman and partially because Kfar Malal is on the Sharon Plain, but also as a way symbolically of detaching him from his family’s past and charging him with the fulfillment of his destiny to lead his people into the future. 

When he was still in his twenties, he became the founder and commander of Unit 101, a Special Forces unit of the IDF charged with combatting terrorism. By 1956, when the Suez War broke out, Sharon was commanding a brigade of paratroopers and led the successful effort to seize the Mitla Pass in the Sinai from the Egyptians who were defending it. Sharon became known as an extremely aggressive, strong-minded strategist, but many of his efforts were clouded by controversy regarding his tactics, the losses (on both sides) he was prepared to find acceptable, and his general inability to subordinate himself to his superiors. Nevertheless, Sharon emerged as one of Israel’s most admired military leaders, a man whose entire life was subjugated to the single goal of making the citizens of Israel safe and their nation secure.

He played a major role in the Six Day War as well, commanding Israel’s largest and most powerful armored division in the Sinai and playing a key role in securing the Sinai for Israel in the course of a war that lasted for less than a week. But Sharon’s greatest hour came during the Yom Kippur War in 1973, when he led an effort he himself had devised simultaneously to incapacitate Egypt’s Second Army and to encircle the Third, thus effectively neutralizing the ability of both armies to continue participating in the conflict. This was later understood by most Israelis to constitute the turning point of the war, and Sharon emerged as its hero, as the single military leader who had done the most to secure victory.

But Sharon’s future lay not in the military arena, but in politics. He was elected to the Knesset in 1973, but resigned the following year. By 1977, he was Israel’s Minister of Agriculture. But it was in 1981, when Sharon became Israel’s Minister of Defense, that he really became a key player.

The following year, Sharon personally masterminded the 1982 Lebanon War, which was successful in that it effectively ended the PLO’s ability to function as a kind of state-within-a-state in Lebanon, but which also led to the massacre of civilians, almost all of them elderly men, children, or women, in the Sabra neighborhood of Beirut and the adjacent Shatila Refugee Camp. The Israelis did not perpetrate the massacre, which was carried out by Christian Phalangist troops, but the Kahan Commission later determined that Sharon knew that the Phalangists were entering the neighborhood and the camp and so bore responsibility “for ignoring the danger of bloodshed and revenge and not taking appropriate measures” to prevent either. As a result, Sharon was forced to quit his position in the Defense Ministry, although he remained in the Begin government and in subsequent governments in various other capacities: as minister without portfolio, as the Minister for Trade and Industry, as Minister of Housing Construction, as Minister of National Infrastructure and, eventually, as Foreign Minister. And then, in 2001, Sharon was elected Prime Minister by an overwhelming margin, defeating Ehud Barak by winning sixty-two percent of the vote. And he remained in office until the end of 2005, when he suffered the first of two strokes that permanently ended his career. He remained in a coma until he died last Saturday and was buried next to his second wife, Lily, the sister of his first wife and the mother of his two surviving children. (Sharon’s oldest child, a boy named Gur, died in 1967 after being accidentally shot by a family friend.)

Sharon’s greatness lay in his ability not to be enslaved to his own past. In the early part of his career, he was entirely convinced that the only hope for a secure Israel lay in the use of brute force to make attacking Israel so unpalatable to its enemies that they would eventually desist. He was a warrior in the traditional sense of the word, one who felt that peace can only come from the defeat—and particularly the military defeat—of a nation’s enemies. That some of those enemies fought in regular armies and could be engaged on the battlefield while others chose the path of terrorism and had to be combatted, so to speak, on their own terms and on their own turf—these were mere details that had to be taken into account in planning a successful path forward towards the eventual resolution of conflict through the annihilation of the forces arrayed against one’s country. He was, in that sense, Israel’s Patton—a military man who believed in gaining the upper hand through aggressive offensive action against the foe wherever that foe may be found. What General Patton might have become had he lived—he died in 1945 after a tragic automobile accident near Mannheim, in occupied Germany—no one can say. But Sharon grew past that part of his own past and, when he was through being Patton, he became Eisenhower: a fierce warrior who eventually came to realize that peace in the world grows not from the annihilation of the enemy, but from the resolution of conflict in a way that makes former adversaries able to live together in peace.

By the time he became Prime Minister in 2001, Sharon’s sense of how to create a secure Israel had changed dramatically. He publicly spoke about the reasonableness of the Palestinians having a state of their own and openly endorsed the so-called Road Map for Peace put forward by the United States, the United Nations, the European Union, and Russia and supported enthusiastically in its day by President George W. Bush. When the plan stalled, Sharon proceeded unilaterally to press forward with the Israeli withdrawal from Gaza, which involved the forcible expulsion of more than nine thousand Israelis from twenty-one settlements in Gaza. For anyone else, this would have constituted political suicide. Joan and I were in Israel in the summer of 2005 as this was going on and we personally witnessed the extreme emotions, both positive and intensely negative, that the Gaza withdrawal stimulated in every corner of the country. Sharon survived a leadership challenge led by Israel’s current Prime Minister, Benyamin Netanyahu, but ended up leaving his own party and forming a new one, Kadima, in the fall of that year. New elections were called, which Sharon was widely expected to win, a victory that would have given him a clear mandate to continue on with his plan to withdraw from most of the West Bank. (The details of that plan have only recently been made public; interested readers viewing this electronically can click here to read an interview with Rafi Eitan, a now-retired high-ranking Mossad official and later a government minister, who knew Sharon personally and was privy to his thinking in the months before he was felled by his strokes.)

It is precisely in his ability to grow intellectually and to act forcefully, not on opinions once held, but on the way he had come to understand Israel’s best chances for a peaceful future—in that remarkable elasticity lay Sharon’s greatness. He was a man who had nothing to prove. His entire life was devoted to his people and to his country. He was a big man—physically huge, personally fearless, politically daring, and the very embodiment of military and political courage—and he was, in my opinion, one of the greats. He erred repeatedly, but he learned from his mistakes and seemed willing to go where none had gone previously when he saw that the path previously chosen was not leading his country where he wished to see it go. In my opinion, that is what it truly means to be a hero.

I have written to you many times now about the role of the hero in modern life and the peculiar way the concept of heroism itself has been debased and eroded in our day. But that does not mean that there are no heroes in the world, only that modern society has chosen over and over to award the designation to mere opportunists whose “bravery” consisted mostly of reckless efforts at self-aggrandizement. Sharon was not in that category. He was, if not the “Lion of God” or the “King of Israel” as his supporters liked to call him, a brave man who weathered endless controversy for the sake of doing what he believed to be best for the State of Israel and for its citizens. Israel could use more leaders like that. And so could we!

Thursday, January 9, 2014

Remembering the Future

Memory—the ability to identify things we perceive through our senses with things that we have previously seen or heard (or touched or smelled or tasted)—is the sea in which we swim from the moment we are born until we draw our final breaths. It’s how we know what we are looking at, how we identify what we are hearing. It’s how language works and how culture does as well: the ability to read a book or to admire a work of music or art rests in our ability to contextualize the experience by bringing our memories to bear in the interpretation of the data at hand. Yet, for all that reliance on memory is basic to our ability to decipher and interpret the world, it is still more than possible to ignore its existence entirely. Indeed, somewhat in the way air is both crucial to human survival and invisible, memory could not be simpler to look past. Surely no jury would ever convict someone of perjury for saying on the witness stand that he saw an accused person in a specific place and at a specific time when what is really true is that, at the specific moment that he is giving his testimony in court, he remembers having seen the accused in that specific place and at that specific time. We take what we remember of the past to be what happened; the real difference between the past and the future seems to most of us to rest precisely in that we can remember the one and only fantasize about the other. But whether our memories are more real than our fantasies…that is an entirely different question!

These were the thoughts that came to me the other week when I read an article in Nature, the weekly journal of science, about the recent work of one Marijn Kroes, a neuroscientist at Radboud University Nijmegen in Holland, who, together with a team of colleagues, has learned how to use electroconvulsive therapy (called ECT or sometimes electroshock therapy) not merely to treat depression and other varieties of mental illness, but actually to target and erase—the word they use is “disrupt”—patients’ memories of particularly disturbing events in their past. (If you are reading this electronically, you can access the article by clicking here.) Erasing memories obviously cannot undo the past. But if it can seriously improve the future…then the only reason not to jump on the bandwagon would have to be because there is something ignoble, perhaps even immoral, in altering what we know of the past merely to make ourselves feel better. But is there really any such thing?

The whole procedure has to do with something called “memory reconsolidation,” a theory which posits that the brain actually removes memories from its storage banks for use, then rewrites and refiles them for subsequent use. There is, therefore, a period during which the memory in active use is not actually anchored in its place in the brain’s memory bank. And what Kroes and his colleagues have discovered is that, by artfully timing the ECT treatments, it is possible to alter, or even totally to erase, memories that a patient finds devastating or distressing. A neuroscientist based at Mt. Sinai Hospital here in New York, Daniela Schiller, is cited in the Nature article as describing the results of Kroes’ work as “compelling evidence…that a window of opportunity exists to treat bad memories,” by altering or disrupting them. Whether the therapy could be fine-tuned permanently to eradicate memories that have been called up from the brain’s memory storage area by preventing them from being refiled—in effect creating a future for patients in which they will not recall horrifying events that befell them—remains to be seen. But the idea itself that the past can be “fixed,” thus altering our “the past is history/the future is fantasy” approach to reality rather dramatically, is what I find so fascinating to contemplate.

Obviously, forgetting some horrific memory could never mean that the event it references didn’t actually occur. But what will occur, possibly, is that someone unable to move forward in life because of some traumatic incident will no longer be held back by the memory of that event, a memory that will no longer exist. At first blush, that sounds like a great thing…for the patient in question and perhaps even for society in general. Or is it?

I didn’t see Michel Gondry’s 2004 movie, Eternal Sunshine of the Spotless Mind, which starred Jim Carrey and Kate Winslet and which won the Oscar for best original screenplay that year, but I’ve now read all about it. The plot centers on two people who meet on the Long Island Railroad and strike up a conversation, and then a relationship, years after both have undergone a procedure to have the recollection of their two-year-long, miserably unhappy relationship erased by a company that has perfected the art of targeting unhappy memories and eradicating them from the brain’s memory holdings. The plot sounds complicated—perhaps it’s easier to understand when watching the movie than when reading the plot synopsis in a Wikipedia article—but the basic principle is that two people who were unhappy are given a chance to re-invent their relationship unburdened by the recollection of their common past. On the silver screen, it sounds ideal: the potential for future happiness grows directly from the fact that unhappy lovers can no longer recall anything of their former unhappiness and are now free to begin again. I’d actually like to see the movie now that I’ve read about it, but the question at the heart of the matter remains difficult for me to answer. Does the past exist as more than recollection once it passes from the present into the past? If it doesn’t, then erasing memory is tantamount to altering the past…but is that a good thing? That’s the question that troubles me and which I thought I’d write about this week.

Let’s imagine that someone was abused sexually as a child to the point at which he or she, even in adulthood, cannot form lasting relationships or find real happiness in love. If the abuser were to be brought to trial, the victim would have to testify. But what if the abuser has died in the interim so the victim can be certain that there will never be such a trial?  Or what if the abuser has already been tried and found guilty, so that there could not possibly be any need for future testimony? Would it be a kindness to erase such an awful memory and, in so doing, to offer to someone crippled by an unhappy past the possibility of moving forward in life unimpeded by the recollection of bad things that once happened? Or would that not be a kindness at all, but merely a band-aid solution that would create some temporary relief from paralyzing memories but which would also make it impossible for the victim ever successfully to work through the factors holding him or her back from finding happiness in life in a truly meaningful way?  It’s not that easy a question even to formulate, let alone honestly to answer.

Nor can I contemplate this question without referencing the Shoah. The brutality, the violence, the unspeakable cruelty, the unimaginable suffering that the Nazis inflicted on their victims—survivors wrestle with their recollections of these horrors every single day of their lives. Some have learned to live with their memories and, by successfully wrestling the worst of their demons to the ground, have come to live normal, happy lives. But what of those who, even all these years later, are still paralyzed by their recollections? Would it be a service or a disservice to erase those memories? Would that be setting such people free…or would it effectively prevent them from ever healing by denying them the possibility of working past the trauma by confronting it purposefully and meaningfully?

Years ago, when we lived in Vancouver, I heard a CBC interview with a child survivor of the genocide in Rwanda, a reign of terror directed against members of the Tutsi tribe that cost as many as 800,000 innocents their lives. The child described the murder of his entire family in such flat, even tones that he sounded as though he were reading from a textbook, and the lack of emotion in his voice as he described the specifics of his parents’ and siblings’ torture and murder was beyond chilling. He sounded calm, but there was a kind of deadness to his voice that, even after all these years, stays with me. From time to time, I wonder what became of that boy. He was a little boy in 1994 when the slaughter in Rwanda took place. Now he must be in his mid-twenties, perhaps a bit older. Has he found any peace? Has life returned to his voice? Has he moved past the horror of his own memories to find the courage to build a life not rooted solely in the recollection of terror and horror? I don’t remember his name. I have no way to trace him. I’m not even sure exactly when the interview took place. But I wonder…would he be better off not remembering the murder of his own parents—which he witnessed from some sort of attic in which he was hiding under some blankets? Or would that be murdering them again, this time not by taking them from the world but by taking them from the memory banks of the sole witness to their deaths? What exactly is that poor boy’s responsibility to his parents? And where exactly does that responsibility segue into his responsibility to carry on their legacy by getting over his own misery and creating a family of his own, one in which he can grant his parents posthumous grandchildren to carry their legacy forward into the future?

Every Shelter Rocker knows the lesson of the Baal Shem Tov inscribed on our Shoah memorial to the effect that the path to redemption lies through remembrance. That sentiment supposes that memories may best be understood collectively as the form the past must take if it is suitably to provide a platform for building a future. In national terms, that is surely so. But is it as true with respect to individuals? Do I have a moral obligation to remember everything that I have ever witnessed, everything that ever happened to me? We think of the past as real and the future as unreal, but the past doesn’t exactly exist either…except within the realm of memory and artifact. But artifacts are mute, and our sense of history is thus solely a function of our ability to remember the past. We must, therefore, remember…if we wish the future to be built on the past in a positive, productive way. Forgetting the past would thus be the mental equivalent of hiding from a bully rather than facing one’s tormentor head-on, of lathering salve over a wound rather than cleaning it with the kind of astringent that stings terribly when applied but which also leaves it clean and ready to heal.

Like everybody else in the world, I have some memories that even today cause me pain—words spoken in haste, actions I took without thinking through their implications, moments in which I acted basely or in direct contradiction to the virtues I claim (when anyone will listen) as my own. Some of these are merely irritating to recall, but there are some that even now make me cringe with regret. In this, we are all alike. But the way to grow forward is not to forget, not to “disrupt” our memories of even the least happy past. The way to grow forward in life is to face the pain of the past—of both the self-inflicted variety and the kind inflicted upon us by others—and, in so doing, to find the courage to become the men and women we wish ourselves to be. That is growth! And it is precisely in that sense that remembrance can lead to redemption…of the individual and of society in general.

Friday, January 3, 2014

Too Much Information

It’s always seemed to me that clarity of vision—seeing things as they are, not falling prey to self-serving delusions, having a firm grasp of the distinction between reality and fantasy—that knowing how things are out there in the world is the ultimate sign of mental wellbeing and stability. And the inverse, I’ve also always thought, must be true as well: one of the hallmarks of mentally unstable people is precisely their inability to distinguish clearly between reality and fantasy, between fact and fiction. All that being the case, a clear sense of how things are—and of where we stand in terms of the most important of life’s variable factors—should be something to cultivate in ourselves. And to encourage in others as well, particularly in young people just embarking on the kind of self-invention that leads to successful adult lives.

I have, however, completely changed my mind—or almost completely—in the wake of my discovery of a machine that can ably assist any of us in doing just those things I’ve always taken to characterize mentally stable, healthy adults. To my surprise, I want none of it!

I am thinking of a new alarm clock. I don’t actually need a new clock. I already have one on my night table, and it’s a nice one too, with big red numbers. Joan has one on her night table as well, and we have any number of others throughout the house. Plus our phones have alarm clocks built into them, as does Joan’s iPad. The last thing we need, in fact, is a new alarm clock. But this clock to which I refer is not just any alarm clock. It is an alarm clock in the sense that it tells time and has an alarm feature that you can set to go off at a particular time. But that is hardly what makes this specific clock special. Designed by Alexis Bedoret, Ryan Gury, Todd Sussman, and Al Kelly of the Chicago-based company FIG and Dan Sperling of PHAW Architectural Woodworks, a company based in the Bronx, this alarm clock also has three other features, each more horrifying than the last.

If you program it properly and allow it to stay in touch through the ether with your bank, with your Facebook page (I’m sure the designers are young enough never to have met anyone who doesn’t have one), and with your insurance company, the clock will wake you every morning with three pieces of information.

First, you can wake up each and every morning to the specific number of dollars you have in the bank. You can choose which accounts to include and with which institutions the machine should check in the minutes before the alarm goes off at whatever time you’ve set it for. You can decide whether you wish to include funds segregated in pension funds or in IRAs or similar investment accounts. You can include monies you have squirrelled away in whatever savings or brokerage accounts you wish. And then, when you’re all done—assuming you have a savings account with $12,741 in it and that’s the only account you’ve chosen to include—you can wake up to that very number glowing at you from the clock’s display.

But the machine hasn’t even started. Not really! Because the next thing it will do is tell you exactly how many friends you have. It can simply access the number from Facebook. Or you can offer some alternate on-line catalogue of people you know, for example your e-mail address book or the contacts list from your phone. But one way or the other, if whatever source you’ve set the clock to check yields 761 names, you will then wake up to that number as well.

And then, if you aren’t depressed enough—let’s say, for example, that you have a huge amount of money in the bank and more friends than you could possibly ever keep up with—the machine will then tell you how many days you have left to live.

Obviously, no one knows when his or her time is going to be up. But there are in this world all sorts of actuarial tables that help insurance companies decide for how long and for how much they should reasonably be willing to insure your life without inadvertently bankrupting themselves. So you feed all your data into the machine—there’s a whole computer program, I’m sure, that makes this doable. You tell it how old you are, whether you’re male or female, how old your parents and grandparents are or how old they were when they departed this world for the World of Truth, whether you smoke, how often you drink alcohol, how much you weigh, how tall you are, what diseases you’ve had over the years, what surgeries you’ve gone through, and a million other details. And then it tells you the number of days it reckons you have left.

Is that a long time? 16,561 days is 45.37 years. If you’re seventy, this would be a very welcome piece of information to wake up to.  If you’re twenty, not so welcome!  Depending on where you fall on the scale between twenty and seventy…that’s how delighted you’ll be to jump out of bed in the morning knowing that tomorrow morning the machine is going to tell you that your actuarial tables predict a future of only 16,560 days. You could take it as welcome encouragement to make every day matter, to do some good every single day of your life. But why is it that I do not think that is how most people would respond to this specific piece of information being foisted on them even before they’re out of bed in the morning?
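For readers inclined to check the arithmetic above, the conversion is simple enough; here is a minimal sketch in Python (the 365-day year is the simplification the figure above implies, ignoring leap days):

```python
# Convert a remaining-days figure, like the clock's, into fractional years.
# Assumes a flat 365-day year, which is how 16,561 days comes out to 45.37.
def days_to_years(days, days_per_year=365):
    """Convert a count of days into (fractional) years."""
    return days / days_per_year

print(round(days_to_years(16561), 2))  # prints 45.37
```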

I personally will not be buying a FIG alarm clock. (For one thing, it would be a waste of money, since I’m sure I would get out of bed that first morning, throw the damned thing out the window, and be done with it.) But now that I find myself thinking about it…I find myself wondering why, if I truly believe that knowing how things are and where you stand is such a mentally healthy, adult thing, and if I truly do think that living in a delusional fantasy world is not a good thing, I so vehemently do not want to know how much money I have in the bank, how many friends I have accessible to me, and how long I have to do all the various things I keep telling myself I will definitely get to…eventually. That is the question I wish to write about today.

I heard an interesting interview on NPR the other day with Roy F. Baumeister, the president of the Society for the Study of Motivation, an interdisciplinary organization of researchers in the field. (You can hear it too, if you are reading this electronically, by clicking here.) He was discussing this FIG alarm clock and it was, in fact, that interview that made me decide to write to you about the clock this week.

He had a lot to say, but what really caught my attention was his observation that the notion that mentally ill people lack a clear grasp of reality is often belied by the actual data; in fact, it is often mentally healthy people who consciously choose to ignore aspects of reality that would otherwise keep them from moving forward in life. Indeed, he noted, depressed people are often entirely realistic…and more so than their non-depressed co-citizens. They are the ones, after all, who don’t bother writing novels because they know what the odds of getting published actually are. They don’t buy lottery tickets either, because they know the chances of winning are infinitesimal. Knowing that they will eventually leave this world, they find it peculiar to care much about anything—in the end, whatever you earn will end up in somebody else’s pocket anyway! In all these details, they are completely correct. (The chances of being the big winner in the Powerball lottery are 1 in 175 million. By way of comparison, the chances that a woman will give birth to identical quadruplets are 1 in 13 million. I have no idea how to calculate the odds of an identical quadruplet winning the Powerball, but I can tell you they’re not great.)

So it turns out I was wrong. And also not wrong. There’s seeing clearly. But there’s also seeing too clearly. Knowing how things truly are is a good thing…in small doses. In other words, if you’re about to spend a fortune on tickets to Peru, it’s really a very good idea to know in advance whether someone of your age and stage will actually be able to climb all the way up to the top of Machu Picchu. If you’re about to buy a very expensive car, it’s probably a good plan to know if you can afford it before you actually get to the dealer and start writing checks. But there’s also such a thing as knowing too much! Since we are all mortal, we all have a specific number of days left. But what good could possibly come from being smacked in the face with that information even before we’ve brushed our teeth in the morning? Some things it’s best to know…just not in too much detail. It’s hard to say exactly why I don’t want to know how much money I have in the bank, although I’m sure I don’t. I certainly don’t want to know how many friends I have. (And the list of contacts in my e-mail program is certainly not a list of them at any rate, nor would be the list of “friends” on my Facebook page, if I had such a thing.) And least of all do I want to know how much time I have left. It’s enough to know that it’s finite without having to know exactly what my insurance company thinks the specific number of breaths I have yet to take is. Still, knowing that life is finite is a good thing. It’s what propels people forward. In a certain sense, mortality is what makes life meaningful. And precious. It’s just that certain truths are best digested whole and not in bite-sized (or byte-sized) pieces. You really can see some things more clearly at a distance!