Thursday, October 25, 2012

Winter of Some Other World


The other day I heard a scientist on the radio wondering aloud whether colors really exist. It seemed, to say the least, a peculiar question to ponder. How could colors not exist? Aren’t things different colors? But the further we got into the interview the more interesting the question became. Colors, obviously, are some aspect of what we see in the world. So what we really mean when we say that color exists, or that a specific color exists, is that our brains interpret the specific way light reflects off certain objects to yield the conclusion—the entirely intellectual conclusion in that it is rooted in the intellect alone—that those objects all bear the same hue, that they “look” the same. Does that mean that they actually are the same in some absolute sense of the word? Or does it just mean that they look that way to us given the way our optic nerves work and the way our human brains interpret the signals brought to them along that particular stretch of the neurological highway that is our perceptive consciousness in all its fullness…and essential arbitrariness? Do cockroaches, with their tiny little brains, also think red apples are red and green ones green? How could any of us know that?

One day, I’d like a neurologist, ideally a philosophically-oriented yet supremely articulate one like Oliver Sacks, to explain this all to me in a way that doesn’t yield the conclusion that, all perception being midrash of one sort or another, all we know of the world is only what we think we know, not anything we actually do know in any unassailably absolute way. In the meantime, however, I’ve been thinking about these questions and wondering about them on my own. Is all we know of the world a function of how our brains interpret the data? Would it be possible, therefore, for two people to look at the same apple and for one to “see” it as being red and the other to “see” it as green without either being wrong? I have no idea! But the idea itself is truly intriguing. Maybe what we know of the world is precisely a function of what the signals our brains receive mean to us…not in unequivocal, unquestionable terms but in the private, idiosyncratic, deeply personal way that we interpret the world all around us by deciphering sensory data partially in the way our brains are hardwired to work and partially in the way we have trained (or willed) them to work all on our own.

It was in the context of these unsettling thoughts about the way we know the world that I finished Ken Follett’s book, Winter of the World, the second installment in his ongoing “Century Trilogy” series depicting life in the twentieth century. It’s a big book—although at 935 pages it isn’t quite as long as the first book in the series, Fall of Giants—and, like its predecessor, it too features the rich, complicated stories of the slightly intertwined lives of five families, one Welsh, one English, one Russian, one German, and one American, as the members of those families make their way through the fourth and fifth decades of the twentieth century. (The first book featured earlier versions of the same families and covered the World War I era, including the years leading up to the war and the ones following its conclusion.)  I’ve always liked Ken Follett’s books. I believe I’ve read all of them, excluding perhaps only some of his very early ones.  I especially loved The Pillars of the Earth and its sequel World Without End, but some of the earlier books I also liked very much—especially A Dangerous Fortune, The Man from St. Petersburg, and The Key to Rebecca. He writes bigger books than ever now—Winter of the World is his third 900+  page book in a row—but his books are still characterized by the same attention to detail, his trademark (and uncanny) ability to sound as though he knows the places he is writing about intimately, his well-known plot twists and clever characterizations, plus his very skillful way of weaving real historical personalities into an otherwise fictitious narrative.  He’s the real thing, Ken Follett. If you haven’t read him, you’re in for a huge treat.

But for all the author’s talent at depicting places and eras other than his or his readers’ own, the world depicted in Winter of the World is not the one in which I live. Nor is the world he describes anything like the world I personally see when I look back on the years about which he is writing. For one (huge) thing, the Shoah—the prism through which I view everything connected with the Second World War—is almost entirely absent. The story features a long section set in Berlin in the late 1930s, but Kristallnacht is left unmentioned. (1938 itself is passed over without comment—the book skips from 1937 to 1939—which from a Jewish point of view is almost unimaginable.) The sole Jew in the book truly to suffer at the hands of the Nazis is a doctor who is magically sent home from one of the camps and who then, in the style of the saintly Janusz Korczak, selflessly elects personally to accompany the mentally disabled children in his care to their deaths. (His daughter survives but seems successfully to have fled from her own Jewish heritage, which is subsequently mentioned only negatively, with respect to the burden such a heritage constitutes for her as she attempts to assimilate into English society.) The only Jew to belong to one of the five primary families is a Londoner who does not appear to have any specific connection to Judaism or Jewishness other than openly to self-define as a Jew. But that is not exactly my point: the book doesn’t need to be about Jewish people. But to write a thousand-page book about the Second World War in which the Nazis’ war against the Jews is referenced only in passing and never in detail, let alone in graphic detail—that puts the book more in the realm of science fiction than belles-lettres to me, and makes the world it describes into a kind of parallel universe in which many things take place that also took place in the real world but which nevertheless remains a fundamentally alien and unfamiliar version of the actual world in which we all live.

There are occasional references to Nazi anti-Semitism, but the only sustained depictions of politically motivated Nazi violence are one horrific account of the execution of a gay man in one of the camps and two moving episodes revolving around the Nazis’ efforts to murder the mentally disabled under the so-called Aktion T4 program of forced euthanasia that eventually led to the deaths of about 200,000 physically or mentally handicapped men, women, and children. The first account is strictly about children; the second appears to feature Jewish individuals, though the author declines to say so clearly and merely allows readers to draw their own conclusions from the larger context of the narrative. And it is that notion—that the Jewish dead all seem also to fall into a different category of individuals the Nazis hated—that runs through the book and makes it read so strangely to me. The T4 victims are probably Jews (because they are being treated in a Jewish hospital at a time when it was already illegal for Jewish doctors to have Aryan patients), but they are also mentally disabled. The one time readers encounter the Einsatzgruppen, the death squads that murdered entire Jewish communities in the occupied Soviet Union and in Poland, most readers will fail to understand that the Nazis specifically targeted the Jews under their control for annihilation. Erik, a German soldier, accidentally wanders into the forest and, coming across a mass execution in progress, naively asks about the victims.
    Erik spoke to an SS sergeant standing nearby. “Who are these people, Sarge?”
    “Communists,” said the man. “From the town. Political commissars, and so on.”
    “What, even that little boy?”
    “Jews too,” said the sergeant.
    “Well, what are they, Communists or Jews?”
    “What’s the difference?”
The story of a fabulously beautiful garden in Wales being destroyed to make way for a coal mine is told with more feeling and in greater detail.

Auschwitz itself is mentioned only once, and there it is reported that “hundreds” of Jews died there, not the roughly one million Jews who were murdered in that place. It is true that, in the context of the story itself, we are possibly supposed to understand the report of hundreds killed as an example of how wrong, and how understated, the first details regarding the fate of the Jews in Nazi-occupied Europe were. But the figure is never corrected, nor is the camp ever mentioned again. Nor is any other camp mentioned by name that I can recall.

And so I come back to my initial question about colors. Is what we see of the world only what we think we see? I look back on the 1940s and see the Shoah looming so large before my eyes that I can barely perceive the rest of the world just behind and beyond it. I know all about the rest of it. I’ve read the books. I know perfectly well that the Jews were not the Nazis’ only victims. The story about the murder of the gay man is as grotesque as it is horrific. The story of those poor disabled children shipped off to their deaths by a system that saw them only as bills to be paid rather than as precious human lives to be cherished defies belief. Yet, to write about the Second World War but to treat the Nazi war against the Jews as a background detail seems beyond peculiar. Or is that just how fiction works, novelists almost by definition telling one specific story (or, in this case, one set of interlocking stories) and leaving the big picture for historians and scholars to paint in a more even-handed, more comprehensive way? It’s hard to say. The book, after all, is about five families and the specific experiences those specific people had during the war, not about the war itself. No one in any of the families was at Stalingrad, so there’s no account of the battle. None was a Jew in Nazi-occupied Europe, so there’s nothing to tell. Follett hardly has to explain himself to me, of all people. But reading the book was disorienting to me, and profoundly so. Almost like learning that colors are only a function of how we personally interpret sensory data and cannot, therefore, be presumed to represent anything like “absolute” reality.

I’ll definitely read the third volume in the Century Trilogy when it comes out in a few years, but the experience of reading Winter of the World has left me puzzled. I suppose the test of a great book is whether it inspires you to ask just that kind of question…about yourself, about how you view the world, about the way you understand the world to function. If that is so, then Follett’s must be a very good book indeed. To see the Second World War through the eyes of characters to whom the Shoah is a mere detail among many, just one more instance of Nazi badness directed at yet one more group of innocents rather than something unique in the annals of human degeneracy, is upsetting to the point of being disorienting. But, of course, just because others see my apple however they must or do, that doesn’t mean the apple isn’t really red. And, similarly, just because we are prisoners of our own perceptive capabilities, that doesn’t mean the world out there doesn’t really exist…or that history doesn’t.

Thursday, October 18, 2012

Cuba and Iran


I remember the Cuban Missile Crisis only vaguely. I had just begun at P.S. 196 and was in fourth grade. Our teacher, the stately and beloved Mrs. Rose Drayson, did her best to keep a straight face as she had us practice making ourselves safe from an all-out nuclear attack by hiding under our desks with our hands over the tops of our heads, presumably to protect ourselves from falling debris when the school exploded.  Could she really have bought into the idea that the real danger from a nuclear war was going to be from falling plaster? She seemed to! But what did I know? I was only nine and I thought so too.

This month marks the fiftieth anniversary of those terrible weeks in 1962. The world has changed and not changed. American policy towards Cuba is still confused and confusing. A Castro, although not the same one as then, is still running the show in Havana. The Soviet Union, of course, is no more. But the threat that nuclear weapons constitute to the future of the world as we know it—that is still very much in place. And, in its own way, the question of nuclear weaponry still constitutes the best example of the greatest of society’s unsolved riddles: how to deal with scientific knowledge that, for all it poses an ongoing threat to the future of the world as we know it, simply cannot be unlearned, only—if possible—contained and controlled.

The events of October 1962 are surely still etched clearly in the minds of all of us who were old enough then to understand the significance of the events as they unfolded in Washington, Moscow, and the Caribbean. In the summer of 1962, after the disastrous failure of the Bay of Pigs invasion attempt the previous year, Cuba and the Soviet Union began secretly to build missile launching sites in Cuba that could accommodate medium- and intermediate-range ballistic nuclear missiles aimed at the United States. The cover of secrecy was maintained for some time, but then in October of that year, an American reconnaissance flight was able to bring home photographic proof of what was going on. It was this indisputable proof that triggered the crisis. In all fairness, it has to be noted that the United States had already stationed more than one hundred missiles with the capability of striking the Soviet Union with nuclear warheads in the U.K. (in 1958) and in Italy and Turkey in 1961. Nevertheless, President Kennedy made it clear that our government would not tolerate the presence of such missiles a mere ninety miles from Florida and initiated a military blockade effectively to prevent offensive weapons with nuclear capability from being delivered to Cuba. The blockade was set in place. Tensions were sky-high. No one knew where it would all lead, but, in the end, reason prevailed and an agreement, made public on October 28, 1962, was reached. Publicly, the U.S.S.R. agreed to withdraw all offensive nuclear weaponry from Cuba in exchange for an American pledge not to invade the island. And privately our country agreed to withdraw its missiles from Turkey and Italy.

The Soviets kept their end of the bargain, quickly removing the missiles already in place and returning them to the Soviet Union. The blockade was duly lifted soon thereafter, formally ending on November 20, 1962, almost one year to the day before President Kennedy himself was assassinated in Dallas. By September of the following year, all American missiles were gone from Turkey and Italy. The crisis was over.

I have found myself returning again and again to the details of that October in the last few weeks. Partially, that is simply because this month marks the fiftieth anniversary of the crisis. It’s odd to think that President Kennedy, if he were alive, would be ninety-five years old. (He was born the year after my father, but somehow he remains young in my mind, frozen in time at age forty-six.) Khrushchev, if he were alive, would be 118 years old. (It’s interesting to me that I completely missed the fact back then that Nikita Khrushchev was old enough to be President Kennedy’s father. I think I thought of them as contemporaries, as I think many of us did.) But it is not only the fact that a full half-century has passed since I was taught to protect myself from an aggressive Soviet attack against our nation by hiding under my desk in Mrs. Drayson’s classroom that brings the events of that October to mind. Even more than that sense of time past, it is the relationship of those events to the current efforts of Iran to acquire nuclear weapons that is bringing me back again and again to the topic.

After the U-2 pilot returned to base with incontrovertible photographic evidence that the Soviets were installing missiles in Cuba that could hit targets in the continental United States with nuclear warheads, the president convened the group of advisors that would henceforth be known as the Executive Committee of the National Security Council and charged them with elaborating the various courses of action that our country could take to respond to the situation in Cuba.

They responded with a list of six possibilities, some slightly overlapping but each constituting a specific path forward. The U.S. could launch a full-scale invasion of Cuba with the specific goal of overthrowing the Castro government and replacing it with a democratically elected one. The U.S. could bring the full force of its Air Force to bear and aim solely at destroying the missile sites, without our country undertaking to meddle in Cuban internal politics. Alternatively, the U.S. could do nothing at all, a non-course of action that could reasonably be justified by noting that the Soviets already had the capability of launching innumerable nuclear missiles at the U.S. from their own territory. A fourth possible course of action, similar to the third, would have been to do nothing but also to threaten mighty reprisals should the Cubans not dismantle the missile bases on their own. A fifth would have been to focus solely on diplomatic efforts to convince the Soviets to withdraw their missiles from Cuba. And finally, there was a sixth course of action that the president’s advisors suggested and that the president ultimately accepted: to use the U.S. Navy to set up a blockade that would prevent Soviet ships from arriving in Cuban ports.

Each of these paths forward was fraught with danger. Each, other than the option to do nothing at all, involved some level of risk that the United States could have ended up at war with the Soviet Union. Would that war have involved the use of nuclear weaponry? It surely could have! Would such a war necessarily have involved aggressive action aimed at the territory of each country? The sinking of the USS Maine in Havana harbor on February 15, 1898, led directly to the Spanish-American War, after all, and that war was fought neither on American nor Spanish soil! (Most of the fighting took place in Cuba, Guam, Puerto Rico, and the Philippines.) And yet, for all the danger involved in doing something, as opposed to doing nothing, the results were favorable. The president chose neither the path most likely to lead to a war between the United States and the Soviet Union—that would surely have been the decision to opt for an all-out invasion of Cuba—nor the path that could not possibly have led to one. Instead, he chose a middle-ground approach, making it clear that the U.S. would not tolerate nuclear missiles less than 100 miles from its shores but also allowing the Soviets the time and the space to back off gracefully, without appearing to cave under the president’s threats and, at least formally, without losing face.

How could these lessons be applied to the situation in Iran today? The same range of options exists today, after all: to do nothing, to threaten war, to mount a blockade, to launch a unilateral all-out attack, or to attempt to resolve the dispute diplomatically. Diplomacy, clearly every sane person’s first choice, seems totally to have failed. An all-out war, unilaterally undertaken and ruthlessly pursued until Iran’s nuclear capability is ground to dust, is every sane person’s last choice. And so, not unlike President Kennedy in his day, we have opted for a blockade or, to use the modern term, a set of sanctions intended so severely to cripple the economy of Iran that its leaders will, in the end, have no choice but to do whatever it takes to bring those sanctions to an end. Are those sanctions working? At least a little bit, they clearly are. The rial, Iran’s currency, has dropped dramatically in value. The sense that things will only get worse seems finally to have dawned on the Iranians, both the citizenry and the leadership. Will they respond by doing whatever it takes to convince the world to end the sanctions? I suppose we’ll all find out soon enough. But I remain convinced that the Soviets backed down neither because U Thant asked them to nor because President Kennedy demanded that they do so, but because the Soviet leadership finally accepted that not backing down could well have led to a war that they would almost certainly have lost. They wouldn’t and couldn’t ever have said that in public. But I believe that the key motivating factor behind their decision to capitulate was their sense that doing so served their own best interests far more than sticking to their guns, literally, ever could have.

Will the world remain committed firmly enough to the sanctions currently squeezing the Iranians and their currency to bring the leaders of Iran around to the conclusion that abandoning their nuclear pretensions is the decision that best serves, not Israel’s interests or the interests of the United States, but their own interests?  I do not see that happening unless the threat of war is as real to the leaders of Iran as it was to the members of the Politburo in 1962.  And that is why I encourage any of you who have not yet done so to sign the petition we have undertaken to complete by the last day of October and then send to the White House. The petition, which you can access easily by going to www.change.org and then typing “Shelter Rock” into the search box, encourages President Obama to stick to his guns, to honor his commitment to take nothing at all off the table until our honorable and just goal of preventing Iran from becoming a nuclear power capable of sharing its weaponry with any or all of the violent terrorist groups it openly supports is attained, and to know how crucial and vital we consider this issue to be…for ourselves,  for our country, for Israel, and for the world.

To defer speaking out until after the presidential election is over would be, I believe, a mistake: this is not an issue tied to the specific individual sitting in the Oval Office but to our country’s resolve to prevent a catastrophe from taking place.

Friday, October 5, 2012

Tweeting the First Amendment


Like all of you, I’m sure—and also like more than seventy million other Americans—I watched the presidential debate Wednesday night with the greatest interest. (That 70 million figure is up from the 52.7 million who watched the first Obama-McCain debate in 2008, which figure was down from the 62.5 million who watched the first presidential debate in 2004…and even further down from the amazing 80 million plus who watched the single Ronald Reagan/Jimmy Carter debate in 1980. But no matter how you slice it, that’s a lot of people watching two people talk to, but mostly at, each other. By unsettling comparison, 111.5 million Americans watched the Super Bowl last February.) But unlike most Americans, I did not watch the debate on television. Instead, I watched on the internet, which provided an edge to the experience that those watching on television regrettably were unable to share.

On the specific internet sites I was watching—and I kept flipping around from site to site specifically to highlight the experience I want to tell you about—the debate was featured more or less exactly as on TV, but with the addition of an ongoing feed of tweeted comments directly from viewers’ Twitter accounts also visible on the screen. For readers who don’t tweet, Twitter is a micro-blogging site that also serves its clients as a kind of social networking service. The basic idea is simple enough—you open an account, people sign up to follow your tweets, and then you communicate to all those people whatever you wish…as long as your communication—your tweet—does not exceed the 140-character limit. Does it sound appealing to be part of a world that communicates in 140-character mini-messages? Apparently it does to a lot of people—there are 500 million active users out there who generate 340 million daily tweets. Almost more astoundingly, Twitter itself fields 1.6 billion search requests every single day. Mostly, of course, the tweeted comments concern no one other than the tweeter and his or her tweeted-to minions. But Wednesday night an arrangement was set in place to allow individuals to tweet their comments on the debaters’ points directly to the sites streaming the debates. And that seemed remarkable, at least to me, not for the content of the tweets that appeared on my screen, which ranged from occasionally insightful to fully dopey (with “nowhere-near-as-clever-as-the-tweeter-him-or-herself-thought” somewhere on the closer-to-dopey side of that spectrum), but for the reality of the phenomenon itself. I myself do not have a Twitter account. But for just the briefest of moments on Wednesday evening, I came as close as I ever have to wishing that I did. (Fortunately, the feeling passed by morning.)
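
For readers who like to see such things concretely, the one hard technical rule governing the whole enterprise is that 140-character ceiling, and here is a minimal sketch, in Python, of what enforcing such a rule amounts to. The helper name compose_tweet is my own invention for illustration only; nothing here calls Twitter’s actual service.

    # Illustrative only: a hypothetical check of the 140-character limit
    # described above. No real Twitter API is involved.
    TWEET_LIMIT = 140  # the per-message cap as it stood in 2012

    def compose_tweet(text):
        """Return the text if it fits within the limit; refuse it otherwise."""
        if len(text) > TWEET_LIMIT:
            raise ValueError(
                "Tweet is %d characters; the limit is %d." % (len(text), TWEET_LIMIT)
            )
        return text

    print(compose_tweet("Watching the debate online, tweets scrolling alongside the candidates."))

That is the entire constraint; everything else about the service is social rather than technical.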

Here were two men, one the most powerful political office-holder in the world and the other a serious contender for the job of most powerful political office-holder in the world, debating details that could conceivably win or lose the election for either of them. And there, secreted away in a thousand thousand living rooms, were the American people—or an interesting cross section of them—commenting in real time, or in almost real time, on what they were hearing, not to each other, much less to their pets or their plants, but to millions upon millions of their co-citizens. (Given the truly remarkable lack of vulgar or offensive comments on my screen, I assume there must have been some filtering mechanism in place to safeguard the dignity of the larger undertaking. But the comments that got through certainly seemed to represent every conceivable point of view and didn’t strike me as having been censored at all with respect to their political orientation or candidate preference.) In some dark parallel universe, the officers of a malign FBI would have spent Thursday browbeating the staff at Twitter into revealing the identities of the tweeters who dared comment negatively about the president or, in a slightly alternate version of that alternate universe, Governor Romney. But I don’t believe that I am being naïve in assuming that no such activity took place at all. Americans take it for granted, as well we should be able to, that comments injected by individuals into a debate between the candidates vying to be our supreme political leader and the commander-in-chief of our armed forces are entirely protected as free speech of precisely the type, unambiguously guaranteed by the First Amendment, in which all may freely engage without fear of reprisal. But perhaps we take that freedom, as we do most others of our most basic freedoms, just a bit too freely for granted.

Let’s compare the scene from Wednesday night with, say, Thailand, where Daranee Chanchoengsilpakul was sentenced last year to eighteen years in prison for merely “intending to insult” King Bhumibol and Queen Sirikit at a political protest during the course of which she was invited to give a brief speech. (As shocking as that might seem, however, what should we say of the case of a different Thai citizen, Chiranuch Premchaiporn, who was facing a possible fifty years in prison for allowing the website she manages to publish an article that contained a few sentences deemed insulting to the royal family? She was found guilty in February and sentenced to eight months in prison. Her sentence, fortunately, was suspended and she was allowed to leave the court, if not quite free, then at least unincarcerated. But what shocks is the potential penalty she was facing, not her actual sentence.) Nor is Thailand alone in its draconian approach to free speech when it comes to the nation’s leadership. It is against the law in Greece, for example, to insult the president of that country. Merely disparaging the president of Italy is illegal under section 278 of Italy’s criminal code. It is against the law in Morocco and Jordan to insult the kings of those countries. Even in ultra-liberal Norway, defaming the king is punishable with up to five years in prison. In fact, many countries have lèse majesté laws that prohibit people from speaking even discourteously, let alone with real hostility or disdain, about that country’s head of state. All these laws seem to be rooted in the conviction that it is reasonable, even just, to curtail the citizenry’s right to express itself freely when the specific way it is being curtailed has to do with preventing unfriendly chatter about the nation’s leaders.

In these United States, free speech also has certain widely accepted limits that the courts have imposed on the simple concept as presented in the Bill of Rights. Speech that incites others to violence or to the commission of crimes is not protected by law in our country. Nor is threatening language that is deemed likely to provoke physical violence. Nor are certain varieties of pornography, although (as we all know) a precise, easy-to-apply definition of pornography itself continues to elude our nation’s jurists. There are other exceptions as well, mostly easy to justify and widely accepted, at least by most, as reasonable. But more to the point is that all nine justices of the Supreme Court voted unanimously in 1997, in a case known as Reno v. American Civil Liberties Union, to strike down the anti-indecency provisions of the Communications Decency Act passed by Congress in 1996, thereby extending First Amendment rights to the Internet. (There has been a lot of legal wrangling between Congress and the courts since then, most notably with respect to the Child Online Protection Act, passed by Congress in 1998 but repeatedly struck down by the courts as at least partially unconstitutional. The final sense of how the law should be applied, as far as I can understand it, is simply that the same rules that govern freedom of expression in traditional contexts such as oral speech and printed materials are deemed also to extend to material “published” on the Internet, even though the word “published” in that sense obviously means something quite different than it did in the pre-cyber age.)

What seems interesting to me is that so many countries with commitments to freedom of speech seem to find it reasonable, even natural, to limit their citizens’ right to speak out freely specifically when it comes to speaking poorly of that nation’s head of state. To American ears, that rings, to say the very least, oddly. As well it should! The freedom to criticize our leadership, even (or rather, especially) harshly and bitingly, is at the core of what it means—or should mean—to be the free citizen of a free country. Indeed, embedded in the idea of unfettered freedom to criticize our leaders and potential leaders sharply is the ancillary notion that it is our leaders who serve us, and not we, the people, them.

As long as we can keep that specific part of things alive in our country—the right of ordinary citizens to comment as acerbically as they wish on the demeanor or remarks of the country’s top leadership or would-be leadership—I think we will have managed to hold onto the essence of free speech in an ever-evolving world. And to call our world “ever-evolving” is really to say the very least: Twitter, with its current half a billion account holders, was only created in 2006. Before that, if my readers can remember back that far, tweets were solely what Tweety Bird emitted when he was running away from Sylvester. So we’ve clearly come a long way in a short time. (Do they even make Tweety Bird cartoons anymore?) But that specific aspect of the experience I had Wednesday evening was why I found myself looking past the content of the debate itself—which I had been anticipating with such enthusiasm but which, for some reason, ended up striking me as unexpectedly dull—and taking great pleasure in the tweets that kept popping up on my screen. This, it struck me while watching, is what makes America a true democracy: not the right of ordinary citizens to run for office, or not just that, but even more profoundly the right of ordinary citizens with nothing to lose openly to mock our leaders and would-be leaders, laughing derisively at their missteps, calling them sharply to task for making any statements deemed even marginally misleading, and generally feeling entirely unobliged to hold their tongues even when it comes to criticizing people vying for the right to command the combined armed forces of the world’s most powerful nation. If you ask me, that is the freedom that makes this country truly great.