A few months
ago, I wrote to you all about how odd—and not at all in a good way—it seems to
me that we have elevated honesty to the level of a desirable asset that draws support
to candidates for election rather than treating personal probity as one of the
bedrock virtues that we as a nation simply expect of anyone at all who would
vie for public office. Thomas Jefferson once famously wrote that honesty was,
in his opinion, “the first chapter in the book of wisdom.” But since we also
don’t particularly expect our candidates to be wise, only savvy, his point
will probably not strike the electorate as all that compelling. This phenomenon
is upsetting to acknowledge but not that hard to explain: the reason we are
prepared to support candidates specifically because they strike us as
honest, rather than simply rejecting as unworthy anyone proven to lack personal
integrity, is that we’ve also become strangely inured to the phenomenon of
lying in society in general. And given the degree to which political campaigns
seem to embody the best and worst of societal attitudes towards most things, it
was probably only to be expected that the presidential campaign—the mother of
all political campaigns in our country—should again and again prove my point
that we no longer demand truthfulness of all office-seekers but instead
prefer to admire it when it unexpectedly surfaces in one specific candidate.
All that
being the case, I was particularly interested in a study published the other
week in a journal intended solely for serious scientists, one that nonetheless
received coverage in the general press and came to my attention in that way.
The study, by Neil Garrett, Stephanie C. Lazzaro, Dan Ariely, and Tali Sharot in
Nature Neuroscience, is entitled “The Brain Adapts to Dishonesty” and
describes the results of a fascinating experiment undertaken by the authors of
the article, in which the brain (and specifically the amygdala, the area of the
brain associated with emotional response to outside stimuli) can be shown to
become slowly but verifiably inured to the telling of untruths…and in which the
diminished response to the telling of self-serving lies paves the way for the
individual in question to tell increasingly
brazen lies as the naturally negative response to deceit erodes in the course
of time. (I actually found it slightly heartening to learn that the human brain
is naturally predisposed—if that’s the right word—to respond negatively to falsehood.
Who would have thought that?) Regrettably, the actual article is not all
that reader-friendly because it was obviously written by scientists for
other scientists, but you can check it out online anyway by clicking here.
Erica Goode’s re-presentation of the material for “regular” readers without
advanced degrees in brain science, published in the New York Times just this
week, is, on the other hand, entirely accessible…and very interesting
and provocative. (You can read her article by clicking here.)
I recommend taking a look at both pieces, then focusing on the one that is by
far the more accessible.
In a sense,
the researchers’ discovery only confirms what most of us sensed anyway to be the case: the
more we misbehave in some specific way—including with respect to the telling of
lies—the easier it becomes to repeat the sin without feeling overwhelmed by
remorse. Millennia ago, the editor of
the Mishnah recorded the great sage Ben Azzai’s wise comment on human nature to
the effect that just as the performance of one good deed (and the satisfaction
behaving well brings in its wake) leads naturally to the desire to do more good
in the world, so too does one sin often lead to another as sinners become inured
to their own poor behavior and find it increasingly easy to justify with each
subsequent iniquitous misdeed. Ben Azzai’s remark is semi-famous, but my own
favorite iteration of that same thought comes from the Talmud, where we find the
mordant comment of Rav Huna, the third-century master of the academy of Sura in
modern-day Iraq, to the effect that “once an individual commits a sin and then
repeats it, it becomes permitted to him.” By that, Rav Huna did not mean to
suggest that engaging in forbidden activity somehow makes the deed allowed—which
would be a patent absurdity—but rather that, as they are repeated again and
again, misdeeds gradually cease to feel wrong or forbidden at all, but rather
take on the feel of wholly permitted acts…to the individual doing them,
if not to the world around them.
With that
phenomenon, we are all surely familiar. You cross a specific line. You feel
briefly regretful, but then, because there is clearly nothing we human beings
enjoy more than mimicking ourselves, you find it, not harder, but just
that much easier to repeat the offense a second or third time. And then,
by the eighth or twentieth time ’round that particular block, you barely
register the concept of wrongdoing at all with respect to the deed in question
and just proceed without giving the matter even a second thought. So Rav
Huna really was right that the ability to distinguish forbidden from permitted
becomes corrupted in the mind of the habitual sinner.
In the Nature
Neuroscience study, people lied consistently when they perceived the lie to
be to their own advantage and believed they stood a good chance of not being
caught. Nothing too surprising there! But what was very surprising was the
discovery that the negative response in the amygdala decreased as the
scope of the lie increased. This suggests that the brain becomes
desensitized as the lies keep coming. In other words, the inner mechanism that
favors honesty and reacts negatively to deceit becomes degraded when the
boundary between falsehood and truth becomes consistently and repeatedly blurred.
To explain
that from a spiritual point of view, we really don’t need to go any further
than Rav Huna. But I can justify the results of the study without recourse to
religious psychology as well: since truths correspond to reality and untruths exist
solely in the realm of self-serving fantasy, it is hardly surprising that, when
given the choice between interpreting data in a way consistent with what the
brain perceives as reality and interpreting it in a fanciful way that the brain
perceives as flawed and inconsonant with how things really are in the world,
the brain naturally opts to favor reality and shun fantasy. What that says
about the human condition is encouraging. But the fact that we apparently also
have the ability to erode that aspect of our human condition through
habituation brought on by repetition is part of the equation as well.
Also of
interest is that, if I am reading the study correctly, the amygdala only
accommodated itself to self-serving lies, not to untrue statements that were
merely erroneous because the speaker did not know the correct answer to a
question. And that part is key, I think, because it makes this a question of
moral decision-making, not of mere perception.
The specific
experiment had to do with two groups of people: one could see a huge jar of
pennies and the other couldn’t, but the members of the second group were the
ones who had to report how many pennies were in the jar. Since they couldn’t
see the jar, they had to depend on the data received by the people in the first
group. But by manipulating the instructions (in effect, incentivizing lying by
the people in the first group in some cases and truth-telling in others) and by
varying the likelihood of being caught, the authors of the study could see how
the amygdala responds to truth telling and to lying, then see if the response
varies with the level of benefit the liar imagines might accrue to him or her
if the lie goes undetected. The brain does not respond to honest errors at all
because it takes them for truths. (That’s what an error is, after all: a
statement that is incorrect but which the speaker thinks to be correct.) But
when the brain understands that it is being asked to embrace a lie, it responds
negatively. For a while. Eventually, it gets used to it, and the personal
probity of the person in whose head that brain is housed degrades to the point
that, as per Rav Huna, the forbidden becomes permitted.
A while
back, I wrote to you about the phenomenon of politicians telling what appeared
to be pointless lies, which I defined as lies that do not appear to offer any
obvious gain to the person telling them. We tend to dismiss such instances as
mere misspeaking and I suppose many of those instances really are best demoted
to the level of “mere” mistakes. But we are talking about an entirely different
phenomenon here: the ability of the brain to adjust to the telling of lies, to
lose its outrage and thus also its ability to inspire the kind of shame that
naturally discourages future lying, and to accommodate the liar’s propensity to
lie by abandoning its natural tendency to wish for the inner self and the outer
world to exist in the same context of perceived reality.
In my
opinion, we have done ourselves a great disservice by demoting personal
integrity to the level of something we admire in candidates when we detect its
presence rather than something we demand be the sine qua non of everyone
who would run for public office. What elections are really about—both on the
national and local levels—is not supporting candidates based on the specific
positions they espouse, but rather determining the persons in whom we would be
acting most wisely to put our trust. The Nature Neuroscience article
is really about how wrong it would be to make that decision based solely on
outer demeanor or on the trappings, absent the content, of personal integrity.