Simple Cognitive Errors
Simple cognitive errors can have disastrous consequences unless
you know how to watch out for them. We like to think of ourselves as pretty
rational, but that's hardly how we seem from the perspective of accident
investigators and search-and-rescue crews.
People who deal with the aftermath of human error can tell you all too
well that otherwise normal, healthy individuals are exceptionally predisposed
to making the kind of mistake best described as boneheaded. Interestingly, research into this kind of self-defeating
behavior shows that it is usually far from random: when we make mistakes, we
tend to make them in ways that cluster under a few categories of screw-up. There is a method to our mindlessness. Most of the time, we are on autopilot,
relying on habit and time-saving rules of thumb known as heuristics. For the most part, these rules work just
fine, and when they don't, the penalty is nothing worse than a scraped knee or
a bruised ego. But when the stakes are higher, when a career is in jeopardy or
a life is on the line, they can lead us into mental traps from which there is
no escape. One slipup leads to another, and to another, in an ever-worsening
spiral. The pressure ratchets up, and our ability to make sound decisions
withers.
These cognitive errors are most dangerous in a potentially lethal
environment like the wilderness or the cockpit of an aircraft, but versions of
them can crop up in everyday life, too, such as when making decisions about
what to eat, whom to date, or how to invest.
The best defense is to know they exist.
When you recognize yourself starting to glide into one of these mind
traps, stop, take a breath, and turn on your rational brain. We fall victim to a simple but insidious
cognitive error that I call 'redlining'.
Anytime we plan a mission that requires us to set a safety parameter,
there's a risk that in the heat of the moment we'll be tempted to overstep it.
Divers see an interesting wreck or coral formation just beyond the maximum
limit of their dive tables. Airplane pilots descend through clouds to their
minimum safe altitude, fail to see the runway, and decide to go just a little
bit lower. It's easy to think: I'll just
cross the redline a little bit. What difference will it make? The problem is that
once we do, there are no more cues reminding us that we're heading in the wrong
direction. A little bit becomes a little bit more, and at some point it becomes
too much. Nothing's calling you back to the safe side.
A related phenomenon has been dubbed the "what-the-hell
effect," which can occur when dieters try to control their impulses by
setting hard-and-fast daily limits on their eating, a kind of nutritional
redline. One day, they slip up, eat a sundae, and boom—they're over the line.
Now they are in no-man's-land, so they figure they might as well blow the diet
completely and binge. The best
response to crossing the redline is to recognize what you have done, stop, and
calmly steer yourself back toward the right side. Focus on the outcome: what matters in
the long term, not what happens on any individual day.
The domino effect stems from a deep-seated need
to help others. Altruism offers an evolutionary advantage but can compel us to
throw our lives away for little purpose. In stressful situations, you see a
failure of working memory, which is involved in inhibiting impulses. People
lose the ability to think about the long-term consequences of their
actions. Pausing for a moment,
taking a deep breath, and even stepping back sometimes allows you to see the situation
in a different light, to realize that your efforts might be better spent running
to get help. In the heat of the moment, that alternative is often
not even considered. Something similar
unfolds in some romantic relationships, when partners, perhaps unwittingly,
enable or get sucked into their partner's addictions or narcissism. You end up
doing things for the other person even though it is not in your own best
interest or even in the interest of the relationship. The only way you can save
yourself is to get the hell out.
As GPS units and satellite navigation apps have flourished over
the past few years, there has been a spate of cases in which travelers follow
their devices blindly and wind up getting badly lost. In each case, the
underlying mistake is not merely technological but perceptual. It is the failure to remain aware of one's
environment, what aviation psychologists call situational awareness. People have always had difficulties
maintaining situational awareness, but the proliferation of electronics and our
blind faith that they will keep us safe have led to an epidemic of
absentmindedness. A big element in
situational awareness is paying attention to cues. If you're focused on
the GPS unit, watching the little icon move down the road and telling
yourself, OK, I know where I am, that can be a big problem,
because you are not looking at the world passing by your windshield. Full situational awareness requires
incorporating outside information into a model of your environment, and using
that model to predict how the situation might change. If all you are doing is
following the line of the GPS, and it turns out to be wrong, you'll be
completely clueless about what to do next.
In daily life, we rely on what is called social situational awareness to
navigate our way through the human maze. When you miss social cues, an
embarrassing faux pas can occur. Using swear words is completely fine in some
settings. In others, it is not. As a stranger in a crowd, you'll have to pay
attention to figure out what is appropriate.
Another of these mind traps is our irrational assessment of risks and
rewards. We tend to avoid risk when contemplating potential gains but seek risk
to avoid losses. For instance, if you offer people a choice between a certain
loss of $1,000 and a 50-50 chance of losing $2,500, the majority will opt for
the riskier option, to avoid a definite financial hit. Casinos make a good
profit from our propensity for risk-seeking behavior. Gamblers wind up in a hole,
and then instinctively take bigger and bigger risks in an attempt to recoup the
losses. Most go in hoping for the best, but to a veteran in the field of
applied psychology, it's a foregone conclusion.
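The arithmetic behind that choice is worth making explicit. Here is a minimal sketch in Python using the dollar figures from the example above; the expected-value calculation is standard probability, not anything stated in the original study:

```python
# Loss-framed choice from the example above:
#   Option A: a certain loss of $1,000.
#   Option B: a 50-50 gamble between losing $2,500 and losing nothing.

certain_loss = -1000.0

# (outcome in dollars, probability) pairs for the gamble
gamble = [(-2500.0, 0.5), (0.0, 0.5)]

# Expected value: sum of outcome * probability over all outcomes
gamble_ev = sum(outcome * p for outcome, p in gamble)

print(f"Certain loss:         {certain_loss:.2f}")  # -1000.00
print(f"Gamble expected loss: {gamble_ev:.2f}")     # -1250.00

# On average the gamble costs $250 more than the sure loss,
# yet most people choose it: faced with a certain loss, we become risk-seeking.
```

The point of the sketch is that the majority preference is irrational in expected-value terms; people accept a worse average outcome for the chance of avoiding a sure loss.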
Our minds recoil from uncertainty; we are wired to find order in
randomness. We look at clouds and see sheep. This can be a useful trait when it
comes to making decisions, since we're helpless without a theory that makes
sense of our quandary. Unfortunately, once we form a theory, we tend to see
everything through its lens. It is hard to let go of a fixed belief. A
consequence is that when people get lost in the back country, they can convince
themselves that they know exactly where they are, a problem known in the
search-and-rescue community as bending the map.
Such errors of overconfidence are due to a phenomenon psychologists
call confirmation bias. When trying to solve or troubleshoot a
problem, we get fixated on a specific option or hypothesis and ignore
contradictory evidence and other information that could help us make a better
decision.
A vast collective error of confirmation bias unfolded in the past
decade as investors, analysts, and financial advisers all managed to convince
themselves that legions of financial derivatives based on subprime mortgages
were all fundamentally sound. There was plenty of evidence to the contrary, and
many commentators pointed out the facts. But the money was so good that too
many found it easier to believe. They kept convincing themselves right up until
the roof caved in.
How can you avoid confirmation bias? You can employ some of the
same strategies for sidestepping other mind traps. Take yourself off autopilot.
Become aware of your environment. Make a habit of skepticism, including
skepticism toward your own assumptions and gut feelings. Don't use your
intuition to convince yourself that things are going right; use it to alert you
to potential problems. Listen to
those niggling doubts.