Monday, November 22, 2010

Why Being Wrong Feels So Right (And What You Can Do About It) - Sarah Green - Harvard Business Review

What if you made a major mistake — and you didn't even notice? Even if it was right in front of your nose?

Chances are, it happens on a regular basis. That's what I took away from the recently concluded PopTech conference, whose theme this year was "Brilliant Accidents, Necessary Failures, and Improbable Breakthroughs."

Kevin Dunbar, a professor of psychology at the University of Toronto, illustrated our reflexive reaction to being wrong with brain scans that should make any would-be innovator turn cold. When the subject — in this case, a lab researcher — viewed an unexpected result, the scan showed a dime-sized area of activity in the dorsolateral prefrontal cortex. As this Wired profile of Dunbar explains, that's like the brain's "delete" key. Now, as any editor can tell you, a delete key is a wonderful gift: by cutting out the chaff (of prose, of data, of life) we can see the wheat that much more clearly. The brain's process of filtering is what helps us pay attention. But for a scientist — or anyone in the business of discovery — if you habitually mentally delete anomalous data, how can you learn from it?

(Ladies, the news is slightly better for you: Dunbar noticed a gender split in his research. Women were more likely than men to investigate unexpected findings, while men were more likely to assume they knew the reason for the unexpected result, and proceed without more analysis.)

And that's just the cases where the brain noticed something off. What if you didn't even see the anomaly in the first place — even if it was as glaring as a chest-thumping gorilla? On day three of the conference, we heard from Chris Chabris, one of the psychologists behind the now-famous "gorilla experiment." (If you'd like to try the experiment, stop reading now, and follow the link.) Subjects are told to watch a video of two teams — one wearing white shirts, the other black — passing basketballs, and to count the number of times the white team passes the ball. Towards the end of the video, a person in a gorilla suit walks through the middle of the teams, turns to face the camera, thumps their chest, and then walks off. In Chabris's experiments, about fifty percent of people don't see the gorilla at all. As Chabris and co-author Daniel Simons explain in their fascinating new book, The Invisible Gorilla, people are not generally pleased to find themselves duped, and easily switch from surprise to denial. "A man who was tested later by the producers of Dateline NBC for their report on this research said, 'I know that gorilla didn't come through there the first time.' Other subjects accused us of switching the tape while they weren't looking."

But before you judge them — especially if you saw the gorilla — think back to the last time you were wrong. How did being wrong feel? Was your reaction to deny it? Did you feel "idiotic and embarrassed," or did "your heart sink and your dander rise," as "wrongologist" Kathryn Schulz describes in the delightful Being Wrong? As Schulz pointed out in her PopTech talk, deflation and embarrassment are the emotions of realizing you are wrong. Being wrong itself feels exactly the same as being right. This is how, while camping, I once had an impassioned argument with a friend over whose pillow was whose. We were both utterly convinced, by the light of our Coleman lantern, that a certain pillow was ours. Of course, in the clear light of day, only one of us was right. But in the moment, even though it was completely trivial, we both thought the other person was insane. And, Schulz points out, we're terrible at admitting our own wrongness — even when it's something trivial. Like the subjects of Dunbar's research and Chabris's experiments, we delete the information, deny it, pass the blame to someone else, justify ourselves, or get defensive.

And here's the kicker: though we think of being wrong as aberrant or unusual, in truth we're wrong astonishingly often.

To illustrate this, Schulz pointed to Ulric Neisser's work on flashbulb memories — our memories of events like the Challenger disaster, the Kennedy assassination, D-Day, or 9/11. We tend to start our stories of these events with the words, "I remember exactly where I was when I heard..." But do we? The day after the Challenger explosion, Neisser asked a group of students to write down their memory of events. Three years later, he asked them to do so again. Fifty percent of these subsequent reports were more than two-thirds wrong. Twenty-five percent of the reports were completely, 100% wrong. And only seven percent were completely accurate. And, Schulz pointed out, while we'd all like to be in that seven percent, the odds are stacked heavily against us.

As a result of this influx of information, I've made two post-PopTech resolutions:

  1. Actively look for anomalies. We can't look inward for feelings of wrongness; as Schulz so convincingly illustrated, the "feeling" of being right is misleading. We've got to look outward. We need an external aid — the light of the sun, in my camping example above — to know when we're wrong. Dunbar assured us that the brain is not hardwired to disregard anomalous data — we can retrain our brains to notice the unexpected. We just need to look for it. After all, the key to Chabris's experiments is not that his subjects aren't paying attention — on the contrary, it's that they are so focused on the task at hand that they experience a kind of tunnel vision that allows them to screen out "irrelevant" data.
  2. Be gentle to each other — and to yourself. My second strategy, however, is an internal one. And that's to remember that we all have blind spots. The next time I'm convinced a politician is lying, or a friend is being willfully obtuse, or a colleague is half-daft — or that I myself must be an idiot — I'm going to pause, and remember: sometimes, we all miss the gorilla.

Sarah Green is an associate editor at HBR.
