At the beginning of every meeting, a question hangs in the
air: Who will be heard? The answer has huge implications not only for decision
making, but for the levels of diversity and inclusion throughout the
organization. Being heard is a matter of whose ideas get included — and who,
therefore, reaps the accompanying career benefits — and whose ideas get left
behind.
Yet instead of relying on subject matter experts, people
often pay closest attention to the person who talks most frequently, or has the
most impressive title, or comes from the CEO’s hometown. And that’s because of
how our brains are built.
The group decision-making process, rather than aligning with
actual competence, habitually falls for “messy proxies of expertise,” a phrase
coined by University of Utah management professor Bryan Bonner. Essentially,
when our brains are left to their own devices, attention is drawn to shortcuts,
such as turning focus to the loudest or tallest person in the room. Over time,
letting false expertise run the show can have negative side effects…
One of the most important assets a group can have is the
expertise of its members. But research indicates that even when everyone within
a group recognizes who the subject matter expert is, they defer to that member
just 62 percent of the time; when they don’t, they listen to the most
extroverted person. Another experiment found that “airtime” — the amount of
time people spend talking — is a stronger indicator of perceived influence than
actual expertise. Our brains also form subtle preferences for people we have
met over ones we haven’t, and assume people who are good at one thing are also
good at other, unrelated things. These biases inevitably end up excluding
people and their ideas.
People are not naturally skilled at figuring out who they
should be listening to. But by combining organizational and social psychology
with neuroscience, we can get a clearer picture of why we’re so habitually and
mistakenly deferential, and then understand how we can work to prevent that
from happening.
The brain uses shortcuts to manage the vast amounts of
information that it processes every minute in any given social situation. These
shortcuts allow our nonconscious brain to deal with sorting the large volume of
data while freeing up capacity in our conscious brain for dealing with whatever
cognitive decision making is at hand. This process serves us well in many
circumstances, such as having the reflex to, say, duck when someone throws a
bottle at our head. But it can be harmful in other circumstances, such as when
shortcuts lead us to fall for false expertise.
At a cognitive level, the biases that lead us to believe
false expertise are similarity (“People like me are better than people who
aren’t like me”); experience (“My perceptions of the world must be accurate”);
and expedience (“If it feels right, it must be true”). These shortcuts cause us
to evaluate people on the basis of proxies such as height, extroversion, and
gender, which say nothing about actual expertise, rather than on more
meaningful criteria.
The behavioral account of this pattern was first captured by
breakthrough research from Daniel Kahneman and the late Amos Tversky, which
eventually led to a Nobel Prize in Economic Sciences for Kahneman and to his
bestseller Thinking, Fast and Slow. Their distinction between so-called System
1 thinking, a “hot” form of cognition involving instinct, quick reactions, and
automatic responses, and System 2 “cool” thinking, or careful reflection and
analysis, is very important here. System 1 thinking can be seen as a sort of
autopilot. It’s helpful in certain situations involving obvious,
straightforward decisions — such as the ducking-the-bottle example. But in more
complicated decision-making contexts, it can cause more harm than good — for
instance, by allowing the person with the highest rank in the meeting to decide
the best way forward, rather than the person with the best idea…
Set up “if-then” plans. To guide attention back from these
proxies of expertise, you can formulate “if-then” plans, which help the
anterior cingulate cortex — a brain region that allows us to detect errors and
flag conflicting information — find differences between our actual behavior and
our preferred behavior. By incorporating this type of bias-mitigation plan
before we enter into a situation where we know a decision will be made, we
increase our chances of making optimal decisions.
For example, you can say to yourself: “If I catch myself
agreeing with everything a dominant, charismatic person is saying in a meeting,
then I will privately ask a third person (not the presenter or the loudest
person) to repeat the information, shortly after the meeting, to see if I still
agree.”
Get explicit, and get it in writing. One fairly easy
intervention is to instruct employees to get in the habit of laying out, in
writing, the precise steps that led to a given decision being made. You can
also write out the process for your own decision making…
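If your team already tracks work in a shared tool, that written trail can be as
lightweight as a small structured entry. The sketch below is a hypothetical
illustration in Python (the record and its field names are assumptions made for
the sake of example, not a format the article prescribes); it captures the
options considered, the evidence behind each, whose expertise was consulted,
and why the chosen path won.

from dataclasses import dataclass

@dataclass
class DecisionRecord:
    """A minimal written trail of how a decision was actually reached."""
    question: str              # what was being decided
    options_considered: list   # alternatives that were on the table
    evidence: dict             # option -> the facts or data supporting it
    expert_consulted: str      # whose subject matter expertise was used
    chosen_option: str
    rationale: str             # why this option beat the others

record = DecisionRecord(
    question="Which vendor handles payroll next year?",
    options_considered=["Vendor A", "Vendor B"],
    evidence={"Vendor A": "lower cost", "Vendor B": "stronger compliance record"},
    expert_consulted="payroll operations lead",
    chosen_option="Vendor B",
    rationale="Compliance risk outweighs the cost difference.",
)
print(record.rationale)

Reviewing a trail like this later makes it easier to see whether the decision
followed the evidence or simply followed the loudest voice in the room.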
Incentivize awareness. Along those same lines, managers
should reward employees who detect flaws in their thinking and correct course.
At the NeuroLeadership Institute, we have a “mistake of the month” section in
our monthly work-in-progress meetings to help model and celebrate this kind of
admission.
To use a sports example, New England Patriots quarterback
Tom Brady reportedly pays his defense if they can intercept his passes in
practice. (It must help. He’s one of two players in NFL history to win five
Super Bowls.) The takeaway: By making error detection a team sport, you
destigmatize the situation, highlight the learning opportunities, and increase
the likelihood of making better decisions in the future.
Set up buffers. Taking your decision making from “hot” to
“cool” often requires a conscious commitment to create a buffer between when
you receive information and when you make a decision on how to move forward…
Cut the cues. The most common and research-backed approach
involves giving hirers access to fewer of the sorts of cues that can trigger
expedience biases. Blind selection is a classic example. In the 1970s and
1980s, top orchestras instituted a blind selection process in which the
identity of applicants was concealed from the hiring committee, often by
literally hiding the player behind a screen while he or she performed. As a result,
the share of female musicians in the top five U.S. symphony orchestras rose
from 5 percent in 1970 to more than 25 percent in 1996.
Bonner, the Utah management professor, says to “take the humanity
out” when you can. “Set up situations where people exchange information with as
little noise as possible,” he says. If you’re brainstorming, have everyone write
down their ideas on index cards or on shared documents, then review the ideas
anonymously — that way the strength of the idea, rather than the status of the
source, will be the most powerful thing…
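For teams that already brainstorm in shared documents, the same principle is
easy to automate. The sketch below is purely illustrative (it assumes the ideas
were collected as author-and-idea pairs, which is not a format the article
specifies): it strips the names and shuffles the order so reviewers see only
the ideas themselves.

import random

def anonymize_ideas(submissions):
    """Drop author names and shuffle order so ideas are judged on merit alone.

    `submissions` is assumed to be a list of (author, idea) pairs collected
    from index cards or a shared document.
    """
    ideas = [idea for _author, idea in submissions]
    random.shuffle(ideas)  # also removes ordering cues, e.g. who spoke first
    return ideas

submissions = [
    ("Dana (VP)", "Pilot the feature with two customers before a full launch"),
    ("Lee (intern)", "Drop the feature and fix onboarding instead"),
]
for n, idea in enumerate(anonymize_ideas(submissions), start=1):
    print(f"Idea {n}: {idea}")

One reasonable way to use a list like this is to have the group rank the
anonymous ideas first, and only then reattach the names.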
Biases are human — a function of our brains — and falling
for them doesn’t make us malicious. We have the capacity to nudge ourselves
toward more rational thinking, to identify and correct the errors we make as a
result of bias, and to build institutions that promote good, clear thinking and
decision making. With the right systems, tools, and awareness in place, we can
better cultivate the best ideas from the most well-suited minds. It just takes
a bit of effort, and in the long run pays off in big ways. The best ideas get a
chance to be heard — and implemented — and your best thinkers are recognized
and keep on thinking.
https://www.strategy-business.com/article/Why-Our-Brains-Fall-for-False-Expertise-and-How-to-Stop-It