There’s an operating assumption that groups make better-informed decisions than individuals—that the diversity of the group inevitably leads to a broader view in which the whole is greater than the sum of its parts. While that makes sense on paper, research suggests that groups can be just as susceptible to developing blind spots as individuals—and may be even less likely to learn from past mistakes.
In the fourth of a series of articles on how human behavior can affect investment decisions, we examine the interaction of individual biases and group dynamics and how they can lead to poor group decisions. We also introduce techniques to ensure that the best thinking of the group gets on the table, maximizing the odds of making informed decisions.
Groupthink: a history lesson
Dr. Irving Janis, a clinical psychologist at Yale University, coined the term groupthink while examining the decision-making that led to the Bay of Pigs fiasco during the early years of President John F. Kennedy’s administration. The goal of that mission was to overthrow Cuba’s revolutionary leader Fidel Castro using a force of approximately 1,000 Cuban exiles landing in the bay by boat under the cover of darkness. The CIA had given the mission a high probability of success, and Kennedy’s inner cabinet members, eager to depose Castro, quickly agreed to give the green light. In the end, just about everything that could go wrong did, and the mission failed within 24 hours, with the exiles either captured or killed by Castro’s forces.
What went wrong?
Despite input from the CIA, the U.S. Department of Defense, and Kennedy’s inner cabinet, Dr. Janis found that the group quickly coalesced around a single course of action—profoundly overestimating its odds of success. There was little discussion of what could go wrong or what failure would mean. In the end, the group made a common behavioral error: confusing a plausible outcome with a probable one.
Avoiding groupthink requires recognizing individual biases first and group dynamics second. Two of the most important individual biases (or tendencies) are belief bias and confirmation bias.
Belief bias is the tendency to hold views that aren’t necessarily supported by facts. An example might be that GM makes better trucks than Ford. What’s most interesting is that these beliefs are often long held and passed along by others rather than based on actual experience.
Confirmation bias is our tendency to protect these beliefs by seeking out information that supports our viewpoint and minimizing (or avoiding) information that contradicts it. In the case above, a common tendency would be to read a positive review of GM trucks and avoid a negative one. The net effect is that we’re often not as open-minded as we think.
Group settings can amplify these individual tendencies through behavioral biases of their own. Two, in particular, are the race to consensus and social conforming. The first—the race to consensus—simply reflects that groups like to be efficient and not waste time or resources in endless deliberation, however complex the matter at hand. The second—social conforming—is the pressure that groups can put on those who don’t quickly fall in line with the consensus point of view, with their hesitancy to conform often viewed as resistant, derailing behavior.
The role of experts and senior leaders
Groups may also be deferential to those who know more about the subject (i.e., experts) as well as to those more senior to them. One study on the influence of experts used MRI technology to monitor the brain activity of two groups as they deliberated and made financial decisions.¹ One group had an expert among its ranks and the other didn’t. Researchers found that group members who had the benefit of an expert showed significantly less brain activity in regions responsible for probability-weighing functions than members of the other group. In other words, they had a strong tendency to simply defer to the expert. A forceful point of view from a senior leader within the group can have the same effect. Taken together, it’s easy to see how groups can make decisions that are less than the sum of their parts.
Structuring the group meeting to avoid getting blindsided
Kennedy learned quickly from the Bay of Pigs fiasco and restructured his cabinet meetings to ensure the broadest possible perspective and a more thorough vetting of options. This had an immediate payoff when the Soviet Union installed missiles in Cuba shortly afterward. Many in the Defense Department advocated an immediate bombing of the missile sites, but Kennedy instead opted for a naval blockade of the island, which proved to be the right course of action—averting a potential nuclear war between the United States and the Soviet Union.
As stewards of other people’s capital, investment advisors have a fiduciary responsibility to make objective, fully informed decisions that are in the best interests of their clients—free from bias and conflicts of interest. One way to pursue this is by structuring investment policy committee meetings to address the known tendencies that can lead to poor decision-making. This includes subjecting any proposed course of action, such as lowering credit exposure across portfolios, to a rigorous evaluation by:
- Decorrelating the opinions of group members through the use of secret ballots
- Presenting a bull, bear, and base case for the proposed course of action
- Appointing a group member to play devil’s advocate
- Documenting the deliberations, including the rationale for the final decision
- Identifying the timeframe and circumstances to reevaluate the decision
The investment consulting group at John Hancock Investment Management is here to help and offers a range of services, from formal model reviews to manager selection to the investment decision process. For more information on structuring your team’s investment policy meeting, contact us today.
1 "Expert Financial Advice Neurobiologically 'Offloads' Financial Decision-Making Under Risk," Jan B. Engelmann et al., 3/24/09.