Human Decision-Making in a Big Data World
Organizations are looking to integrate Big Data and advanced analytics into their business operations so they can become more analytics-driven in their decision-making. However, several challenges must be addressed to make that transformation successful. One of those challenges is the very nature of how humans make decisions, and how our genetic makeup works against us when we analyze data and make decisions.
Human Decision-Making Dilemma
The human brain is a poor decision-making tool. Human decision-making capabilities evolved from millions of years of survival on the savanna. Humans became very good at pattern recognition: from “That looks like just a harmless log behind that patch of grass,” to “Yum, that looks like an antelope!” to “YIKES, that’s actually a saber-toothed tiger!!” Necessity dictated that we become very good at recognizing patterns and making quick, instinctive survival decisions based upon those patterns.
Unfortunately, humans are lousy number crunchers (guess we didn’t need to crunch many numbers to spot that saber-toothed tiger). Consequently, humans have learned to rely upon heuristics, gut feel, rules of thumb, anecdotal information, and intuition as our decision guides. But these decision tricks are inherently flawed and fail us in a world of very large, widely varied, high-velocity data sources.
Figure 1: Dilbert by Scott Adams
Awareness of these human decision-making flaws is important if we want to transform our organization, and our people, to become an analytics-driven business.
Human Decision-Making Traps
Let’s cover a few examples, or decision traps, where the human brain will lead us to suboptimal, incorrect, or even fatal decisions.
Trap: Overconfidence
We put a great deal of weight on whatever we happen to know, and assume that what we don’t know isn’t important. The casinos of Las Vegas were built on this human flaw (which is why my son likes to say that “gambling is a tax on those who are bad at math”).
For example, hedge fund Long-Term Capital Management (LTCM), with two Nobel Prize winners on staff, returned ~40% per year from 1994 to 1998. Soon other traders copied their techniques. So LTCM looked for new markets where others might not be able to mimic them. LTCM made the fatal assumption that these new markets operated in the same way as the old markets. In 1998, the LTCM portfolio dropped from $100B to $0.6B in value and a consortium of investment banks had to take LTCM over to avoid a market crash.
Companies make similar mistakes by over-valuing their experience in an existing market when they move into a new market (e.g., AT&T with computers), or launching a new product into a different product category (e.g., Procter & Gamble with orange juice). Companies don’t do enough research and analysis to identify and model the business drivers, and the competitive and market risks of moving into a new market or product category.
Trap: Anchoring Bias
Anchoring is the subtle human tendency to glom onto one fact as a reference point for decisions, even though that reference point may have no logical relevance to the decision at hand. During normal decision-making, individuals anchor, or overly rely, on specific information and then adjust insufficiently from that value to account for other elements of the circumstance. Once the anchor is set, there is a bias toward that information.
For example, humans struggle to decide when to sell a stock. If someone buys a stock at $20 and then sees it rise to $80, they have a hard time selling the stock when it starts to drop because they’ve been anchored by the $80 price. This was a fairly common occurrence during the dot-com bust, as people whose low-cost stock options rose to unimaginable highs then rode their stock options (chased the tape) all the way to zero because they had set their anchor point at the high.
This anchoring bias tends to show up in organizations’ pricing, investment, and acquisition decisions.
Trap: Risk Aversion
Our tolerance for risk is highly inconsistent. Risk aversion is a manifestation of people’s general preference for certainty over uncertainty, and for minimizing the magnitude of the worst possible outcomes to which they are exposed. Risk aversion surfaces in the reluctance of a person to accept a bargain with an uncertain payoff rather than another bargain with a more certain, but possibly lower, expected payoff.
For example, a risk-averse investor might choose to put his or her money into a bank account with a low but guaranteed interest rate, rather than into a stock that may have high expected returns but also involves a chance of losing value.
Another example is the reluctance of a business to cannibalize an incumbent product, even an aging or fading incumbent product, in favor of an up-and-coming product.
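The tension between a sure thing and a gamble with a higher expected payoff can be made concrete. The following is a minimal sketch, with entirely hypothetical figures, of how a concave utility function makes a risk-averse investor rationally prefer the guaranteed bank account even though the stock has the higher expected value:

```python
import math

# A minimal sketch (all figures are hypothetical) of why a risk-averse
# investor can rationally prefer a guaranteed payoff: with a concave
# utility function, expected *utility* and expected *value* can disagree.

def expected_value(gamble):
    """Gamble as a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in gamble)

def expected_utility(gamble, utility):
    return sum(p * utility(payoff) for p, payoff in gamble)

utility = math.log  # concave: each extra dollar is worth a little less

bank_account = [(1.0, 10_200)]                 # $10,000 at a guaranteed 2%
stock        = [(0.5, 16_000), (0.5, 6_000)]   # risky, higher expected value

ev_safe, ev_stock = expected_value(bank_account), expected_value(stock)
eu_safe, eu_stock = (expected_utility(bank_account, utility),
                     expected_utility(stock, utility))

# ev_stock ($11,000) beats ev_safe ($10,200), yet eu_safe > eu_stock:
# the risk-averse investor takes the bank account.
```

Swapping in a less concave utility function (a less risk-averse investor) flips the preference, which is exactly why tolerance for risk varies so much from person to person.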
Trap: Don’t Understand Sunk Costs
Many companies throw good money after bad because they don’t comprehend the concept of “sunk costs.” In economics, sunk costs are retrospective (past) costs that have already been incurred and cannot be recovered. Sunk costs are sometimes contrasted with prospective costs, which are future costs that may be incurred or changed if an action is taken. Because they cannot be recovered, sunk costs should be ignored when making going-forward decisions.
As an example, people will sit through a bad movie until the end even though they are not enjoying it. Why? Because they paid for the ticket. But the price of the ticket is a sunk cost; it is gone whether they stay or leave, so it should have no bearing on the decision.
As business examples, Coca Cola (with New Coke) and IBM (with OS/2) continued to throw good money at bad investment decisions because they had invested significant time and money (and emotional capital) into those products and wanted to try to recoup their investments, even at the cost of missing more lucrative business opportunities. We see this today with on-going marketing campaign spend, brand rationalization decisions, and decisions to exit poorly performing markets.
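The sunk-cost logic can be reduced to a few lines of arithmetic. This is a minimal sketch with invented figures for a hypothetical struggling product; note that the money already spent appears nowhere in the comparison:

```python
# A minimal sketch with invented figures: choosing between continuing a
# struggling product and exiting. The $5M already spent is a sunk cost,
# so it deliberately appears nowhere in the go-forward comparison.

sunk_development_cost = 5_000_000  # already spent and unrecoverable

options = {
    # continue: spend $2M more for an expected $2.5M in future revenue
    "continue": {"incremental_cost": 2_000_000, "expected_revenue": 2_500_000},
    # exit: no new spend; expected value of redeploying the team elsewhere
    "exit":     {"incremental_cost": 0,         "expected_revenue": 1_500_000},
}

def go_forward_value(option):
    # Only new, incremental cash flows count; sunk costs are ignored.
    return option["expected_revenue"] - option["incremental_cost"]

best = max(options, key=lambda name: go_forward_value(options[name]))
# "exit" wins ($1,500,000 vs $500,000), no matter how much was sunk.
```

The emotional pull is to “earn back” the $5M, but including it in the model changes nothing about which choice creates more value going forward.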
Trap: Framing Effect
How a decision is stated or framed can impact what decision is made. Information, when presented in different formats, alters people’s decisions. Individuals have a tendency to make inconsistent choices, depending on whether the question is framed to concentrate on losses or gains.
As an example, participants were offered two alternative solutions for 600 people affected by a hypothetical deadly disease:
- Option A saves 200 people’s lives
- Option B has a one-third chance of saving all 600 people and a two-thirds chance of saving no one
Both options have the same expected value of 200 lives saved, but option B is risky. 72% of participants chose option A, whereas only 28% chose option B.
However, another group of participants were offered the same scenario with the same statistics, but described differently:
- If option C is taken, then 400 people die
- If option D is taken, then there is a one-third chance that no one will die and a two-thirds chance that all 600 will die
In this group, 78% of participants chose option D (equivalent to option B), whereas only 22% of participants chose option C (equivalent to option A).
The discrepancy in choice between these parallel options is the framing effect; the two groups favored different options because the options were expressed employing different language. In the first problem, a positive frame emphasizes lives gained; in the second, a negative frame emphasizes lives lost.
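The arithmetic behind the scenario above can be checked directly. Here is a minimal sketch expressing all four options as gambles over lives saved, using the exact one-third / two-thirds probabilities:

```python
# The four options from the disease scenario above, expressed as gambles
# over lives saved (lists of (probability, lives_saved) pairs).

options = {
    "A": [(1.0, 200)],            # "saves 200 people" for certain
    "B": [(1/3, 600), (2/3, 0)],  # gain frame: chance of saving everyone
    "C": [(1.0, 200)],            # "400 people die" = 200 of 600 saved
    "D": [(1/3, 600), (2/3, 0)],  # loss frame: the same gamble as B
}

def expected_lives_saved(gamble):
    return sum(p * lives for p, lives in gamble)

values = {name: expected_lives_saved(g) for name, g in options.items()}
# Every option works out to 200 expected lives saved; only the wording
# (gains vs losses) differs, yet the popular choice flips between frames.
```

Since all four expected values are identical, any systematic difference in what people choose is attributable purely to the frame.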
Other human decision-making traps include Herding (Safety in Numbers), Mental Accounting, Reluctance to Admit Mistakes (Revisionist History), Confusing Luck with Skill, Bias to the Relative, Not Respecting Randomness, and Over-emphasizing the Dramatic.
What Can One Do?
The key is to guide, not stifle, human intuition (think guard rails, not railroad tracks). Here are some things that you can do to guide your decision-making as you make the transformation to an analytics-driven organization:
- Use analytic models to help decision makers understand and quantify the decision risks and returns. Leverage proven statistical tools and techniques to improve the understanding of probabilities. Employ a structured analytic discipline that captures and weighs both the risks and opportunities.
- Confirm and then reconfirm that you are using the appropriate metrics (think Moneyball). Just because a particular metric has always been the appropriate metric, don’t assume that it is the right one for this particular decision.
- Challenge your model’s assumptions. Test the vulnerability of the model and the model’s assumptions using Sensitivity Analysis and Monte Carlo techniques. For example, challenging the assumption that housing prices would never decline would have averted the recent mortgage market meltdown.
- Consult a wide variety of opinions when you vet a model. Avoid Group Think (which is yet another decision-making flaw). Have someone play the contrarian (think Tom Hanks in the movie “Big”). Use facilitation techniques in the decision process to ensure that all voices are heard and all views are contemplated.
- Be careful how you frame decisions.
- Create business models that properly treat sunk costs. Ensure that the model and analysis only considers new incremental costs. Ensure that your models include opportunity costs.
- Use “after the decision” Review Boards and formal debriefs to capture what worked and what didn’t, and why.
- Beware of counter-intuitive compensation; humans are revenue-optimization machines.
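The Sensitivity Analysis and Monte Carlo advice above can be sketched in a few lines. This is a minimal illustration with invented figures: instead of trusting a single point forecast for a business case, sample the uncertain drivers and look at the whole distribution of outcomes, including the downside tail:

```python
import random

# A minimal Monte Carlo sketch with invented figures: rather than a single
# point forecast, sample the uncertain drivers of a business case and
# inspect the distribution of outcomes, including the downside tail.

random.seed(42)  # fixed seed for reproducible runs

def simulate_profit():
    units = random.gauss(10_000, 2_000)  # uncertain demand
    price = random.gauss(50, 5)          # uncertain realized price
    cost  = random.gauss(35, 3)          # uncertain unit cost
    return units * (price - cost)

trials = sorted(simulate_profit() for _ in range(10_000))

mean_profit   = sum(trials) / len(trials)
worst_5pct    = trials[len(trials) // 20]                 # 5th-percentile outcome
loss_fraction = sum(t < 0 for t in trials) / len(trials)  # chance of a loss
# The point forecast (10,000 units * $15 margin = $150,000) hides both the
# spread of outcomes and the small-but-real chance of an outright loss.
```

Re-running the simulation with wider spreads on any one driver is a crude sensitivity analysis: it shows which assumption the outcome is most vulnerable to, which is exactly the question a point estimate cannot answer.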
Figure 2: Dilbert by Scott Adams
Making the transformation to an analytics-driven culture is a powerful business enabler, but more than technology needs to be considered in driving that transformation. Understanding, managing, and educating on common decision-making traps will help ensure a successful transformation.
By the way, don’t forget to register for my upcoming webcast Transform Your BI and Data Warehouse for Big Data!
Note: Groupthink is a psychological phenomenon in which the desire for harmony within a decision-making group overrides a realistic appraisal of alternatives. The Enron scandal and the Bay of Pigs invasion are two such examples.