A 12-Question Framework for Better Decision-Making
Created by Daniel Kahneman, Dan Lovallo and Olivier Sibony
Do you sometimes wonder why, even with a strong understanding of your function, industry, and role, you still end up making the wrong decision?
We, as humans, are heavily influenced by our innate biases. These biases operate outside our rational, logical thinking and play a fundamental role in our decision-making process.
Let’s first understand what bias means. Bias is a disproportionate weight in favor of or against an idea or thing. Biases come in many types, and as leaders, they affect us in many ways.
I recently came across a framework created by Daniel Kahneman, Dan Lovallo and Olivier Sibony in HBR that can help leaders neutralize biases in their teams’ thinking. These questions help leaders examine whether a team has explored alternatives appropriately, gathered all the right information, and used well-grounded numbers to support its case.
Preliminary Questions - Ask Yourself
1. Check for Self-Interested Biases: Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
What to do: Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic: Has the team fallen in love with its proposal?
What to do: Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink: Were there dissenting opinions within the team? Were they explored adequately?
What to do: Solicit dissenting views, discreetly if necessary.

Challenge Questions - Ask the Recommenders
4. Check for Saliency Bias: Could the diagnosis be overly influenced by an analogy to a memorable success?
What to do: Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias: Are credible alternatives included along with the recommendation?
What to do: Request additional options.

6. Check for Availability Bias: If you had to make this decision again in a year's time, what information would you want, and can you get more of it now?
What to do: Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias: Do you know where the numbers came from? Can there be unsubstantiated numbers? Extrapolation from history? A motivation to use a certain anchor?
What to do: Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for the Halo Effect: Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
What to do: Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy and Endowment Effect: Are the recommenders overly attached to a history of past decisions?
What to do: Consider the issue as if you were a new CEO.

Evaluation Questions - Ask About the Proposal
10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, and Competitor Neglect: Is the base case overly optimistic?
What to do: Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect: Is the worst case bad enough?
What to do: Have the team conduct a premortem: imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion: Is the recommending team overly cautious?
What to do: Realign incentives to share responsibility for the risk or to remove it.
If you want to practice reducing biases for better decision-making, set aside time to reflect on a few past decisions you have made and ask what you could have done differently.
Learn from those outcomes, and try using the questions above as a framework the next time you make a decision.