The availability bias says this: we create a picture of the world using the examples that most easily come to mind, and we base our decisions on that immediate perception of things, even when hard evidence points to different probabilities.
This is absurd, of course, because in reality things don’t happen more frequently just because we can conceive of them more easily.
A lot of decision making, at the personal level, at the team level, and apparently even at the company level, goes wrong because of this.
Say you're deciding which car to buy. The data shows a certain model is reliable, but a friend mentions he knows somebody who had a bad experience with it. Chances are we'll weigh that single anecdote more heavily than the data.
Doctors often fall victim to the availability bias. They have their favourite treatments, which they use for all possible cases. More appropriate treatments may exist, but these are in the recesses of the doctors’ minds. Consequently they practice what they know. Consultants too. If they come across an entirely new case, they do not throw up their hands and sigh: ‘I really don’t know what to tell you.’ Instead they turn to one of their more familiar methods, whether or not it is ideal.
Thanks to the availability bias, we travel through life with an incorrect risk map in our heads. Thus, we systematically overestimate the risk of being the victim of an accident or a crime, and we underestimate the risk of dying from less spectacular means. Anything silent or invisible we downgrade in our minds. We think dramatically, not quantitatively.
If something is repeated often enough, it gets stored at the forefront of our minds. It doesn’t even have to be true.
Fend it off by spending time with people who think differently from you, people whose experiences and expertise are different from yours. It always helps to have that other perspective in the room to get to better decisions. And also listen. You get to decide, but still... listen :)
Nice post. I read something similar in a recent McKinsey Quarterly article, "Overcoming obstacles to effective scenario planning": http://www.mckinsey.com/insights/strategy/overcoming_obstacles_to_effective_scenario_planning
Thanks Kiran, for liking it and for the report :)