The availability heuristic is a phenomenon (which can result in a cognitive bias) in which people predict the frequency of an event, or a proportion within a population, based on how easily an example can be brought to mind.
Simply stated, where an anecdote (“I know an American guy who…”) is used to “prove” an entire proposition or to support a bias, the availability heuristic is in play.
In these instances, the ease of imagining an example, or the vividness and emotional impact of that example, becomes more credible than actual statistical probability. Because an example is easily brought to mind or mentally “available,” the single example is treated as representative of the whole rather than as just one data point in a range of data.
This phenomenon was first reported by psychologists Amos Tversky and Daniel Kahneman, who also identified the representativeness heuristic. To see how availability differs from the related concepts of salience and vividness, see availability, salience and vividness.
Essentially, the availability heuristic operates on the notion that “if you can think of it, it must be important.” Media coverage helps fuel this bias by giving widespread and extensive coverage to unusual events, such as airline accidents, and less coverage to more routine, less sensational events, such as car accidents. For example, when asked to rate the probability of various causes of death, people tend to rate more “newsworthy” events as more likely because they can more readily recall an example from memory. People often rate the chance of death by plane crash higher than the chance of death by car crash, and judge death by natural disaster as more probable than it is, simply because these unusual events are reported in the mass media more often than common causes of death. In actuality, car-accident deaths are far more common than plane-crash deaths, while more common causes of death, such as medical error, tend not to be widely reported. Other rare causes of death, such as shark attacks and lightning strikes, are likewise seen as more common than they really are because of their inherent drama.
- A person argues that cigarette smoking is not unhealthy because his grandfather smoked three packs of cigarettes a day and lived to be 100. The grandfather’s health could simply be an unusual case that does not speak to the health of smokers in general.
- A politician says that walnut farmers need a special farm subsidy. He points to a farmer standing nearby and explains how that farmer will benefit. Others who watch and discuss later agree that the subsidy is needed based on the benefit to that farmer. The farmer, however, might be the only person who will benefit from the subsidy. Walnut farmers in general may not necessarily need this subsidy.
- A person claims to a group of friends that drivers of red cars get more speeding tickets. The group agrees with the statement because a member of the group, “Jim,” drives a red car and frequently gets speeding tickets. The reality could be that Jim just drives fast and would get a speeding ticket regardless of the color of car that he drove. Even if statistics show fewer speeding tickets were given to red cars than to other colors of cars, Jim is an available example, which makes the statement seem more plausible.
- Someone is asked to estimate the proportion of words that begin with the letter “R” or “K” versus those words that have the letter “R” or “K” in the third position. Most English-speaking people could immediately think of many words that begin with the letters “R” (roar, rusty, ribald) or “K” (kangaroo, kitchen, kale), but it would take a more concentrated effort to think of any words where “R” or “K” is the third letter (street, care, borrow, acknowledge); the immediate answer would probably be that words that begin with “R” or “K” are more common. The reality is that words that have the letter “R” or “K” in the third position are more common. In fact, there are three times as many words that have the letter “K” in the third position.
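The letter-position example above is easy to check empirically rather than by intuition. The sketch below counts, for a given letter, how many words in a list have it first versus third. The word list here is a small hypothetical sample chosen for illustration; a real check would scan a full dictionary file, so the counts below say nothing about English at large.

```python
def position_counts(words, letter):
    """Count words with `letter` in the first vs. third position (case-insensitive)."""
    letter = letter.lower()
    first = sum(1 for w in words if len(w) >= 1 and w[0].lower() == letter)
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == letter)
    return first, third

# Hypothetical sample list, not a representative corpus.
sample = ["kangaroo", "kitchen", "kale", "acknowledge", "ask", "bike",
          "lake", "make", "take", "okay"]

first, third = position_counts(sample, "k")
print(f"'k' first: {first}, 'k' third: {third}")  # prints "'k' first: 3, 'k' third: 6"
```

Running the same function over a full word list (for example, `/usr/share/dict/words` on many Unix systems) is the straightforward way to test the claim that third-position “K” words outnumber initial-“K” words.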
One important corollary finding is that people who are asked to imagine an outcome tend to view it as more likely than people who were not asked to imagine it. If group A is asked to imagine a specific outcome and then asked whether it is likely, while group B is asked whether the same outcome is likely without first imagining it, the members of group A tend to rate the outcome as more likely than the members of group B, demonstrating the tendency to use the availability heuristic as a basis for judgment.
In one experiment conducted before the 1976 US presidential election, participants were simply asked to imagine Gerald Ford winning the upcoming election. Those who did so subsequently rated Ford as significantly more likely to win, and the reverse held for participants who had been asked to imagine Jimmy Carter winning. Analogous results were found in other experiments using vivid versus pallid descriptions of outcomes.
An opposite effect of this bias, called denial, occurs when an outcome is so upsetting that the very act of thinking about it leads to an increased refusal to believe it might occur. In this case, being asked to imagine the outcome actually made participants view it as less likely.
Denial? How about denying that your magic unicorn method won’t work?
Availability? How about thinking the Morse code is palindromes, or that K3 is meant as a literal application, even though these are just someone else’s ideas bandied about teh internets?
Even in the space of 20 total years, with each solver probably involved for only a small portion of that, we learn from other sites and other methods. We try to imagine the methods working, or the known solutions, and that predisposes us to them.
How many of us can say, “Oh yeah, I know a guy who totally tried that…” or “I thought I read online somewhere…”
I’m not pointing fingers and I’m not saying I’m immune. I’m just saying that it’s worth the time for some self-evaluation.