PASADENA, Calif.-You probably know it as Monday-morning quarterbacking or 20/20 hindsight: failures often look obvious and predictable after the fact-whether it's an interception thrown by a quarterback under pressure, a surgeon's mistake, a slow response to a natural disaster, or friendly fire in the fog of war.

In legal settings, this tendency to underestimate the challenges faced by someone else-called hindsight bias-can lead to unfair judgments, punishing people who made an honest, unavoidable mistake.

"Hindsight bias is fueled by the fact that you weren't there-you didn't see the fog and confusion," says Colin Camerer, the Robert Kirby Professor of Behavioral Economics at the California Institute of Technology (Caltech). Furthermore, hindsight bias exists even if you were there. The bias is strong enough to alter your own memories, giving you an inflated sense that you saw the result coming.

"We know a lot about the nature of these types of judgmental biases," he says. "But in the past, they weren't understood well enough to prevent them."

In a new study, recently published online in the journal Psychological Science, a team led by Camerer and Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology, not only found a way to predict the severity of the bias, but also identified a technique that successfully reduces it-a strategy that could help produce fairer assessments in situations such as medical malpractice suits and reviewing police or military actions.

Hindsight bias likely stems from the fact that when given new information, the brain tends to file away the old data and ignore it, Camerer explains. Once we know the outcome of a decision or event, we can't easily retrieve those old files, so we can't accurately evaluate something after the fact.

The wide-ranging influence of hindsight bias has been observed in many previous studies, but research into the underlying mechanisms is difficult because these kinds of judgment are complex. But by using experimental techniques from behavioral economics and visual psychophysics-the study of how visual stimuli affect perception-the Caltech researchers say they were able to probe more deeply into how hindsight emerges during decision making.

In the study, the researchers gave volunteers a basic visual task: to look for humans in blurry pictures. The visual system is among the most heavily studied parts of the brain, and researchers have developed many techniques and tools to understand it. In particular, the Caltech experiment used eye-tracking methods to monitor where the subjects were looking as they evaluated the photos, giving the researchers a window into the subjects' thought processes.

Subjects were divided into those who would do the task-the "performers"-and those who would judge the performers after the fact-the "evaluators." The performers saw a series of blurry photos and were told to guess which ones had humans in them. The evaluators' job was to estimate how many performers guessed correctly for each picture.

To examine hindsight bias, some evaluators were shown clear versions of the photos before they saw the blurry photos-a situation analogous to how a jury in a medical malpractice case would already know the correct diagnosis before seeing the X-ray evidence. The experiment found clear hindsight bias. Evaluators who had been primed by a clear photo greatly overestimated the percentage of people who would correctly identify the human. In other words, because the evaluators already knew the answer, they thought the task was easier than it really was. Furthermore, the measurements were similar to those from the first study of hindsight bias in 1975, which examined how people evaluated the probabilities of various geopolitical events before and after President Nixon's trip to China and the USSR.
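The comparison described above can be sketched in a few lines of code. This is an illustrative Python sketch only, with invented numbers rather than the study's data: hindsight bias shows up as the gap between evaluators' estimates of performer accuracy and the performers' actual accuracy, a gap that widens for evaluators primed with the clear photos.

```python
# Illustrative sketch of the bias measure; all numbers below are invented,
# not data from the Caltech study.

# Fraction of performers who actually spotted the human in each blurry photo.
actual_accuracy = [0.30, 0.55, 0.20, 0.70, 0.45]

# Evaluators' estimates of that fraction: one group saw only the blurry
# photos ("naive"), the other saw the clear photos first ("primed").
naive_estimates = [0.35, 0.50, 0.25, 0.65, 0.50]
primed_estimates = [0.60, 0.80, 0.55, 0.90, 0.75]

def mean_bias(estimates, actual):
    """Average overestimation of performer accuracy across photos."""
    return sum(e - a for e, a in zip(estimates, actual)) / len(actual)

naive_bias = mean_bias(naive_estimates, actual_accuracy)
primed_bias = mean_bias(primed_estimates, actual_accuracy)

# Hindsight bias: knowing the answer makes the task look easier, so the
# primed group's overestimate is much larger than the naive group's.
print(f"naive bias:  {naive_bias:+.2f}")
print(f"primed bias: {primed_bias:+.2f}")
```

In this toy example the naive evaluators are nearly calibrated, while the primed evaluators overestimate accuracy by a wide margin, which is the signature of hindsight bias the experiment measured.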