Daniel Kahneman (1934–2024) was an Israeli-American psychologist. He was awarded the 2002 Nobel Memorial Prize in Economic Sciences for his work on prospect theory.


Notable Contributions

  • Prospect theory
  • Decision making
  • Cognitive biases
  • Behavioral economics


Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131.

Misconceptions of chance: People expect that a sequence of events generated by a random process will represent the essential characteristics of that process even when the sequence is short. In considering tosses of a coin for heads or tails, for example, people regard the sequence H-T-H-T-T-H to be more likely than the sequence H-H-H-T-T-T, which does not appear random, and also more likely than the sequence H-H-H-H-T-H, which does not represent the fairness of the coin. Thus, people expect that the essential characteristics of the process will be represented, not only globally in the entire sequence, but also locally in each of its parts. A locally representative sequence, however, deviates systematically from chance expectation: it contains too many alternations and too few runs. Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not “corrected” as a chance process unfolds, they are merely diluted.
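The equal likelihood of those sequences is easy to verify: under a fair coin, every specific length-6 sequence has probability (1/2)^6 = 1/64, regardless of how "random" it looks. A minimal sketch in Python (the helper names are mine, not Kahneman's) that also counts runs, the quantity the intuition gets wrong:

```python
from itertools import groupby

def sequence_probability(seq: str, p_heads: float = 0.5) -> float:
    """Probability of one specific H/T sequence from a coin with P(H) = p_heads."""
    return (p_heads ** seq.count("H")) * ((1 - p_heads) ** seq.count("T"))

def run_count(seq: str) -> int:
    """Number of runs: maximal blocks of identical consecutive outcomes."""
    return sum(1 for _ in groupby(seq))

# All three sequences from the excerpt have identical probability (1/64);
# they differ only in run structure, which drives the intuitive judgment.
for seq in ["HTHTTH", "HHHTTT", "HHHHTH"]:
    print(seq, sequence_probability(seq), run_count(seq))
```

The "representative-looking" sequence H-T-H-T-T-H has 5 runs, while H-H-H-T-T-T has only 2; both are exactly as likely as any other specific sequence.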

Misconceptions of chance are not limited to naive subjects. A study of the statistical intuitions of experienced research psychologists revealed a lingering belief in what may be called the “law of small numbers,” according to which even small samples are highly representative of the populations from which they are drawn. The responses of these investigators reflected the expectation that a valid hypothesis about a population will be represented by a statistically significant result in a sample — with little regard for its size. As a consequence, the researchers put too much faith in the results of small samples and grossly overestimated the replicability of such results. In the actual conduct of research, this bias leads to the selection of samples of inadequate size and to overinterpretation of findings.
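The replicability overestimation can be quantified with statistical power. As an illustrative sketch (the test, effect size, and sample size are my assumptions, not the study's): for a medium true effect (Cohen's d = 0.5) and 10 subjects per group, a two-sample z-test with known unit variance detects the effect only about 20% of the time, so an exact replication of a "significant" finding will itself fail far more often than not:

```python
import math
import random

def estimate_power(true_d=0.5, n=10, trials=20_000, seed=42):
    """Monte Carlo power of a two-sided two-sample z-test (known sd = 1).

    Power is also the probability that an exact replication of a
    significant small-sample result comes out significant again.
    """
    rng = random.Random(seed)
    z_crit = 1.96                 # two-sided test at alpha = 0.05
    se = math.sqrt(2 / n)         # standard error of the mean difference
    hits = 0
    for _ in range(trials):
        treat = [rng.gauss(true_d, 1) for _ in range(n)]
        control = [rng.gauss(0, 1) for _ in range(n)]
        diff = sum(treat) / n - sum(control) / n
        hits += abs(diff / se) > z_crit
    return hits / trials

print(estimate_power())  # roughly 0.2 under these assumptions
```

The analytic value agrees: the mean difference has z-score 0.5 / sqrt(2/10) ≈ 1.12, giving power ≈ P(Z > 1.96 − 1.12) ≈ 0.20, far below the near-certain replication that intuition suggests.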


Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

We prepared a survey that included realistic scenarios of statistical issues that arise in research. Amos (Tversky) collected the responses of a group of expert participants in a meeting of the Society of Mathematical Psychology, including the authors of two statistical textbooks. As expected, we found that our expert colleagues, like us, greatly exaggerated the likelihood that the original result of an experiment would be successfully replicated even with a small sample. They also gave very poor advice to a fictitious graduate student about the number of observations she needed to collect. Even statisticians were not good intuitive statisticians.

Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality. Our article challenged both assumptions without discussing them directly. We documented systematic errors in the thinking of normal people, and we traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion.

Kahneman, D. (2010). TED Talk [Video].