Photography credited to Arne Sattler
Behavioural Science is a rapidly expanding field, and every day new research is developed in academia, then tested and implemented by practitioners in financial organisations, development agencies, government ‘nudge’ units and more. This interview is part of a series with prominent people in the field. In today's interview, the answers are provided by Gerd Gigerenzer.
Gerd is a German psychologist, behavioural scientist, statistician, writer, and educator who has shaped our understanding of heuristics. Gigerenzer has mainly studied bounded rationality and the use of heuristics in decision-making. Together with Daniel Goldstein, he was the first to theorize the recognition heuristic and the take-the-best heuristic, providing evidence that a lack of familiarity with a topic can actually lead an individual to make more accurate inferences. Gigerenzer has been a strong critic of Daniel Kahneman and Amos Tversky’s work, arguing that heuristics should not lead us to believe that human thinking is biased. Instead, Gerd believes that we should think of rationality as an adaptive tool whose success is not measured by consistency with the rules of logic. Today, Gigerenzer is the director of the Harding Center for Risk Literacy at the University of Potsdam. He has also applied his ideas outside of academia as the founder of Simply Rational, an institute that investigates decision-making. Gigerenzer also holds a director emeritus position at the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin.
Who or what got you into behavioural science? I was a musician in my earlier career. I loved playing jazz and soul and so I financed my studies on the stage. When I was approaching the end of my graduate studies, I had to make a decision: Should I continue with my music career, which was the safe option, or try an academic career, which was the risky option? I took the challenge, the academic career. And that was the moment where I started to think about how we make important decisions when we cannot foresee the future, that is, under uncertainty. That is what brought me into psychology and, later, to the interaction between psychology and economics. But soon I realized that decisions under uncertainty were rarely if ever studied. How do we decide which job to take, whom to trust, where to invest? The stock-in-trade of academic experiments were situations where you already know all possible consequences, such as choices between monetary gambles and other tasks where everything is certain, including the probabilities. So my next decision was to study how people make decisions in the real world.
What is the accomplishment you are proudest of as a behavioural scientist? And is there still something that you would like to achieve? First, the achievements are not really my achievements. During my time as director at the Max Planck Institute for Human Development in Berlin, I worked with more than a hundred PhD students, postdocs and fellow researchers from a dozen disciplines. Everything in my mind has its roots in conversations with them over coffee and cake, and in writing articles together. Jointly, we discovered the “less-is-more” effect: the counterintuitive finding that in certain situations, using less information leads to better decisions. For instance, when predicting the spread of the flu, the recency heuristic, which uses only a single data point, can beat big data analytics. Google Flu Trends analyzed 50 million search terms and developed a secret algorithm to make these predictions, but a smart heuristic derived from human intelligence led to substantially better predictions. Under uncertainty – such as the fickle behavior of flu and corona viruses – less can be more. You won't find this effect if you restrict research to what Jimmy Savage called a small world, where everything is known and more is always better. Showing the less-is-more effect experimentally and proving analytically the conditions under which it arises, that’s definitely a result I’m proud of. Second, we have developed the study of the adaptive toolbox, that is, the repertoire of heuristics an individual or an organization has at its disposal. The use of formal models of heuristics is progress over the previous use of vague labels like availability or representativeness, or system 1 vs. system 2. By specifying heuristics as algorithmic models, we could show for the first time when they work and when they don't. Much of behavioral economics, however, still holds the mistaken belief that optimization is always possible and better, and that heuristics are second-best.
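The recency heuristic mentioned above can be sketched in a few lines. This is only a minimal illustration of the idea of forecasting from a single, most recent data point; the function name and the case counts are hypothetical, not data from the actual Google Flu Trends comparison.

```python
# Minimal sketch of the recency heuristic for prediction:
# forecast the next value as the most recent observation.
# Data and names are illustrative, not from the original studies.

def recency_heuristic(history):
    """Predict the next value as the last observed value."""
    return history[-1]

weekly_flu_cases = [120, 140, 180, 210]  # hypothetical weekly counts
prediction = recency_heuristic(weekly_flu_cases)
print(prediction)  # 210
```

The heuristic ignores all but one data point, which is exactly what makes it robust when the underlying process is unstable.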
Third, and probably most important, the concept of ecological rationality. A heuristic is ecologically rational to the degree it is adapted to its environment. That is the theoretical foundation of less-is-more effects, of why heuristics work, and a new understanding of rationality in general. It's about success in the real world, not about internal consistency, such as logic or Bayesian updating. The world in which we live is characterized by uncertainty, not calculable risk, and here, people rely on adaptive heuristics. That is not a bias. Under uncertainty, heuristics can be the best thing we can do.
Finally, the results of our work have been applied in quite a few fields. My colleagues and I have trained managers in using smart heuristics, I have trained US federal judges in risk literacy, such as understanding DNA evidence, and concepts such as natural frequencies are now being taught in high schools. The latter also help laypeople and doctors understand what a positive Covid-19 test or a positive HIV test means, that is, how likely it is that one is infected. Natural frequencies have become part of the terminology of evidence-based medicine and are used in courts. They boost people’s understanding of risk, while mathematically equivalent representations such as conditional probabilities tend to be confusing. People often think it's their fault that they don't understand statistics. No, it's the fault of the notation and the representation. Teaching heuristics for decision-making under uncertainty and teaching statistics for situations of risk is indispensable if we want to have smart citizens in a functioning democracy. What I would still like to achieve is to further improve the understanding of risk and uncertainty among physicians, judges, managers, and the general public. My colleague Ralph Hertwig and I call this program “boosting.” Many social scientists, however, are more interested in nudging the general public. Nudging is not about making people smart; it steers them toward choices that policy makers think they should make. Nudging has become a billion-dollar business, despite a 2022 meta-analysis by Maier et al. in PNAS entitled “No evidence for nudging after adjusting for publication bias,” which found no effect in health, food, finance, or any other domain. We can do better by boosting.
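Natural frequencies can be illustrated with a short calculation. The test statistics below are hypothetical, not real Covid-19 or HIV figures; the point is only that translating conditional probabilities into counts of people makes the answer to "I tested positive, how likely am I infected?" transparent.

```python
# Sketch of the natural-frequency representation of a positive test.
# All numbers are hypothetical, chosen for a round illustration.

population = 1000
prevalence = 0.01            # 10 of 1000 people are infected
sensitivity = 0.90           # of those 10, 9 test positive
false_positive_rate = 0.09   # of the 990 healthy, about 89 test positive

infected = population * prevalence                  # 10 people
true_positives = infected * sensitivity             # 9 people
healthy = population - infected                     # 990 people
false_positives = healthy * false_positive_rate     # 89.1 people

# Of everyone who tests positive, what fraction is actually infected?
p_infected_given_positive = true_positives / (true_positives + false_positives)
print(round(p_infected_given_positive, 2))  # 0.09
```

Stated as counts (9 true positives out of roughly 98 positives in total), the result is easy to see; stated as conditional probabilities, most people, including doctors, get it badly wrong.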
If you weren't a behavioural scientist, what would you be? If I had been risk averse when deciding between a career in music and academia, I would have continued my life as a musician on the stage. Besides that, I have always been curious about the amazing rules of thumb animals evolved to create their environment and maintain social relations. Thus, I might have ended up studying life in coral reefs – I love scuba diving.
How do you apply all of these insights in your personal life? One thing I've learned is to make decisions faster. One can't know all the consequences of each possible action in an uncertain world. And after thinking a bit about it, I already know whether something is for me or not. Research shows that the first option that comes to the mind of an expert is likely the best option. This is enabled by the fluency heuristic, documented in studies on firefighters, golf players and handball players. Novices, in contrast, perform better if they take time and think about it, which is called the speed-accuracy trade-off. But that trade-off is only true for novices, such as when undergraduates are tested on an experimental task they have never seen before. If you have expertise, decide quickly and do not waste time on endless deliberation.
With all your experience, what skills would you say are needed to be a behavioural scientist? Are there any recommendations you would make? The most important skill is critical thinking. That requires reading, reading, and reading. And discussions with people from different disciplines and cultures. Critical thinking is necessary, but not sufficient. You also need courage. To stand up for the best scientific evidence and for the values of science, even if others do not.
How do you think behavioural science will develop (in the next 10 years)? My expectation is that research will improve in three respects. First, it will develop from the current focus on models of decision-making under risk to decision-making under uncertainty. In models of risk, all possible actions and their future consequences are known, and nothing unexpected can ever happen. That is the world of the consistency axioms, subjective expected utility maximization, and Bayesian updating. In situations of uncertainty, where not all possible actions and their consequences are known, utility maximization is by definition impossible. Nevertheless, people make decisions about jobs, partners, investments, and other uncertain options. Future research will focus on the heuristic processes used to make these decisions. And the misleading association between heuristics and biases will be overcome.
Second, research will move away from constructing as-if models that predict only the outcome of a decision and do not model the actual decision process. Research will finally free itself from Milton Friedman’s as-if philosophy, in which the psychological realism of the process is of no relevance. Currently, the most prominent models in behavioural economics follow the as-if doctrine, such as prospect theory, hyperbolic discounting, and the Fehr-Schmidt model of social preferences. Instead, I expect that research will finally take psychology seriously and focus on modelling actual decision processes, as Herbert Simon argued long ago.
Finally, future research will analyze the interaction between mind and environment in order to understand how heuristics exploit environmental structures, as studied by the program of ecological rationality. Simon used the analogy of a pair of scissors to make this point, where mind and environment are the two blades: if one looks only at one blade, one will not understand why a pair of scissors cuts so well. Today, much of behavioural economic theory still looks mainly at the mind, using concepts such as loss aversion, risk aversion, or system 1. The consequence is that intelligent processes are mistaken for cognitive errors.
What advice would you give to someone who's looking to go into the field? First, read broadly. If you're interested in psychology, you need to read more than the heuristics-and-biases program. That's just one slice. Second, have the courage to build your own opinion based on what you read. Don’t run after hot topics; avoid becoming part of a herd mentality. And it’s important to read the important, original work: not interpretations of it, but the work itself. A last piece of advice: study decisions in the real world. To begin with, look at your own decisions and analyze them. You’ll probably notice that you rely on heuristics and do not calculate subjective expected utilities for all possible options. Then model these heuristics and study the conditions under which they work. When I made my decision to try for an academic career, I wasn't listing all the possible consequences because I couldn't possibly know them. I didn't use any probabilities because I couldn't estimate them. But watching myself make this important decision gave me the inspiration for a heuristic that we later studied in experiments and simulations. It's called the take-the-best heuristic. When you have two options, think about the most important reasons for your decision and rank them. Begin with the one you care about most, and if one option is better on that reason, take it. If not, repeat with the second most important reason. It worked for me. I picked academia because it was the greater challenge for me. I wanted to test my courage to do something new.
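The take-the-best heuristic described above can be sketched in code. The cues and career options here are hypothetical illustrations for the interview's example, not data from Gigerenzer's studies.

```python
# Minimal sketch of the take-the-best heuristic for a binary choice:
# compare options cue by cue, in order of importance, and decide on
# the first cue that discriminates. Cue values are illustrative.

def take_the_best(option_a, option_b, cues_ranked):
    """Return 'A' or 'B' based on the first discriminating cue,
    or 'tie' if no cue discriminates."""
    for cue in cues_ranked:
        a, b = option_a[cue], option_b[cue]
        if a != b:
            return "A" if a > b else "B"
    return "tie"

# Hypothetical career choice: 1 = option has the attribute, 0 = it does not.
academia = {"challenge": 1, "income_security": 0, "flexibility": 1}
music    = {"challenge": 0, "income_security": 1, "flexibility": 1}

# Rank cues by importance; here "challenge" matters most.
choice = take_the_best(academia, music,
                       ["challenge", "income_security", "flexibility"])
print(choice)  # A (academia wins on the most important cue)
```

Note that once the top cue discriminates, the remaining cues are never consulted: the heuristic is non-compensatory, which is what makes it fast and frugal.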
Which other behavioural scientists would you love to read an interview with? If he were still alive, Egon Brunswik, who was a brilliant ecological psychologist. I have learned much from reading his work; he understood behavior as a result of the interaction between mind and environment. Likewise, I have been inspired by Herbert Simon, who had a similar adaptive perspective. If you want to understand the limits of expected utility models, interview Daniel Friedman, Mark Isaac, Duncan James and Shyam Sunder, the authors of Risky Curves: On the Empirical Failure of Expected Utility. And there are Ulrike Hahn, Ralph Hertwig, Riccardo Rebonato, and Lola Lopes.
Thank you so much for taking the time to answer my questions, Gerd!
As I said before, this interview is part of a larger series which can also be found here on the blog. Make sure you don't miss any of those, nor any of the upcoming interviews!