Irrationality and the Future

Climate change is mostly about future impact; what we see and deal with right now are only the early warning signs of the process. The fact that the problem is global means that mitigating its impacts takes even more time than it otherwise would. Seventy percent of Earth is covered by oceans, which absorb a significant fraction of the additional heat and of the human-emitted greenhouse gases, so the effects of climate change will continue even after strong mitigation achievements such as a full global energy transition to non-carbon fuels. Many of the driving forces of climate change – such as deep ocean temperature – are slow processes that need a long time to equilibrate (the IPCC labels the warming that continues after the world stabilizes its concentration of greenhouse gases as “committed warming”). Action to stabilize atmospheric greenhouse gas concentrations needs to start now in order to affect temperatures decades and centuries from now.
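For readers who want to see how slow equilibration translates into committed warming, here is a minimal sketch in Python. It assumes a single heat reservoir (a stand-in for the deep ocean) relaxing exponentially toward equilibrium; the time constant, the equilibrium warming, and the current warming used below are illustrative assumptions, not IPCC numbers.

```python
# Minimal illustration of "committed warming": a single heat reservoir
# (a stand-in for the deep ocean) relaxes slowly toward a new equilibrium
# temperature even after greenhouse gas concentrations stop rising.
# All numbers below are illustrative assumptions, not IPCC values.
import math

TAU_YEARS = 200.0      # assumed relaxation time constant of the reservoir
T_EQUILIBRIUM = 2.0    # assumed equilibrium warming (deg C) at stabilized GHG levels
T_NOW = 1.2            # assumed warming (deg C) at the moment of stabilization

def warming(years_after_stabilization: float) -> float:
    """Warming (deg C) a given number of years after concentrations are held constant."""
    gap = T_EQUILIBRIUM - T_NOW
    return T_EQUILIBRIUM - gap * math.exp(-years_after_stabilization / TAU_YEARS)

for year in (0, 50, 100, 200, 400):
    print(f"{year:4d} years after stabilization: {warming(year):.2f} deg C")
```

Even in this toy model, most of the remaining warming arrives decades to centuries after concentrations stop rising; that is the committed part of the warming.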

Yet people hate to pay now for promised benefits down the line (think about education!). The term economists use is “discounting the future.” There are debates about the rate at which people discount the future, but there is broad consensus that the phenomenon exists. In a sense, our dislike of “pay now to prevent future losses” arrangements seems to contradict one of the main biases in human irrationality: loss aversion. But as long as the losses are predicted to materialize in the future, we would much rather not think about them at all, thank you!
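To put a number on “discounting the future,” here is a minimal sketch of the standard present-value calculation. The trillion-dollar damage figure and the discount rates are illustrative assumptions, not estimates from the economics literature.

```python
# How much is it "worth" today to avoid a loss that arrives decades from now?
# Present value = future value / (1 + r)**t, for discount rate r and horizon t years.
# The damage figure and the rates below are illustrative assumptions.

def present_value(future_loss: float, rate: float, years: int) -> float:
    """Value today of avoiding future_loss that would occur `years` from now."""
    return future_loss / (1.0 + rate) ** years

FUTURE_LOSS = 1.0e12   # hypothetical climate damage 50 years from now, in dollars
for rate in (0.01, 0.03, 0.07):
    pv = present_value(FUTURE_LOSS, rate, 50)
    print(f"discount rate {rate:.0%}: worth at most ${pv / 1e9:,.0f} billion today")
```

The higher the discount rate, the less a present-day decision-maker is willing to pay to avert the very same future loss, which is exactly why “pay now to prevent future losses” policies are such a hard sell.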

Tversky and Kahneman (see my previous two blogs) also noticed some of the flaws inherent in asking people to make predictions. Wikipedia summarizes some important aspects of their thinking:

Kahneman and Tversky[1][2] found that human judgment is generally optimistic due to overconfidence and insufficient consideration of distributional information about outcomes. Therefore, people tend to underestimate the costs, completion times, and risks of planned actions, whereas they tend to overestimate the benefits of those same actions. Such error is caused by actors taking an “inside view”, where focus is on the constituents of the specific planned action instead of on the actual outcomes of similar ventures that have already been completed.

Kahneman and Tversky concluded that disregard of distributional information, i.e. risk, is perhaps the major source of error in forecasting. On that basis they recommended that forecasters “should therefore make every effort to frame the forecasting problem so as to facilitate utilizing all the distributional information that is available”.[2]:416 Using distributional information from previous ventures similar to the one being forecast is called taking an “outside view”. Reference class forecasting is a method for taking an outside view on planned actions.

Reference class forecasting for a specific project involves the following three steps:

1. Identify a reference class of past, similar projects.

2. Establish a probability distribution for the selected reference class for the parameter that is being forecast.

3. Compare the specific project with the reference class distribution, in order to establish the most likely outcome for the specific project.
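As a rough illustration of these three steps, here is a minimal sketch in Python. The “reference class” of past cost overruns is invented for the example; real reference class forecasting draws on curated databases of comparable projects.

```python
# Reference class forecasting, step by step (illustrative data only).
import statistics

# Step 1: a reference class of past, similar projects (cost overruns, in percent).
reference_class = [5, 12, 18, 25, 30, 42, 55, 60, 75, 110]

# Step 2: establish the probability distribution of outcomes in that class.
median_overrun = statistics.median(reference_class)            # 50th percentile
p80_overrun = statistics.quantiles(reference_class, n=10)[7]   # ~80th percentile

# Step 3: place the specific project within that distribution (the "outside view"),
# instead of relying only on its own bottom-up estimate (the "inside view").
inside_view_budget = 100.0  # the planners' own estimate, in arbitrary units
print(f"Outside-view median forecast: {inside_view_budget * (1 + median_overrun / 100):.0f}")
print(f"Budget needed for roughly 80% confidence: {inside_view_budget * (1 + p80_overrun / 100):.0f}")
```

The point of the outside view is that the forecast comes from the distribution of what actually happened to similar projects, not from the planners’ optimistic picture of this particular one.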

Their argument about our innate optimism and natural disregard for risk (often termed the “planning fallacy”) closely resembles the “just world” hypothesis that I discussed in last week’s blog: we think of the world as operating in a rational manner in spite of abundant evidence that humans are not very rational creatures. Morgan Housel’s interview with Kahneman offers another window into his thinking on the topic:

Morgan Housel: When the evidence is so clear that the recent past is probably not going to predict the future, why is the tendency so strong to keep assuming that it will?

Dr. Kahneman: Well, people are inferring a sort of causal model from what they see, so when they see a certain pattern, they feel there is a force that is causing that pattern of behavior in the market to continue. And then, if the force is there, they expect the pattern to keep extending. This is really a very, very natural way to think. When you see a movement in one direction, you are extrapolating that movement. You are anticipating where things will go, and you are usually anticipating in a fairly linear fashion; this is the way that people think about the future generally.

This sort of thinking about future impact carries over into behavioral economics, which examines how the time horizon of a decision interacts with our biases.

The first example here is a paper by Benartzi and Thaler that introduces the concept of “myopic loss aversion” to explain the puzzle of why stocks have outperformed bonds by such a wide margin in long-term investments:

Myopic Loss Aversion and the Equity Premium Puzzle

Shlomo Benartzi, Richard H. Thaler

NBER Working Paper No. 4369
Issued in May 1993
NBER Program(s): AP

The equity premium puzzle, first documented by Mehra and Prescott, refers to the empirical fact that stocks have greatly outperformed bonds over the last century. As Mehra and Prescott point out, it appears difficult to explain the magnitude of the equity premium within the usual economics paradigm because the level of risk aversion necessary to justify such a large premium is implausibly large. We offer a new explanation based on Kahneman and Tversky’s ‘prospect theory’. The explanation has two components. First, investors are assumed to be ‘loss averse’ meaning they are distinctly more sensitive to losses than to gains. Second, investors are assumed to evaluate their portfolios frequently, even if they have long-term investment goals such as saving for retirement or managing a pension plan. We dub this combination ‘myopic loss aversion’. Using simulations we find that the size of the equity premium is consistent with the previously estimated parameters of prospect theory if investors evaluate their portfolios annually. That is, investors appear to choose portfolios as if they were operating with a time horizon of about one year. The same approach is then used to study the size effect. Preliminary results suggest that myopic loss aversion may also have some explanatory power for this anomaly

The key here is that frequent, short-term checking of the portfolio’s performance effectively turns a long-term investment into a series of short-term evaluations, so loss aversion, a short-term bias, ends up governing long-term decisions.
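A back-of-the-envelope simulation shows how evaluation frequency interacts with loss aversion. The sketch below is not Benartzi and Thaler’s actual model; the monthly return parameters and the loss-aversion coefficient of 2.25 (a commonly cited prospect-theory estimate) are assumptions chosen only to illustrate the mechanism.

```python
# Illustration of myopic loss aversion: the same stream of stock returns "feels"
# much worse to a loss-averse investor who evaluates it monthly rather than annually,
# because short horizons show losses far more often.
# Return parameters and the loss-aversion coefficient are illustrative assumptions.
import random

random.seed(1)
LAMBDA = 2.25  # losses are felt roughly 2.25 times as strongly as gains

def prospect_value(r: float) -> float:
    """Piecewise-linear prospect-theory value of a return r."""
    return r if r >= 0 else LAMBDA * r

def average_felt_value(horizon_months: int, n_evaluations: int = 100_000) -> float:
    """Average 'felt' value per evaluation, assuming ~0.6% mean monthly return
    with 4.5% monthly volatility (illustrative numbers)."""
    total = 0.0
    for _ in range(n_evaluations):
        r = sum(random.gauss(0.006, 0.045) for _ in range(horizon_months))
        total += prospect_value(r)
    return total / n_evaluations

print(f"Evaluated monthly : {average_felt_value(1):+.4f} per look")
print(f"Evaluated annually: {average_felt_value(12):+.4f} per look")
```

Under these assumed numbers, the investor who looks every month experiences stocks as a losing proposition and would demand a large premium to hold them, while the investor who looks once a year finds the very same returns attractive.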

Cheng and He took a direct experimental approach to the issue. Here I am including the abstract, which summarizes the project’s conclusions, along with the experimental setup that was used to reach them:

Deciding for Future Selves Reduces Loss Aversion

Qiqi Cheng and Guibing He

Abstract

In this paper, we present an incentivized experiment to investigate the degree of loss aversion when people make decisions for their current selves and future selves under risk. We find that when participants make decisions for their future selves, they are less loss averse compared to when they make decisions for their current selves. This finding is consistent with the interpretation of loss aversion as a bias in decision-making driven by emotions, which are reduced when making decisions for future selves. Our findings endorsed the external validity of previous studies on the impact of emotion on loss aversion in a real world decision-making environment.

Tasks and Procedure

To measure the willingness to choose the risky prospect, we follow Holt and Laury (2002, 2005) decision task by asking participants to make a series of binary choices for 20 pairs of options (Table 1). The first option (Option A, the safe option) in each pair is always RMB 10 (10 Chinese Yuan) with certainty. The second option (Option B, the risky option) holds the potential outcomes constant at RMB 18 or 1 for each pair but changes the probabilities of winning for each decision, which creates a scale of increasing expected values. Because expected values in early decisions favor Option A while the expected values in later decisions favor Option B, an individual should initially choose Option A and then switch to Option B. Therefore, there will be a ‘switch point,’ which reflects a participant’s willingness to choose a risky prospect. The participants are told that each of their 20 decisions in the table has the same chance of being selected and their payment for the experiment will be determined by their decisions.
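To make the design concrete, here is a minimal sketch of that kind of 20-row choice list and the switch point a risk-neutral decision-maker would show. The probability grid (5%, 10%, …, 100%) is my own assumption for illustration; the paper’s Table 1 specifies the actual probabilities used.

```python
# Sketch of a Holt-and-Laury style choice list, as described above.
# Option A: RMB 10 for sure. Option B: RMB 18 with probability p, RMB 1 otherwise.
# The probability grid below is an illustrative assumption, not the paper's Table 1.
SAFE = 10.0
HIGH, LOW = 18.0, 1.0

def expected_value_b(p_win: float) -> float:
    """Expected value of the risky option with winning probability p_win."""
    return p_win * HIGH + (1 - p_win) * LOW

def risk_neutral_switch_point() -> int:
    """First decision (1-20) at which a risk-neutral agent prefers Option B."""
    for i in range(1, 21):
        if expected_value_b(i * 0.05) > SAFE:
            return i
    return 21  # never switches

for i in (1, 5, 10, 11, 15, 20):
    p = i * 0.05
    print(f"decision {i:2d}: P(win) = {p:.0%}, EV(B) = {expected_value_b(p):5.2f} vs. A = {SAFE}")
print("risk-neutral switch point:", risk_neutral_switch_point())
```

A participant who switches to Option B later than this risk-neutral benchmark is revealing greater aversion to risk; Cheng and He compare those switch points when people decide for their current selves versus their future selves.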

The conclusion from these discussions is clear: our biases are highly sensitive to the time horizon over which we are trying to make predictions, and they therefore directly affect mitigation efforts that have to start in the present. Climate change is an existential issue, so we don’t have the luxury of waiting until the damage is imminent. By then it will be too late to avoid.

