Thought Piece by Stephen Heins

The new Michael Lewis book promises to develop a theme that has become part of the environmental narrative: scientists and environmentalists who maintain certainty when predicting the future of the planet.

It also takes up the human bias to misremember previous predictions as having been correct. I have long advocated humility for visionaries, predictors, futurists, forecasters, and modelers; all of them should be willing to disclose their past predictions and how accurate those predictions turned out to be. Yes, I love Michael Lewis’s work, and this new book promises to be very timely, so I ordered it.

“One bias they found is that we underestimate uncertainty. In hindsight bias, for example, test subjects misremembered their own predictions as being correct. (Italics mine) As Tversky explained, ‘we find ourselves unable to predict what will happen; yet, after the fact we explain what did happen with a great deal of confidence. . . . It leads us to believe that there is a less uncertain world than there actually is.’”

Michael Lewis’s ‘Brilliant’ New Book About Cognitive Bias

‘The Undoing Project’ focuses on the lifelong collaboration of Daniel Kahneman and Amos Tversky, two Israeli-American psychologists who are our age’s apostles of doubt about human reason.

Originally Posted December 5, 2016, 6:54 p.m. ET | By William Easterly, The Wall Street Journal (Book Reviews)

Michael Lewis’s brilliant book celebrates Daniel Kahneman and Amos Tversky, Israeli-American psychologists who are our age’s apostles of doubt about human reason. The timing is fortunate, given that overconfident experts may have caused and then failed to predict such momentous events as Brexit and the election of Donald Trump.

Mr. Kahneman and Tversky (who died in 1996) first started working together in 1969. They were well-matched. The Holocaust survivor Mr. Kahneman chronically doubted even himself. The brash Tversky targeted his doubts toward others, especially (as one acquaintance noted) “people who don’t know the difference between knowing and not knowing.” Testing people with quizzes in their laboratory, they found a host of “cognitive biases” afflicting rational thinking.

 

One bias they found is that we underestimate uncertainty. In hindsight bias, for example, test subjects misremembered their own predictions as being correct. As Tversky explained, “we find ourselves unable to predict what will happen; yet, after the fact we explain what did happen with a great deal of confidence. . . . It leads us to believe that there is a less uncertain world than there actually is.” Mr. Lewis is outraged by McKinsey & Co. coaching their consultants to radiate certainty while billing clients huge fees to forecast such unknowable variables as the future price of oil. The work of Tversky and Mr. Kahneman convinced Mr. Lewis that, as he puts it when summarizing the view of a jaded former consultant, such “confidence was a sign of fraudulence.”

Failing to process uncertainty correctly, we attach too much importance to too small a number of observations. Basketball teams believe that players suddenly have a “hot hand” after they have made a string of baskets, and so should be passed the ball. Tversky showed that the hot hand was a myth—among many small samples of shooting attempts, there will randomly be some streaks. Instead of a hot hand, there was “regression to the mean”—players fall back down to their average shooting prowess after a streak. Likewise a “cold” player will move back up to his own average. (Both Mr. Lewis and his subjects love sports examples; Mr. Lewis now says that he realizes the insights chronicled in his 2003 “Moneyball,” about flawed judgment in baseball, had been predicted by Mr. Kahneman and Tversky all along.)
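
Tversky’s streak finding is easy to reproduce. Below is a minimal sketch (my own illustration, not anything from the book: Python, with an arbitrary 50% shooter taking 100 shots) in which shots are generated independently at random, with no hot hand built into the model; long streaks of makes still turn up routinely.

```python
import random

random.seed(1)

# Illustrative parameters (my assumptions): a shooter who makes each
# shot independently with probability 0.5 - no "hot hand" in the model.
P_MAKE = 0.5
N_SHOTS = 100
N_SEASONS = 1000

def longest_streak(shots):
    """Length of the longest run of consecutive made shots."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

streaks = []
for _ in range(N_SEASONS):
    shots = [random.random() < P_MAKE for _ in range(N_SHOTS)]
    streaks.append(longest_streak(shots))

# Even with pure chance, long streaks are the rule, not the exception.
print(f"average longest streak: {sum(streaks) / N_SEASONS:.1f}")
print(f"share of sequences with a streak of 5+ makes: "
      f"{sum(s >= 5 for s in streaks) / N_SEASONS:.0%}")
```

Run it and apparent “hot” streaks show up in nearly every random sequence, which is exactly why small samples fool the eye.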

Failing to understand regression to the mean is a ubiquitous source of prediction errors, such as expecting China’s world-record streak of high economic growth rates to continue forever (it won’t). Mr. Kahneman showed that such flawed thinking had even messed up the Israeli Air Force. Officers praised pilots after a great landing and berated them after a terrible one. Officers then noticed that the next landing after a fantastic one was worse, while the one after a horrendous one was better. The Air Force concluded that praise backfired while criticism improved performance. Mr. Kahneman noted that this spurious conclusion failed to understand regression to the mean. When he repeated this story to test subjects later, they made up stories about why praise backfired—they were also blind to the regression to the mean. Mr. Kahneman wrote: “It is part of the human condition that we are statistically punished for rewarding others and rewarded for punishing them.”
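
The Air Force anecdote can be replayed with nothing but noise. Here is a sketch under my own simplifying assumptions (landing quality equals a pilot’s fixed skill plus random day-to-day noise; the cutoffs for “great” and “awful” are arbitrary): with no praise or criticism anywhere in the model, the landing after an extreme one still drifts back toward the pilot’s average.

```python
import random

random.seed(2)

# Assumed model: landing quality = fixed skill + day-to-day noise.
def landing(skill):
    return skill + random.gauss(0, 1.0)

skill = 0.0                      # an average pilot
after_great, after_awful = [], []

for _ in range(100_000):
    first, second = landing(skill), landing(skill)
    if first > 1.5:              # an unusually great landing...
        after_great.append(second)
    elif first < -1.5:           # ...or an unusually awful one
        after_awful.append(second)

# The next landing regresses toward the pilot's true average (0.0),
# looking "worse" after great landings and "better" after awful ones,
# without feedback having any effect at all.
print(f"mean landing after a great one: {sum(after_great) / len(after_great):+.2f}")
print(f"mean landing after an awful one: {sum(after_awful) / len(after_awful):+.2f}")
```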

We also process uncertainty about people badly, resorting to stereotypes based on a small number of vivid examples about different types of people. Nobody in basketball thought of an awkward Chinese-American as a typical star, so nobody drafted Jeremy Lin in 2010. The Knicks discovered his abilities only in 2012 after a rash of injuries forced them to play him. Tversky and Mr. Kahneman found that stereotypes are more powerful than the logic of probability. They told test subjects that a fictitious “Linda” was smart and socially conscious and asked which is more likely: (a) that Linda is a bank teller or (b) that Linda is a bank teller and a feminist. Subjects chose (b) even though the subset of feminist bank tellers has to be smaller than the set of all bank tellers (feminist and non-feminist). Linda was just too irresistible a stereotype of a feminist to obey the laws of sets and probability.
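
The rule the test subjects violated is the conjunction rule of probability: a conjunction can never be more probable than either of its conjuncts. A tiny sketch (with hypothetical base rates of my own choosing; only the subset relation matters) makes the set logic concrete:

```python
import random

random.seed(3)

# Purely hypothetical population: each person is independently a bank
# teller with probability 0.05 and a feminist with probability 0.30.
N = 100_000
teller = [random.random() < 0.05 for _ in range(N)]
feminist = [random.random() < 0.30 for _ in range(N)]

tellers = sum(teller)
feminist_tellers = sum(t and f for t, f in zip(teller, feminist))

# P(teller and feminist) <= P(teller), whatever the base rates are,
# because feminist bank tellers are a subset of all bank tellers.
print(f"P(bank teller)              ~ {tellers / N:.3f}")
print(f"P(bank teller and feminist) ~ {feminist_tellers / N:.3f}")
assert feminist_tellers <= tellers
```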

Today vivid examples of Muslim terrorists have the far more serious consequence of inducing many to vastly overestimate the likelihood that any random Muslim might be a terrorist. A passenger on an American Airlines flight in May reported a suspicious dark-haired man next to her scribbling what looked like Arabic on a piece of paper. Security personnel pulled him off the plane only to discover that he was an Italian professor of economics at the University of Pennsylvania who had been doing differential equations.

It would be wrong to mock uneducated people for such mistakes, because Mr. Kahneman and Tversky found that even those trained in statistics exhibit the same cognitive biases. Indeed, there exist no experts without cognitive biases to fix everyone else’s cognitive biases.

Tversky sadly died much too young, at age 59. Mr. Kahneman went on to win the Nobel Prize in economics in 2002, then wrote a best-selling book, “Thinking, Fast and Slow,” in 2011—the next great book to read after this one. There Mr. Kahneman balanced out the account a bit. He and Tversky had focused on mistakes, but there are many things that the brain does well—as Mr. Lewis also notes.

I had to wonder, as I was reading “The Undoing Project,” whether Mr. Lewis would really respect his subjects’ doubts about experts or follow the imperative that any book on any problem must conclude with experts confidently solving the problem. Mr. Lewis passed the test. There is a brief mention of a few expert nudges to trick people into making decisions in their own interest on things like retirement savings, but the nudge approach correctly seems like small beer. In a world of overly certain predictions and policy prescriptions from consulting firms and think tanks to politicians and book authors, Mr. Lewis has given us a spectacular account of two great men who faced up to uncertainty and the limits of human reason.