An Urban's Rural View

Why Experts Make Bad Calls

By Urban C Lehner, Editor Emeritus

Who makes smarter decisions -- the expert or the algorithm?

In 2003, Michael Lewis wrote a book arguing for the algorithm; it was later made into a movie. "Moneyball" described how a low-budget Major League Baseball team, the Oakland Athletics, had become a contender by believing the statistics rather than the scouts -- and by focusing on smarter stats. Brad Pitt played Athletics general manager Billy Beane.

The Athletics' approach has since gone mainstream. Many teams have hired number crunchers and listened less to scouts. Many fans pay more attention to on-base percentages and slugging percentages than to such traditional measures as batting averages and RBIs. Other sports teams have gotten into the numbers game, too, including the Super Bowl-winning Philadelphia Eagles.

What, you may be wondering, does this have to do with the price of corn? Possibly quite a lot.

In the introduction to his 2016 follow-up book, "The Undoing Project," Lewis put it this way: "If the market for baseball players was inefficient, what market couldn't be? If a fresh, analytical approach had led to the discovery of new knowledge in baseball, was there any sphere of human activity in which it might not do the same?"

Sensing that farmers, too, might be relying on inexpert experts or looking at the wrong numbers, I devoted my first DTN column in 2003 to what "Moneyball" might mean for DTN users. Striving to find the right numbers for farm marketers, DTN senior analyst Darin Newsom devised the DTN Six Factors, which he and his team still use to analyze markets and make buy-sell recommendations on 16 commodities.


Why do people, even experts, make bad calls? Lewis writes of an experiment in which doctors made worse cancer diagnoses than algorithms: "You could beat the doctor by replacing him with an equation created by people who knew nothing about medicine and had simply asked a few questions of doctors." "The Undoing Project" tries to explain how this could be.

The book focuses on two Israeli psychologists, Daniel Kahneman and Amos Tversky, who rewrote our understanding of how people make decisions, and in the process challenged some of the most fundamental assumptions of economics. In 2002 Kahneman won the Nobel Prize in economics, a first for a psychologist. (Tversky would likely have shared it, but he died in 1996, six years before the prize was awarded.)

Here, oversimplified, is the Kahneman-Tversky theory: Faced with a complex problem, human beings often avoid doing the deep thinking or calculations necessary to reach the right answer. Instead they use mental shortcuts called "heuristics." Sometimes those heuristics lead them astray.

Heuristics come in a variety of shapes and sizes. One of the simplest is called "representativeness" -- you compare the situation you're analyzing with a stereotype or mental model.

Lewis offers a memorable example. The NBA's Houston Rockets had an algorithm for judging players that said they should draft Jeremy Lin on the basis of his performance playing for Harvard. But "the objective measurement of Jeremy Lin didn't square with what the experts saw when they watched him play: a not terribly athletic Asian kid."

The Rockets didn't draft Lin. When his career took off with another team, they analyzed their mistake and added a new measurement to their model -- the speed of a player's first two steps. Lin had "the quickest first move of any player measured. He was explosive." But he didn't look like the experts' mental model of an NBA player. He was Asian.

Another of the Israeli psychologists' heuristics, "availability," holds that people opt for solutions that are easy to call to mind. In one experiment, students were asked whether there are more words in the English language beginning with the letter K or more with K in the third position. There are actually many more of the latter. The students got it wrong because "it was simply easier to recall words that start with K than to recall words with K as their third letter."
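Anyone curious can run a rough version of this test with a computer and a word list. The few lines of Python below are my own sketch, not anything from the book; they assume a Unix-style word list at /usr/share/dict/words, and a dictionary is not the same thing as running text, so treat the count as an illustration rather than a verdict.

# Sketch: count dictionary words starting with "k" versus words with "k" third.
# The word-list path is an assumption; any large English word list will do.
def count_k_positions(path="/usr/share/dict/words"):
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if len(word) >= 3:
                first += word[0] == "k"
                third += word[2] == "k"
    return first, third

first, third = count_k_positions()
print(f"words starting with k: {first}; words with k third: {third}")

The point is not the count itself but that the question has a checkable answer -- and that our memories, which retrieve words by their first letter far more easily than by their third, are a poor instrument for checking it.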

"Anchoring" is yet another heuristic. "Danny and Amos asked their subjects to spin a wheel of fortune with slots numbered 0 through 100. Then they asked the subjects to estimate the percentage of African countries in the United Nations. Those who had spun a higher number tended to guess that a higher percentage of the UN consisted of African nations." In the same way, students given an impossibly brief five seconds to multiply 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 tended to guess a much higher answer than those given five seconds to multiply 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8. The first group's "anchor" was a higher number.

The book abounds with examples explaining how bad decisions get made. People trained in statistics and probability theory, for example, nonetheless fell for the "gambler's fallacy" of assuming that because a coin flip had come up heads several times in a row, the next flip would more likely be tails.

They did this because, despite their statistical training, they based big judgments on small sample sizes. Flip a coin a thousand times and the heads and tails will likely be roughly equal. Ten flips, though, could easily come up all heads. On any one flip the odds are 50-50.
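For readers who would rather see this than take it on faith, a few lines of Python -- my illustration, not the book's -- make the sample-size point concrete.

import random

# Sketch: share of heads in a small sample versus a large one.
random.seed(1)  # fixed seed so the example repeats; remove it to see the variation

def share_of_heads(flips):
    return sum(random.random() < 0.5 for _ in range(flips)) / flips

print(f"10 flips:    {share_of_heads(10):.0%} heads")
print(f"1,000 flips: {share_of_heads(1000):.0%} heads")

Run it a few times without the fixed seed and the 10-flip share jumps all over the place while the 1,000-flip share stays close to 50%. Small samples are noisy; the coin has no memory.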

There are many layers to the book, including the complex relationship between these two very different, very brilliant men. They challenged each other relentlessly, and both were willing to admit when they were wrong.

But the heart of the book is how the world came to understand the foibles of human decision-making. It's an understanding that's useful for anyone who must make complex decisions, farmers and ranchers very much included.

Urban Lehner can be reached at urbanize@gmail.com


