Continuing from the last post, one reason many of us don't trust experts and social engineers is that they often get it wrong. I'm reading the new book Range: Why Generalists Triumph in a Specialized World. Here's an excerpt:
"The
pattern is by now familiar [....] the track record of expert
forecasters—in science, in economics, in politics—is as dismal as ever.
In business, esteemed (and lavishly compensated) forecasters routinely
are wildly wrong in their predictions of everything from the next
stock-market correction to the next housing boom. Reliable insight into
the future is possible, however. It just requires a style of thinking
that’s uncommon among experts who are certain that their deep knowledge
has granted them a special grasp of what is to come. [...] Even faced
with their results, many experts never admitted systematic flaws in
their judgment. When they missed wildly, it was a near miss; if just one
little thing had gone differently, they would have nailed it. 'There is
often a curiously inverse relationship,' Tetlock concluded, 'between
how well forecasters thought they were doing and how well they did.'"
"One
subgroup of scholars, however, did manage to see more of what was
coming [....] they were not vested in a single discipline. They took
from each argument and integrated apparently contradictory worldviews.
[...] The integrators outperformed their colleagues in pretty much every
way, but especially trounced them on long-term predictions. Eventually,
Tetlock bestowed nicknames (borrowed from the philosopher Isaiah
Berlin) on the experts he’d observed: The highly specialized hedgehogs
knew 'one big thing,' while the integrator foxes knew 'many little
things.'”
"When an outcome took them by surprise,
foxes were much more likely to adjust their ideas. Hedgehogs barely
budged. Some made authoritative predictions that turned out to be wildly
wrong—then updated their theories in the wrong direction. They became
even more convinced of the original beliefs that had led them astray.
The best forecasters, by contrast, view their own ideas as hypotheses in
need of testing. If they make a bet and lose, they embrace the logic of
a loss just as they would the reinforcement of a win. This is called,
in a word, learning."