Superforecasting: The Art and Science of Prediction



Philip Tetlock and Dan Gardner





Foresight isn't innate. It's a product of certain ways of thinking, of gathering information, and updating beliefs.

Increasingly aware of the inadequacy of the 'guru' model - listening to pundits like Paul Krugman or Niall Ferguson, whose views come only from subjective judgement.

For a long time the C2 Roman physician Galen was regarded as the medical authority. But he never conducted anything like an experiment. He was always right, no matter what the outcome: "All who take this treatment recover in a short time, except for those it does not help, who die. It is obvious, therefore, that it fails only in incurable cases."

Our System 1 and System 2 thinking processes. We need System 1 to deal with emergencies, so we follow hunches. If it feels true, it is. The important thing is that System 1 is insensitive to the quality of evidence. It delivers strong conclusions at lightning speed, never stopping to wonder whether the 'evidence' might be flawed, or whether better evidence might be available elsewhere.

More books on the Mind

Evaluate the validity of cues. It is likely that there are early indications that a building is going to collapse in a fire, or that a child is coming down with an infection. But it is unlikely that there is publicly available information that can predict how a share price will change - if there were, it would already be incorporated in the share price. So you can trust the firefighter or the nurse, but not the stockbroker.

The Fox and Hedgehog difference. The forecasters who didn't out-predict a dart-throwing chimp were the ones who tried to shoehorn everything into their preconceived world view - whether that was conservative or socialist, pessimistic ('We're running out of everything') or optimistic ('We'll find answers in future, so no need to worry today'). They were also more confident, and more likely to label outcomes as 'certain' or 'impossible'. Committed to their conclusions, they were unwilling to admit error, instead saying "Just wait."

The other group was more pragmatic, prepared to choose different tools for different tasks. They talked about probabilities rather than certainties, and were more willing to admit errors.

The more famous an expert was, the less accurate he was. Hedgehogs tell tight, simple stories that appeal to their audience. And they are confident, and people hate uncertainty.

Writing down your prediction, and the steps to it, helps distance you and lets you scrutinise - "Would this convince someone else?" "Is there someone else who can add info to this?" The simple step of assuming that your prediction is wrong, and looking for evidence to support that contrary POV, often produces a second estimate which is better.
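
A minimal sketch of that second-estimate trick, with invented numbers (the figures are illustrative, not from the book): make a first guess, deliberately argue the contrary case, guess again, and average. The two estimates' idiosyncratic errors tend partially to cancel.

    # Illustrative only: suppose the true probability of some event is 0.30.
    truth = 0.30
    first_guess = 0.45    # initial, gut-feel estimate
    second_guess = 0.20   # after deliberately arguing the opposite case

    combined = (first_guess + second_guess) / 2   # average the two estimates

    print(f"first error:    {abs(first_guess - truth):.3f}")   # 0.150
    print(f"second error:   {abs(second_guess - truth):.3f}")  # 0.100
    print(f"combined error: {abs(combined - truth):.3f}")      # 0.025 - smaller than either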

So, in Nov 2014, asked whether Saudi Arabia would agree to production cuts to try to boost the price of oil. On the one hand, the Saudis run few risks letting the price stay low because they have huge financial reserves. On the other hand, they need high prices to support the high social spending that keeps the population on their side. On the third hand, they may see cuts as futile because enough oil is coming from other sources that OPEC can no longer dictate prices. So, net answer: feels no-ish - about 80% 'no'. And in fact the Saudis didn't agree to production cuts, which surprised most experts.

Have to have an appetite to question basic, emotionally charged beliefs.

Superforecasters often approach problems in a systematic way: unpack the problem into components. Distinguish as sharply as possible between the known and unknown bits. Leave no assumption unscrutinised. Adopt the outside view: downplay the problem's uniqueness and treat it as a special case of a wider class of phenomena. Then adopt the inside view and concentrate on what is unique about this problem. Then express your judgement as precisely as you can, with a probability figure.
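
A hedged sketch of that outside-then-inside sequence in code. The base rate and likelihoods below are invented for illustration; the point is the order of operations - anchor on the reference class first, then let case-specific evidence shift the anchor, here via Bayes' rule.

    # Outside view first: how often do cases in the reference class resolve 'yes'?
    base_rate = 0.25    # hypothetical reference-class frequency

    # Inside view second: adjust for case-specific evidence.
    # Likelihoods are invented: how probable is this evidence under each outcome?
    p_evidence_given_yes = 0.70
    p_evidence_given_no = 0.30

    prior_odds = base_rate / (1 - base_rate)
    likelihood_ratio = p_evidence_given_yes / p_evidence_given_no
    posterior_odds = prior_odds * likelihood_ratio
    posterior = posterior_odds / (1 + posterior_odds)

    print(f"outside view: {base_rate:.0%} -> after inside view: {posterior:.0%}")
    # A 25% base rate becomes roughly 44% after one piece of supportive evidence.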

When you have a strong belief in something, no evidence will shift it. Whether it's biblical literalism, gun control, homophobia or racial prejudice, you can't take out the block at the base of your (belief) tower without threatening the rest of your identity.

Many people have a "fixed mindset" - abilities are innate and there's not much you can do about them. So, "I'm bad at maths" or "I'm crazy", and any setback just confirms that belief - it becomes self-fulfilling. But a "growth mindset" looks to improve, and is prepared to change its view when the facts change.

Police get over-confident of their ability to spot liars, because they seldom get quick feedback. So they grow more confident faster than they grow more accurate. Meteorologists and bridge players get quick feedback on their predictions, so more humility. But skill at bridge or weather forecasting does not translate into skill at political forecasting.
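
The feedback the book has in mind is scoring forecasts against outcomes. Tetlock's tournaments scored forecasters with Brier scores; below is a minimal sketch of the simple binary version, with made-up forecasts rather than data from the book.

    # Brier score: mean squared error between probability forecasts and outcomes.
    # Forecasts are probabilities of 'yes'; outcomes are 1 (happened) or 0 (didn't).
    # Made-up data for illustration.
    forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]
    outcomes  = [1,   1,   0,   0,   0]

    brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
    print(f"Brier score: {brier:.3f}")
    # 0 is perfect; hedging 50% on everything scores 0.25; this sample scores 0.102.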

In the 1992 presidential election everyone was sure George H. W. Bush was going to win re-election easily, so all the Democratic big guns stayed out, leaving the nomination to an obscure Arkansas governor, Bill Clinton.

General Petraeus is keen for American generals to go to university, if only to make them realize that there are seriously smart people in the world who hold very different assumptions and so come to very different conclusions than they do. And like encountering shocks on the battlefield, grappling with other ways of thinking trains officers to be mentally flexible.

Nassim Nicholas Taleb (The Black Swan) reckons that societies don't crawl; they advance by jumps. What matters can't be forecast, and what can be forecast doesn't matter. But Tetlock suggests that most Black Swans are actually gray - not likely, but not totally unpredictable. And although jumps are the most obvious part of progress, crawling (steady, gradual improvement) is also important. An average 1% annual growth in the C19th and 2% in the C20th turned the squalor of the C18th into the prosperity we see today.
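
That growth arithmetic is easy to check: compounding 1% a year for a century multiplies output roughly 2.7 times, and 2% a year multiplies it roughly 7.2 times. A two-line sketch confirms it:

    # Compound growth over a century: multiply by (1 + rate) ** 100.
    for rate in (0.01, 0.02):
        print(f"{rate:.0%}/year for 100 years: x{(1 + rate) ** 100:.1f}")
    # 1%/year for 100 years: x2.7
    # 2%/year for 100 years: x7.2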

The attributes of a Superforecaster

Cautious
Humble
Actively open-minded
Intellectually curious
Introspective and self-critical
Numerate
Pragmatic rather than dogmatic - not wedded to any fixed ideas
Analytical - can step back and consider other POVs
Probabilistic - judge in grades of maybe
Updaters - modify as new facts emerge
Psychologists - check thinking for cognitive and emotional biases
Grit

(London Times)



Who wouldn't want to be a superforecaster? The entire betting industry is based on people's beliefs (or delusions) that they can anticipate the results of future events. All financial investment and speculation is similarly motivated: the big banks employ economists not for academic reasons, but to give their traders the best available advice on likely future trends in the bond and equity markets.

Philip Tetlock is a Canadian professor whose work at the hinge between political science and psychology has some interesting things to say about the precarious profession of prediction. He sprang to public attention in 2004 when his project of surveying forecasting over the previous 20 years came to the conclusion that there was very little difference between the accuracy of so-called 'experts' and guesses made by the man in the street.

In this fascinating and breezily written follow-up, Tetlock now takes issue with those who say that he has proved that paying people for forecasts is a complete waste of money. He argues instead that it is possible to be exceptionally good at forecasting events over a wide field, without actually being an 'expert'. Some of us simply have the required nous for it.

This conclusion is itself the result of another gigantic survey masterminded by Tetlock, courtesy of the US intelligence community. This stemmed from the fiasco of the CIA telling President George W Bush that Saddam Hussein possessed weapons of mass destruction, ready for deployment against the West. The agency didn't just say this was 'likely' or 'probable'. It was 100% certain - a 'slam dunk'.

Given that the US intelligence community has approximately 20,000 analysts collating and interpreting intelligence, such a conclusion was monumentally embarrassing. So it decided to engineer a mega-survey of forecasters, to see what could be learnt from those who did it best. A little-known agency called the Intelligence Advanced Research Projects Activity got in touch with Tetlock and commissioned him to set thousands of questions about future world events as a kind of examination paper, which was then sent out to a similarly large number of volunteers, by no means all of them professional forecasters. This was called The Good Judgment Project.

To cut a long story short, some people scored spectacularly well in the tests, getting the right answers to questions as varied as 'Will the Japanese prime minister visit the Yasukuni Shrine?' or 'Will Saudi Arabia agree to tighter oil production quotas?' Many of these 'superforecasters' were not professionals.

One such was Doug Lorch, a former IBM programmer with an insatiable curiosity about the world who spends his retirement reading newspapers assiduously. What Lorch shares with other superforecasters identified by Tetlock is a set of common characteristics. They are generally, for instance, highly dispassionate (most people allow their forecasts to be influenced by their political or moral opinions, even if they don't realise it). They also find it easy to admit that they have got things wrong, and then try to assess the matter from a different perspective. This perhaps explains why the pundits that you see on television are no better than the man in the street tossing a coin: they are thoroughly invested in a certain view. And these pundits' certainty, while compelling to viewers, is itself the mark of an insufficiently self-critical outlook.

Tetlock's work is fascinating, though I am not sure I agree that particular individuals could consistently beat all markets in prediction, because of their sheer skill. Luck can never be ruled out. But what do I know? I never gamble.



More books on Politics

More books on Business










