Warren Buffett — as usual — may be right about forecasts:
- Three-quarters of all U.S. stock mutual funds have failed to beat the market over the past decade.
- Last year, 98% of economists expected interest rates to rise; they fell instead.
- Most energy analysts didn’t foresee oil’s collapse from $145 a barrel in 2008 to $38 this summer — or its 15% rebound since.
However, as Jason Zweig pointed out in the WSJ a few days ago, a new book suggests that amateurs might well be less hapless forecasters than the experts, so long as they go about it the right way.
Philip Tetlock — psychologist and professor of management at the University of Pennsylvania’s Wharton School and author of “Superforecasting: The Art and Science of Prediction” (co-written with journalist Dan Gardner) — just concluded the first stage of what he calls the Good Judgment Project, which pitted some 20,000 amateur forecasters against some of the most knowledgeable experts in the world.
The amateurs won — hands down. Their forecasts were more accurate more often, and the confidence they had in their forecasts — as measured by the odds they set on being right — was more accurately tuned.
After rigorous statistical controls, it turned out that the elite amateurs were, on average, about 30% more accurate than experts with access to classified information. What's more, even the full pool of amateurs outperformed the experts.
Most experts — like most people — “are too quick to make up their minds and too slow to change them,” Tetlock says. And experts are paid not just to be right, but to sound right: cocksure even when the evidence is sparse or ambiguous.
The most careful, curious, open-minded, persistent and self-critical of the amateur forecasters — as measured by a battery of psychological tests — did the best.
The top 2%, whom Prof. Tetlock dubs “superforecasters,” have above-average — but rarely genius-level — intelligence. Many are mathematicians, scientists or software engineers; but among the others are a pharmacist, a Pilates instructor, a caseworker for the Pennsylvania state welfare department and a Canadian underwater-hockey coach.
So How Can You Improve Your Own Prediction Skills?
Zweig lays out these five steps for improving your prediction skills:
Compare. Establish the “base rate,” the typical historical frequency or severity of the outcome you are trying to forecast, by picking a reference class of similar past events. Very few situations are unique; think deeply enough about the instance at hand, and you will likely discover a set of historical precedents for it. Instead of guessing how stocks will do if interest rates rise by one percentage point, for instance, look at the full sample of all such past interest-rate increases, take the average of subsequent stock performance and use that as your base-rate assumption.
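The base-rate idea can be sketched in a few lines of Python. The return figures below are invented placeholders standing in for a real reference class of past one-point rate increases, not actual market data:

```python
# A minimal sketch of base-rate reasoning with made-up numbers.
# Hypothetical 12-month stock returns following past one-point rate hikes.
reference_class = [0.08, -0.03, 0.11, 0.05, -0.07, 0.09, 0.02]

# Base rate: how often stocks rose, and the typical subsequent return.
frequency_up = sum(1 for r in reference_class if r > 0) / len(reference_class)
average_return = sum(reference_class) / len(reference_class)

print(f"Stocks rose after {frequency_up:.0%} of similar past rate hikes")
print(f"Average subsequent return: {average_return:.1%}")
```

The point is the procedure, not the numbers: start from the historical frequency and magnitude across the whole reference class, and treat that average as your default forecast before adjusting for anything specific to the case at hand.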
Quantify. Try to flush your ignorance out into the open: Don’t just say, “I think emerging-markets stocks are cheap right now.” Instead, say “There is an 80% probability that emerging-markets stocks will go up at least 25% in the next 12 months.” When significant new information comes in, adjust your forecast incrementally.
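One standard way to adjust a probability forecast incrementally is Bayes' rule in odds form. This is a sketch, not anything prescribed by Zweig or Tetlock, and it assumes you can roughly estimate how much more likely a piece of news would be if your forecast were right than if it were wrong:

```python
# Incremental updating via Bayes' rule in odds form.
def update(prior: float, likelihood_ratio: float) -> float:
    """Posterior odds = prior odds * likelihood ratio; return as a probability."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start at 80% that emerging-markets stocks gain at least 25% in 12 months.
p = 0.80
# Mildly unfavorable news, judged twice as likely if the forecast is wrong:
p = update(p, 0.5)
print(f"Revised probability: {p:.0%}")  # nudged down, not abandoned
```

Small likelihood ratios (near 1) produce small revisions, which is exactly the incremental adjustment the step calls for; only genuinely decisive evidence should move the forecast dramatically.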
Invert. Consider the opposite and ask what information you would need if you were trying to make a case that emerging-markets stocks will go down at least 25% in the next 12 months. Then seek it out, absorb it and test whether you need to revise your original forecast.
Find a variety of sources for news and analysis, varying widely in their viewpoints, vested interests and geographic origin. Discount the sources that use the self-reinforcing language of overconfidence: words like “furthermore,” “clearly,” “certain” and “impossible.” Heed most closely those sources that admit uncertainty with words like “however,” “but” and “on the other hand.”
Record in detail the reasoning behind your forecasts. When you turn out to be right, don’t just celebrate; check your records to see whether you were right for the reasons you predicted or whether you just got lucky. When you turn out to be wrong, compare this error to the other times you were mistaken, looking for patterns to correct. If you don’t measure every meaningful aspect of your forecasts, you will never improve.
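If you state forecasts as probabilities, one concrete way to measure them over time is the Brier score, the calibration measure Tetlock's tournaments used: the mean squared difference between each forecast probability and the outcome (1 if it happened, 0 if not), where lower is better. The forecast record below is invented for illustration:

```python
# Brier score: mean squared error of probability forecasts. Lower is better;
# 0.0 is perfect foresight, 0.25 is what always guessing 50% would earn.
def brier_score(forecasts):
    """forecasts: list of (probability, outcome) pairs, outcome in {0, 1}."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# A hypothetical forecast diary: (stated probability, what actually happened).
record = [(0.80, 1), (0.60, 0), (0.90, 1), (0.30, 0)]
print(f"Brier score: {brier_score(record):.3f}")
```

Keeping such a record alongside your written reasoning lets you separate being right for the right reasons from being lucky, and shows whether your calibration is actually improving.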
You can read Zweig’s original WSJ post, “Can You See the Future? Probably Better Than Professional Forecasters,” along with his accompanying five-step guide.
Finally, you can take a spin through Tetlock and Gardner’s new book “Superforecasting: The Art and Science of Prediction.” Zweig writes that “Superforecasting” is the most important book on decision making since Daniel Kahneman’s “Thinking, Fast and Slow.” And Prof. Kahneman agrees: “It’s a manual to systematic thinking in the real world… This book shows that under the right conditions regular people are capable of improving their judgment enough to beat the professionals at their own game.” (I haven’t read “Superforecasting” yet; it was just released on Sept. 29th. But I have read “Thinking, Fast and Slow,” which was an excellent book on decision making and behavioral economics.)