Thanks!
Some additional recent stuff I found interesting:
This summary of US and UK policies for communicating probability in intelligence reports.
Apparently Niall Ferguson’s consulting firm makes & checks some quantified forecasts every year: “So at the beginning of each year we at Greenmantle make predictions about the year ahead, and at the end of the year we see — and tell our clients — how we did. Each December we also rate every predictive statement we have made in the previous 12 months, either “true”, “false” or “not proven”. In recent years, we have also forced ourselves to attach probabilities to our predictions — not easy when so much lies in the realm of uncertainty rather than calculable risk. We have, in short, tried to be superforecasters.” (For a toy example of how probability-tagged predictions like these can be scored at year’s end, see the sketch after this list.)
Review of some failed long-term space forecasts by Carl Shulman.
Some early promising results from DARPA SCORE.
Assessing Kurzweil predictions about 2019: the results.
Bias, Information, Noise: The BIN Model of Forecasting is a pretty interesting result if it holds up. Another explanation by Mauboussin here. Supposedly this is what Kahneman’s next book will be about; HBR preview here.
GJP2 is now recruiting forecasters.
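Since several of the items above are about attaching probabilities to predictions and then grading them afterward, here is a minimal sketch of how that kind of year-end scoring can be done with a Brier score. The events and probabilities below are invented for illustration, and this is not necessarily how Greenmantle (or anyone else linked above) actually grades their forecasts.

```python
# Minimal, illustrative sketch (made-up predictions and probabilities):
# score a year's probability-tagged predictions with a Brier score once
# the outcomes are known. Lower is better; always answering 50% scores 0.25.

predictions = [
    # (description, stated probability, did it happen?)
    ("Hypothetical event A occurs this year", 0.80, True),
    ("Hypothetical event B occurs this year", 0.30, False),
    ("Hypothetical event C occurs this year", 0.60, False),
]

def brier_score(forecasts):
    """Mean squared error between stated probabilities and binary outcomes."""
    return sum((p - float(happened)) ** 2 for _, p, happened in forecasts) / len(forecasts)

print(f"Brier score over {len(predictions)} predictions: {brier_score(predictions):.3f}")
```

One simple way to handle a “not proven” verdict, as in the Ferguson quote, is to leave that prediction out of the scored set rather than force it into a binary outcome.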