Some thoughts:
I think that you can think about “forecasting” as one of a collection of intellectual practices that the EA community is unusually focused on.
Other practices/norms include:
- “Scout Mindset”
- Bayesian Thinking
- A lot of care about/interest in analytical philosophy
- A preference for empirical data/arguments
- A mild dislike for many kinds of emotional arguments
- Rationalism / Rationalist Fiction
I think that a background variable here is that EA is close to an intellectual community of thinkers who use similar tools and practices. Thinkers like Steven Pinker, Matthew Yglesias, Robin Hanson, Bryan Caplan, and Andrew Gelman come to mind as people with somewhat similar styles and interests. Many of these people have also shown an unusual interest in “forecasting”.
So some questions here would be:
1. How well does EA fit into broader intellectual tribes, like the ones I hinted at above?
2. What preferences/norms do these tribes have, and how justified are they?
3. Why aren’t all of these preferences/norms more common?
I think that “forecasting” as we discuss it often refers to a set of norms like:
1. Making sure that forecasts are recorded and scored.
2. Making sure forecasters are incentivized to do well.
3. Making sure that the top forecasters are chosen for future rounds.
To do this formally requires a fair bit of overhead, so it doesn’t make much sense for small organizations.
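A minimal sketch of what the “record, score, select” loop in norms (1) and (3) might look like, using the Brier score as one common choice of scoring rule; the forecaster names and numbers here are made up for illustration:

```python
from statistics import mean

# Recorded probabilities each (hypothetical) forecaster gave for a set of
# resolved yes/no questions, plus the actual outcomes (1 = happened, 0 = didn't).
forecasts = {
    "forecaster_a": [0.9, 0.2, 0.7],
    "forecaster_b": [0.6, 0.5, 0.5],
}
outcomes = [1, 0, 1]

def brier_score(probs, outcomes):
    """Mean squared error between stated probabilities and outcomes (lower is better)."""
    return mean((p - o) ** 2 for p, o in zip(probs, outcomes))

# Score everyone, then keep the best-scoring forecaster(s) for the next round.
scores = {name: brier_score(probs, outcomes) for name, probs in forecasts.items()}
top_forecasters = sorted(scores, key=scores.get)[:1]

print(scores)           # {'forecaster_a': 0.047, 'forecaster_b': 0.22} (approx.)
print(top_forecasters)  # ['forecaster_a']
```

Most of the overhead mentioned above lives outside a snippet like this: writing resolvable questions, collecting forecasts over time, judging resolutions, and actually incentivizing people.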
I think that larger organizations either know enough to vet and incentivize their existing analysts (getting many of the benefits of forecasters), or they don’t, in which case they won’t be convinced to use forecasters. (I think these obvious explanations cover some of the reason, but I still have questions here.)
Society-wide, I think most people don’t care about forecasters for similar reasons that most people don’t care about Bayesian Thinking, Scott Alexander, or Andrew Gelman. I think these tools/people are clever, but others aren’t convinced of them or aren’t familiar with them.