Preventing catastrophic risks, improving global health and improving animal welfare are goals in themselves. At best, forecasting is a meta topic that supports other goals.
Yes, it's a meta topic; I'm commenting less on the importance of forecasting in an ITN framework and more on its neglectedness. This stuff basically doesn't get funding outside of EA, and even inside EA it has had no institutional commitment; outside of scattered one-off grants, the largest forecasting funding program I'm aware of over the last two years was $30k in "minigrants" funded by Scott Alexander out of pocket.
But on the importance of it: insofar as you think future people matter and that we have the ability and responsibility to help them, forecasting the future is paramount. Steering today's world without understanding the future would be like trying to help people in Africa without overseas reporting to guide you; you'll obviously do worse if you can't see the outcomes of your actions.
You can make a reasonable argument (as some other commenters do!) that the tractability of forecasting to date hasn't been great; I agree that the most common approaches of "tournament-setting forecasting" or "superforecaster consulting" haven't produced much that is decision-relevant. But there are many other possible approaches (e.g. FutureSearch.ai is doing interesting things using an LLM to forecast), and I'm again excited to see what Ben and Javier do here.
Yes, it's a meta topic; I'm commenting less on the importance of forecasting in an ITN framework and more on its neglectedness. This stuff basically doesn't get funding outside of EA, and even inside EA it has had no institutional commitment;
I don't think it's necessary to talk in terms of an ITN framework, but something being neglected isn't nearly reason enough to fund it. Neglectedness is perhaps the least important part of the framework, and a cause being neglected on its own tells you little. Getting six-year-olds into race cars, for example, seems like a neglected cause, but not one worth pursuing.
I think something not getting funding outside of EA is probably a medium-sized update toward the thing not being important enough to work on. Things start to get EA funding once a sufficient number of the community finds the arguments for working on a problem sufficiently convincing. But many, many problems have come across EA's eyes and very few of them have stuck. For something to not get funding from others suggests that very few others found it to be important.
Forecasting still seems to get a fair amount of dollars, probably about half as much as animal welfare. https://docs.google.com/spreadsheets/d/1ip7nXs7l-8sahT6ehvk2pBrlQ6Umy5IMPYStO3taaoc/edit?usp=sharing
Your points on helping future people (and non-human animals) are well taken.
Yeah, I agree neglectedness is less important, but it does capture something important; I think e.g. climate change is both important and tractable but not neglected. In my head, "importance" is about "how much would a perfectly rational world direct at this?" while "neglectedness" is "how far are we from that world?".
Also agreed that the lack of external funding is an update that forecasting (as currently conceived) has more hype than real utility. I tend to think this is because of the narrowness of how forecasting is currently framed, though (see my comments on tractability above).
That's a great resource I wasn't aware of, thanks (did you make it?). I do think that OpenPhil has spent a commendable amount of money on forecasting to date (though: nowhere near half of animal welfare's total, more like a tenth). But I think this has been done very unsystematically, with no dedicated grantmaker. My understanding is that it was a side project of Luke Muehlhauser's for a long time; when I reached out in Jan '23 he said they were not making new forecasting grants until they filled this role. Even if it took a year, I'm glad this program is now launched!
I think your point 1 is a good starting point, but I would add "in percentage terms compared to all other potential causes", and you have to be in the top 1% of that for EA to consider the cause neglected.
3. I didn't make it. It is great, though. I was talking about spending on a yearly basis over the last couple of years. That said, I made the comment from memory, so I could be wrong.