Re: Existential Risk Persuasion Tournament, one thing to consider with forecasters is that they think a lot about the future, so asking them to imagine a future where a ton of their preconceived predictions may not occur could be a significant shift in their mental model. Or put another way:
forecasters are biased toward the status quo because it's easier to predict from. Imagine having to take everything into account at once in every prediction: "will X marry by 2050? Well, there's a 1% chance both parties are dead because of AI…" feels absurd.
But I guess forecasters also have a more accurate world model anyway. Still, this felt worth writing out, since I was trying to make sense of the forecasters' low x-risk estimates. (Again: status quo bias against extreme changes.)