Thanks for the long comment; it gives me a much richer picture of how people might be thinking about this. On the first two bullets:
You say you aren’t anchoring, but in a world where we defaulted to expressing probability in 1/10^6 units called Ms, I’m just left feeling like you would write “you should be hesitant to assign 999,999M+ probabilities without a good argument. The burden of proof gets stronger and stronger as you move closer to 1, and 1,000,000 is getting to be a big number.” So if it’s not anchoring, what calculation or intuition is leading you to specifically 99% (or at least, something in that ballpark), and would it similarly lead you to roughly 990,000M under the alternate language?
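To make the hypothetical unit concrete (the “M” rescaling is purely illustrative): 1M corresponds to a probability of 10^-6, so percentage thresholds map directly onto M values.

```python
# Illustrative conversion between probabilities and a hypothetical
# 1/10^6 unit called "Ms" (1M = a probability of 0.000001).
def to_ms(p):
    """Convert a probability in [0, 1] to Ms, rounded to the nearest M."""
    return round(p * 1_000_000)

print(to_ms(0.99))      # 99%      -> 990000 Ms
print(to_ms(0.9999))    # 99.99%   -> 999900 Ms
print(to_ms(0.999999))  # 99.9999% -> 999999 Ms
```

The point of the rescaling is that “99%” and “990,000 out of 1,000,000” are the same claim, but the latter phrasing makes the round number look arbitrary rather than natural.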
My reply to Max and your first bullet both give examples of cases in the natural world where probabilities of real future events would fall well outside the 0.01%–99.99% range. Conjunctions force you to have extreme confidence somewhere; the only question is where. If I try to steelman your claim, I think I end up with the idea that we should apply our extreme confidence to the thing inside the product, due to correlated causes, rather than the thing outside; does that sound fair?
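A quick sketch of the conjunction point (my numbers, purely illustrative): if an outcome is a conjunction of many independent steps, even a modest overall probability forces the per-step probability outside a 0.01%–99.99% “reasonable confidence” band.

```python
# Illustrative only: if survival over n periods is a conjunction of n
# independent per-period survivals, then a modest overall probability
# forces extreme per-period confidence.
def per_step(overall, n):
    """Per-period probability needed to hit `overall` across n periods."""
    return overall ** (1 / n)

# 50% overall survival over 100 periods needs ~99.31% per period...
print(per_step(0.5, 100))
# ...and over 10,000 periods needs ~99.9931% per period, which is
# already above the 99.99% end of the band.
print(per_step(0.5, 10_000))
```

So refusing extreme probabilities on the per-step terms just relocates the extremity to the overall product, and vice versa; you cannot avoid it in both places at once.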
The rest I see as an attempt to justify the extreme confidences inside the product, and I’ll have to think about it more. The following are gut responses:
“I’m not sure which step of this you get off the boat for”
I’m much more baseline cynical than you seem to be about people’s willingness and ability to actually try, and to try consistently, over a huge time period. To give some idea: before I came across EA, I’d probably have assigned <50% probability to humanity surviving to the year 2150, and <10% to the year 3000. Whether or not that’s correct, I don’t think it’s wildly unusual among people who take climate change seriously*, and yet we almost certainly aren’t doing enough as a society to combat that. This gives me little hope for dealing with the <10% threats that will surely appear over the centuries, and as a result I found, and continue to find, the seemingly-baseline optimism of longtermist EA very jarring.
(Again, the above is a gut response as opposed to a reasoned claim.)
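For what it’s worth, those gut numbers can be translated into per-century survival rates. The calculation below is purely illustrative and assumes a constant per-century hazard rate and a 2020 baseline, neither of which is stated above.

```python
# Purely illustrative: back out the constant per-century survival rate
# implied by a gut estimate P(survive to year Y), from a 2020 baseline.
def per_century(p_survive, years):
    """Per-century survival rate implying p_survive over the horizon."""
    return p_survive ** (100 / years)

# <50% survival to 2150 (130 years) implies less than ~59% per century:
print(per_century(0.5, 130))
# <10% survival to 3000 (980 years) implies less than ~79% per century:
print(per_century(0.1, 980))
```

Notably, under these assumptions the year-3000 estimate is actually the more optimistic of the two on a per-century basis.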
“Applying the rule of thumb for estimating lifetimes to ‘the human species’ rather than ‘intelligent life’ seems like it’s doing a huge amount of work.”
Yeah, Owen made a similar point, and actually I was using ‘civilisation’ rather than ‘the human species’, which is shorter still by a factor of about 20. I honestly hadn’t thought about intelligent life as a possible reference class before, and that is probably the thing from this conversation with the best chance of changing how I think about this.
*“The survey from the Yale Program on Climate Change Communication found that 39 percent think the odds of global warming ending the human race are at least 50 percent.”