Thanks for the long comment; this gives me a much richer picture of how people might be thinking about this. On the first two bullets:
You say you aren't anchoring, but in a world where we defaulted to expressing probability in 1/10^6 units called Ms, I'm left feeling that you would write "you should be hesitant to assign 999,999M+ probabilities without a good argument. The burden of proof gets stronger and stronger as you move closer to 1, and 1,000,000 is getting to be a big number." So if it's not anchoring, what calculation or intuition is leading you to specifically 99% (or at least something in that ballpark), and would it similarly lead you to roughly 990,000M in the alternate language?
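To make the unit translation concrete, here's a trivial sketch (the "M" unit is just the hypothetical 1/10^6 probability unit from the thought experiment above):

```python
def to_m_units(p: float) -> float:
    # 1 M = one part in 10^6 of probability mass (hypothetical unit from above).
    return p * 10**6

for p in (0.99, 0.999999):
    print(f"{p:.6f} -> {to_m_units(p):,.0f}M")
# 0.990000 -> 990,000M
# 0.999999 -> 999,999M
```

The point being that "99%" and "990,000M" denote exactly the same credence; only the framing changes.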
My reply to Max and your first bullet both give examples of cases in the natural world where probabilities of real future events would go way outside the 0.01%–99.99% range. Conjunctions force you to have extreme confidence somewhere; the only question is where. If I try to steelman your claim, I think I end up with the idea that we should apply our extreme confidence to the things inside the product, due to correlated causes, rather than to the thing outside; does that sound fair?
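A toy calculation makes the conjunction point concrete (the 1,000-period horizon is my own illustrative assumption, not a number from this discussion):

```python
# Somewhere you must hold an extreme probability: either in each factor
# of the conjunction, or in the overall product.
n = 1000  # e.g. surviving 1,000 consecutive periods

# Refusing extreme confidence inside the product (99% survival per period)
# forces an extreme product:
print(0.99 ** n)        # ~4.3e-05, i.e. roughly a 0.004% overall probability

# Conversely, a non-extreme 50% overall probability forces extreme factors:
print(0.5 ** (1 / n))   # ~0.99931, i.e. ~99.93% confidence per period
```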
The rest I see as an attempt to justify the extreme confidences inside the product, and I'll have to think about it more. The following are gut responses:
"I'm not sure which step of this you get off the boat for"
I'm much more baseline cynical than you seem to be about people's willingness and ability to actually try, and try consistently, over a huge time period. To give some idea, I'd probably have assigned <50% probability to humanity surviving to the year 2150, and <10% for the year 3000, before I came across EA. Whether that's correct or not, I don't think it's wildly unusual among people who take climate change seriously*, and yet we almost certainly aren't doing enough to combat that as a society. This gives me little hope for dealing with <10% threats that will surely appear over the centuries, and as a result I found, and continue to find, the seemingly-baseline optimism of longtermist EA very jarring.
(Again, the above is a gut response as opposed to a reasoned claim.)
"Applying the rule of thumb for estimating lifetimes to 'the human species' rather than 'intelligent life' seems like it's doing a huge amount of work."
Yeah, Owen made a similar point, and actually I was using civilisation rather than "the human species", which is 20x shorter still. I honestly hadn't thought about intelligent life as a possible class before, and that probably is the thing from this conversation that has the most chance of changing how I think about this.
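For what it's worth, here's a minimal sketch of how much the choice of reference class moves the answer, assuming a simple Copernican-style version of the rule of thumb, P(lasting another t years | current age T) ≈ T/(T+t), and rough ages that I'm supplying myself:

```python
def p_survive(age_years: float, horizon_years: float) -> float:
    # Copernican-style rule of thumb: if "now" is a random point in the
    # class's lifetime, P(another t years | age T) ~= T / (T + t).
    return age_years / (age_years + horizon_years)

horizon = 1e6  # an assumed million-year horizon
for name, age in [("civilisation", 1e4),       # ~10,000 years old
                  ("human species", 2e5),      # ~200,000 years (20x longer)
                  ("intelligent life", 2e6)]:  # very rough, if you count genus Homo
    print(f"{name}: {p_survive(age, horizon):.1%}")
# civilisation: 1.0%
# human species: 16.7%
# intelligent life: 66.7%
```

Same rule, same horizon; the reference class alone takes you from ~1% to ~67%.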
*"The survey from the Yale Program on Climate Change Communication found that 39 percent think the odds of global warming ending the human race are at least 50 percent."