I believe your linked text for existential catastrophe (in the second table) is broken: I get a page-not-found error.
Thanks, fixed.
Substantively, I realize this probably isn’t something you originally asked for (nor am I asking for it now, since presumably it would take a bunch of time), but I’d be super curious to see what kind of uncertainty estimates folks would put on this, and what combining the estimates using those uncertainties might look like. If you have some intuition about what those intervals look like, that would be interesting.
The reason I’m curious about this is probably fairly transparent: given the pretty extensive uncertainty in the broader community on this topic, aggregating using those uncertainties might yield a different point estimate, but more importantly it might help people understand the problem better by making visible the large degree of uncertainty involved. For example, it’d be interesting/useful to see how much probability people put outside a 10-90% range.
Good question! Perhaps we can include better uncertainty information in a future post at some point. For now, regarding my personal uncertainty, I’ll quote what I wrote about a week ago:
[Regarding chance of misaligned takeover by 2100] I’m going to stick to 35% for now, but it’s a very tough question and I could see ending up at anywhere between 10-90% on further reflection and discussion.
[Regarding chance of TAI by 2100] Mashing all these intuitions together gives me a best guess of 80%, though I think I could end up at anywhere between 60% and 90% on further reflection and discussion.
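As a rough illustration of the kind of aggregation being asked about, here is a minimal sketch, assuming each answer is reduced to a point estimate plus a range read as a 10-90% interval and fitted to a Beta distribution by moment matching. The helper name and the normal-width approximation for the interval are illustrative assumptions, not anything from the post; the two example rows just reuse the figures quoted above.

```python
from scipy import stats

def beta_from_estimate(mean, lo, hi):
    """Fit a Beta distribution whose mean is the point estimate, treating the
    reported lo-hi range as roughly a 10-90% interval (about +/-1.28 standard
    deviations under a crude normal approximation). Illustrative only."""
    std = (hi - lo) / (2 * 1.2816)
    var = min(std ** 2, 0.95 * mean * (1 - mean))  # keep Beta parameters valid
    common = mean * (1 - mean) / var - 1
    return stats.beta(a=mean * common, b=(1 - mean) * common)

# Point estimates and "could end up anywhere between" ranges quoted above;
# additional respondents would just be extra rows here.
forecasts = {
    "misaligned takeover by 2100": (0.35, 0.10, 0.90),
    "TAI by 2100": (0.80, 0.60, 0.90),
}

fitted = {name: beta_from_estimate(*vals) for name, vals in forecasts.items()}
for name, dist in fitted.items():
    outside = dist.cdf(0.10) + dist.sf(0.90)
    print(f"{name}: mean={dist.mean():.2f}, "
          f"probability mass outside 10-90% = {outside:.2f}")

# With several respondents per question, an equal-weight mixture of the fitted
# distributions gives a pooled view: its mean is the average of the individual
# means, and its tail mass shows how much weight the group collectively puts
# outside a 10-90% range.
```

This is just one way to do it; how you map a verbal "I could end up anywhere between X and Y" onto a distribution, and whether you pool by mixing distributions or by averaging point estimates, both materially affect the aggregate.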