Do you also think this yourself? I don't clearly see what worlds would look like where P(doom | AGI) is ambiguous in hindsight. Some major accident because everything is going too fast?
There are some things we would recognize as an AGI, but others (that we’re still worried about) are ambiguous. There are some things we would immediately recognize as ‘doom’ (like extinction) but others are more ambiguous (like those in Paul Christiano’s “what failure looks like”, or like a seemingly eternal dictatorship).
I sort of view AGI as a stand-in for powerful optimization capable of killing us in AI alignment contexts.
Yeah, I think I would count these as unambiguous in hindsight. Though siren worlds might be an exception.