This is a valuable point, but I do think that giving real weight to a world where we have neither extinction nor 30% growth would still be an update to important views about superhuman AI. It seems like evidence against the Most Important Century thesis, for example.
An update, yeah, but how important?
I think Most Important Century still goes through if you replace extinction/TAI with “bigdealness”. In fact, bigdealness takes up considerably more of the probability space for me.
To the degree that bigdealness-without-extinction/TAI has weaker implications for financial markets in particular, it is more consistent with the current state of those markets.
Well, I think MIC relies on some sort of discontinuity this century, and once growth rates fall within the precedented range, a discontinuity looks less likely.
But we might not be disagreeing much here. It seems like a plausibly important update; I’m just not sure how large.