I am surprised that critical commenters have focused on the irrationality or inadequacy of financial markets, rather than what feels like the more obvious point:
Unaligned AI need not imply extinction, and aligned AI need not imply 30% growth. Financial markets can be inconsistent with these implications without being inconsistent with Big Deal AI.
On unaligned AI: eyeball the reviews of the Carlsmith report. It looks like the average P(xrisk | misalignment) ≈ 45% among reviewers.
On aligned AI: 30% growth is crazy high! The authors are unwilling to make their claims for less-crazy growth figures:
This is a valuable point, but I do think that giving real weight to a world where we have neither extinction nor 30% growth would still be an update to important views about superhuman AI. It seems like evidence against the Most Important Century thesis, for example.
An update, yeah, but how important?
I think Most Important Century still goes through if you replace extinction/TAI with “bigdealness”. In fact, bigdealness takes up considerably more of my probability mass.
To the degree that bigdealness without extinction or TAI has weaker implications for financial markets in particular, it is more consistent with the current state of those markets.
Well I think MIC relies on some sort of discontinuity this century, and when we start getting into the range of precedented growth rates, the discontinuity looks less likely.
But we might not be disagreeing much here. It seems like a plausibly important update, but I’m not sure how large.
[Edit: this is no longer applicable, sheesh stop downvoting]
Your tweets appear to be set to private, which makes the last link inaccessible.
Ah, thank you for mentioning; corrected in original comment.