If the only scenario for AI were: economic growth continues at more or less normal levels, then foom, I think Eliezer would be correct. However, I think a likely scenario is accelerated growth, coincident with accelerated risk (via, e.g., AI terrorism or wars), before foom. This cluster of outcomes may even be modal. In that case, interest rates would very likely rise, and traders would be able to profit, before foom.
Put another way, we often place substantial weight on non-foom AI scenarios, both good and bad, even if foom is a risk. The market seems to be ruling out non-foom AI risk as well as non-foom AI growth before foom. Eliezer is correct that the market may not be ruling out the (IMO unlikely) scenario of "no AI-induced growth or risk prior to foom, then foom."