If the only scenario for AI were “the economy grows at more or less normal rates, then foom,” I think Eliezer would be correct. However, I think a likely scenario is accelerated growth, coincident with accelerated risk (via e.g. AI terrorism or wars), before foom. This cluster of outcomes may even be modal. In that case, interest rates would very likely rise, and traders would be able to profit, before foom.
Put another way, we often place substantial weight on non-foom good and bad AI scenarios, even if foom is a risk. The market seems to be ruling out non-foom AI risk as well as non-foom AI growth before foom. Eliezer is correct that the market may not be ruling out the (IMO unlikely) scenario of “no AI-induced growth or risk prior to foom, then foom.”
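To make “traders would be able to profit” concrete, here is a minimal sketch (mine, not from the comment) of the standard zero-coupon bond arithmetic: if long-run real rates rise as markets price in accelerated growth or risk, long-duration bonds fall in price, so a trader who is short them gains. The 2% and 5% rate levels and the 30-year maturity are illustrative assumptions, not anything claimed above.

```python
# Toy bond-pricing sketch: a zero-coupon bond paying $1 in T years is
# priced at 1 / (1 + r)**T, so a rise in rates before foom means a
# fall in price that a short position captures.

def zero_coupon_price(rate: float, years: float) -> float:
    """Price today of a bond paying $1 in `years`, discounted at `rate`."""
    return 1.0 / (1.0 + rate) ** years

maturity = 30        # long duration, where rate moves bite hardest (assumed)
r_before = 0.02      # assumed rate in the "normal growth" regime
r_after = 0.05       # assumed rate once accelerated growth/risk is priced in

p_before = zero_coupon_price(r_before, maturity)  # ~0.552
p_after = zero_coupon_price(r_after, maturity)    # ~0.231

print(f"price at {r_before:.0%}: {p_before:.3f}")
print(f"price at {r_after:.0%}: {p_after:.3f}")
print(f"short seller's gain: {(p_before - p_after) / p_before:.1%}")  # ~58%
```

So even a pre-foom repricing of a few percentage points in real rates is a very large move for long bonds, which is why the argument treats it as a trade, not just a forecast.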
People disagreeing, would you say why?
My guess at the pushback: you think that, one week before the end of the world, a sizable part of the population will notice and change its economic behavior drastically. I imagine that scenario involves a slow “attack” by AI that everyone sees coming?
(agree vote = yeah that is the pushback)