Nice post, but my rough take is there are a few complications:
it’s relatively common for markets to be inefficient but unexploitable; trading on “everyone dies” seems a clear case of hard-to-exploit inefficiency
markets are not magic; the impacts of one-off events with complex consequences are difficult to price in, and all the magical market aggregation boils down to a bunch of human brains doing the trades. E.g., I was able to beat the market and get an n-times return at a point when markets were insane about covid; later, I talked about it with someone at one of the giant hedge funds, and the simple explanation is that, while they were looking into it, at some point I knew more about covid than they were able to assemble
an example of such a hard-to-predict event is the capabilities and impact of a specific model
the dichotomy of 30% growth vs. everyone dies is unrealistic for trading purposes
in the near term, there are various intermediate outcomes like “industry X gets disrupted”, “someone loses their job to automation”, or “war”
if you anticipate fears of this type dominating in the next 10 years, you should price in many people increasing their savings and borrowing less
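The “hard to exploit” point above can be made concrete with a toy expected-value calculation. This is a minimal sketch with purely hypothetical numbers (the 10x doom payoff, the 1.3x index payoff, and the 95% doom probability are all illustrative assumptions, not figures from the thread):

```python
# Hedged sketch: why a bet on "everyone dies" is hard to exploit.
# All numbers below are illustrative assumptions.

def expected_usable_payoff(p_doom, payoff_if_doom, payoff_if_not, usable_in_doom=0.0):
    """Expected payoff you can actually spend: winnings collected in
    extinction worlds are discounted by how usable money is there (~0)."""
    return (p_doom * payoff_if_doom * usable_in_doom
            + (1 - p_doom) * payoff_if_not)

# A hypothetical trade that pays 10x only if doom happens, vs. holding
# an index that pays 1.3x (the "30% growth" branch) if doom does not.
doom_bet = expected_usable_payoff(p_doom=0.95, payoff_if_doom=10.0, payoff_if_not=0.0)
hold_index = expected_usable_payoff(p_doom=0.95, payoff_if_doom=0.0, payoff_if_not=1.3)
# doom_bet ≈ 0.0, hold_index ≈ 0.065: even an extreme doomer's tradable
# edge collapses, because the winnings only exist in worlds where money
# is worthless.
```

Under these assumptions, even someone at 95% doom prefers holding the index in usable-payoff terms, which is one way to see why the inefficiency goes unexploited.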
I think there are a ton of Transformative AI scenarios where not “everyone dies”. I think many AI Safety researchers currently expect less than a 40% chance of everyone dying.
I also really have a hard time imagining many financial traders actually seriously believing:
1. Transformative AI is likely to happen
2. It’s very likely to kill everyone, conditional on happening. (95%++)
Both of those are radical right now. You need to believe (1) to believe we’re likely doomed soon.
I haven’t seen any evidence of people with money seriously discussing (2).
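The conjunction of (1) and (2) can be spelled out as a one-line joint probability. A toy sketch, with both input probabilities chosen purely for illustration:

```python
# Toy joint-probability check (illustrative numbers only): a trader must
# hold BOTH beliefs for "likely doomed soon" to follow.
p_tai_soon = 0.5          # hypothetical: P(transformative AI happens soon)
p_doom_given_tai = 0.95   # hypothetical: P(everyone dies | TAI), per (2)
p_doom_soon = p_tai_soon * p_doom_given_tai
print(p_doom_soon)  # → 0.475
```

Even a trader at 95% on (2) who is only 50/50 on (1) lands below 50% on doom, which is why both beliefs need to be radical for the trade to make sense.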
Sounds like I’d like a hedge fund to write the news for me (after they trade on it, no problem; but they must have great teams doing the analysis).