One quick point: divesting, while it would help a bit, wouldn’t obviously solve the problems I raise – AI safety advocates could still look like alarmists if there’s a crash, and other investments (especially crypto) would likely fall at the same time, so the effect on the funding landscape could be similar.
With divestment more broadly, it seems like a difficult question.
I share the concerns that it biases judgment and makes AI safety advocates less credible, and feel pretty worried about this.
On the other hand, if something like TAI starts to happen, then the index will go from ~5% AI companies to 50%+ AI companies. That means AI stocks will outperform the index by ~10x or more, while non-AI stocks will underperform it by ~2x or more.
So by holding the index you’d be forgoing 90%+ of future returns (in the highest-leverage scenarios), and by being fully divested you’d be giving up 95%+.
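To make the arithmetic explicit, here’s a rough sketch (the 5% and 50% weights are the illustrative figures above, and it assumes the shift in index weights comes entirely from differential price returns):

```python
# Rough sketch of the weight-shift arithmetic (illustrative figures only).
# Assumes AI companies go from 5% to 50%+ of the index after TAI, and that
# the change in weights is driven entirely by differential price returns.

ai_weight_before, ai_weight_after = 0.05, 0.50

# How much AI stocks outperform the index as a whole:
ai_vs_index = ai_weight_after / ai_weight_before                  # 10x
# How much non-AI stocks do relative to the index:
non_ai_vs_index = (1 - ai_weight_after) / (1 - ai_weight_before)  # ~0.53x, i.e. ~2x underperformance

# Fraction of the AI return you capture by holding the index:
index_capture = 1 / ai_vs_index                                   # ~10%  -> forgoing ~90%
# Fraction you capture if fully divested from AI:
divested_capture = non_ai_vs_index / ai_vs_index                  # ~5%   -> forgoing ~95%

print(f"index captures {index_capture:.0%}, divested captures {divested_capture:.0%}")
```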
So the costs are really big (far, far greater than the costs of divesting from oil companies).
Moreover, unless your p(doom) is very high, it’s plausible that a lot of the value of your funds comes from what you could do in post-TAI worlds. AI alignment isn’t the only cause to consider.
On balance, it doesn’t seem like the negatives are so large as to reduce the value of your funds by 10x in TAI worlds. But I feel uneasy about it.
I would encourage EAs to go even further against the EMH than buying AI stocks. EAs have been ahead of the curve on lots of things, so we should be able to make even better returns elsewhere, especially given how crowded AI is now. It’s worth looking at the track record of the HSEACA investing group[1], but, briefly: two cryptos I learnt about there in the last couple of years have gone up 1000x and 200x respectively (on which I’ve realised 100x and, so far in the second case, 50x gains). Lots of people also made big money shorting stock markets before the Covid crash, and there have been various other highly profitable plays and promising non-AI start-ups posted about. There are plenty of opportunities out there that are better than investing in AI, even from a purely financial perspective. More EAs should be spending time seeking them out, rather than investing in ethically questionable companies that go against their mission to prevent x-risk, and that are very unlikely to provide significant profits that are actually usable before those companies cause doom, or collapse in value once they’re regulated to prevent doom.
Would actually be great if someone did an analysis of this sometime!
It is! In fact, I think non-doom TAI worlds are highly speculative[1].
I’ve still not seen a good argument that they make up a majority of the probability space.