It seems like the obvious problem with this is that identifying the best investment opportunities is hard.
More specifically, I think EA really shines at identifying problems that nobody else cares about or is trying to solve (eg evaluating charity cost-effectiveness, improving the long-term future). It makes sense that there would be low-hanging fruit there for a competent altruist, because most of the world doesn't care about those causes and isn't trying, so there's no reason to expect the low-hanging fruit to already have been plucked.
Investment, on the other hand, gives EAs no such edge. The desire to make a lot of money is near universal, so a huge amount of optimisation power is already going into finding the best sources of returns, and you should expect the best investment opportunities to have already been taken. I can't see any clear edge for EAs here.
Arguably EAs have an edge in caring an unusual amount about long time horizons? So I could believe there are neglected investment opportunities that aren't great in the short term but look excellent over 10+ year horizons, and I'd be excited to see more thought in that direction. Plenty of other people care about this area too, but most investors work on shorter time horizons, so I can believe there are mispricings. It would definitely require looking for things that aren't also obviously good ideas in the short term, though (ie not the Medallion Fund).
Long time-horizon institutions like university endowments, pension funds etc might be interesting places to look for examples of what good strategies here look like.
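A toy sketch of why horizon could matter here (the numbers and rates are purely hypothetical, just to illustrate the mechanism): under exponential discounting, an investor who effectively applies a high discount rate values a long-dated payoff at far less than a patient investor does, which is one way a mispricing on 10+ year assets could persist.

```python
# Toy illustration with made-up numbers: the present value of a single
# $100 payoff arriving 15 years from now, under different annual
# discount rates. A short-horizon investor behaves like someone with a
# high effective discount rate, so long-dated payoffs look nearly
# worthless to them, and a patient investor may find them underpriced.

def present_value(payoff: float, rate: float, years: int) -> float:
    """Discount a future payoff back to today at a constant annual rate."""
    return payoff / (1 + rate) ** years

payoff, years = 100.0, 15
for rate in (0.05, 0.10, 0.20):
    print(f"discount rate {rate:.0%}: PV = ${present_value(payoff, rate, years):.2f}")
```

The gap between the 5% and 20% valuations is the kind of wedge a patient, long-horizon investor would be hoping to exploit; none of this says such opportunities actually exist, only why horizon differences could in principle leave them unplucked.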
It also seems plausible that an EA worldview isn't fully priced into markets yet: eg, if you believe there's a realistic chance of transformative AI in the next few decades, tech/hardware companies might be relatively underpriced. More generally, GCRs like climate change, antibiotic resistance, risks of great power war, and artificial pandemics might not be sufficiently priced in? (I'd have put natural pandemics on that list, but that's probably priced in now.)