[Question] Does EA’s funding base have the wrong correlation structure?
The need for donations is probably highest in worlds where AGI goes badly. Pick your favorite one. These are also likely the worlds where AI equity exposure craters.
If EA donors are increasingly concentrated in vehicles that are long the AI supply chain, long frontier labs, and long downstream adoption, then EA’s funding base is highly correlated with “AGI goes smoothly” and highly anti-correlated with “we urgently need more money.”
A counterpoint: conditional on a decline in AI-related equities, the most likely world is one where AI capabilities turned out to be less impactful than expected, so the need for donations is also lower. My rebuttal is to point to the technical evidence.
What happened to mission-correlated investing?
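To see the force of the correlation argument, here is a toy expected-impact calculation (all numbers are hypothetical, chosen only to illustrate the structure): two stylized states of the world, each with a probability, an AI-equity return, and a marginal value per donated dollar. Even if AI equities have the higher expected *financial* return, a portfolio uncorrelated with AI can have higher expected *impact*, because its value holds up precisely in the state where donations matter most.

```python
# Toy illustration (all numbers hypothetical): how a donor portfolio's value
# lines up with the marginal need for donations across two stylized worlds.

# State name -> (probability, AI-equity return, marginal value per donated dollar)
states = {
    "AGI goes smoothly": (0.7, 0.50, 1.0),   # equities up, money less needed
    "AGI goes badly":    (0.3, -0.60, 3.0),  # equities crater, money urgently needed
}

def expected_impact(ai_weight: float, capital: float = 100.0) -> float:
    """Expected donation impact when `ai_weight` of capital sits in AI equities
    and the remainder in a flat-return asset, with wealth donated in each state."""
    total = 0.0
    for p, ai_ret, marginal_value in states.values():
        wealth = capital * (ai_weight * (1 + ai_ret) + (1 - ai_weight))
        total += p * wealth * marginal_value
    return total

# All-in on AI equities: higher expected return, but wealth craters exactly
# when a donated dollar is worth the most.
print(f"long AI:        {expected_impact(ai_weight=1.0):.1f}")  # 141.0
print(f"no AI exposure: {expected_impact(ai_weight=0.0):.1f}")  # 160.0
```

Under these (made-up) numbers the flat portfolio beats the long-AI one in expected impact, despite a lower expected financial return. A mission-correlated (or at least uncorrelated) allocation is effectively buying insurance against the state where the movement most needs money.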