Quick, non-exhaustive list of places where a few strategic, dedicated, and ambitious altruists could make a significant dent within a year (because, rn, EA is significantly dropping the ball).
Improving the media, China stuff, increasing altruism, moral circle expansion, AI mass movement stuff, frontier AI lab insider coordination (within and among labs), politics in and outside the US, building up compute infrastructure outside the US, security stuff, EA/longtermist/School for Moral Ambition/other field building, getting more HNW people into EA, etc.
(List originally shared with me by a friend)
Will’s list from his recent post has good candidates too:
AI character
AI welfare / digital minds
the economic and political rights of AIs
AI-driven persuasion and epistemic disruption
AI for better reasoning, decision-making and coordination
the risk of (AI-enabled) human coups
democracy preservation
gradual disempowerment
biorisk
space governance
s-risks
macrostrategy
meta
What do you think is causing the ball to be dropped?
A lack of sufficiently strategic, dedicated, and ambitious altruists. Deference to authority figures in the EA community when people should be thinking more independently. Suboptimal status and funding allocation, etc.
Would be curious to hear more. I’m interested in doing more independent projects in the near future but am not sure whether they’d be feasible.
What’s “SMA” in this context?
Sorry, School for Moral Ambition