Slightly hot take: Longtermist capacity/community building is pretty underdone at current margins, and retreats (focused on AI safety, longtermism, or EA) are also underinvested in. By “longtermist community building”, I mean community building aimed at longtermism itself, rather than at AI safety specifically. I’m also sympathetic to thinking that general undergrad and high school capacity building (AI safety, longtermist, or EA) is underdone, but this seems less clear-cut.
I think this underinvestment is due to a mix of mistakes on the part of Open Philanthropy (and Good Ventures)[1] and capacity building being lower status than it should be.
Here are some reasons why I think this work is good:
It’s very useful for there to be people who are actually trying really hard to do the right thing, and such people often come through these sorts of mechanisms. Another way to put this: flexible, impact-obsessed people are very useful.
Retreats make things feel much more real to participants, and result in people being more agentic and approaching their choices more effectively.
Programs like MATS are good, but they get somewhat different people at a somewhat different part of the funnel, so they don’t (fully) substitute.
A large part of why I’m writing this is to try to make this work higher status and to encourage more of this work. Consider yourself to be encouraged and/or thanked if you’re working in this space or planning to work in this space.
[1] I think these mistakes are: underfunding this work, Good Ventures being unwilling to fund some versions of this work, failing to encourage people to found useful orgs in this space, and hiring away many of the best people in this space to instead do (IMO less impactful) grantmaking.
Other (probably more important, if combined) reasons for the underinvestment:
wanting to have direct impact (i.e. risk aversion within longtermist interventions)
personal fit for founding (and specifically founding meta orgs, where impact is even harder to quantify)
not quite underfunding, but lack of funding diversity if your vision for the org differs from what OP is willing to fund at scale.
lack of founder-level talent
AI safety pretty clearly swallows longtermist community building. If we want longtermism to be built and developed, it needs to be very explicitly aimed at, not just mentioned on the side. I suspect that general EA group community building is better for this reason too: it isn’t overwhelmed by any one object-level cause/career/demographic.
Some of the arguments I make here are similar.