Should our EA residential program prioritize structured programming or open-ended residencies?
You can always host structured programs, perhaps on a regular cycle, but doing so to the exclusion of open-ended residencies seems to be giving up much of the counterfactual value the hotel provided. It seems like a strong overcommitment to a concern about AI doom in the next low-single-digit years, which remains (rightly IMO) a niche belief even in the EA world, despite heavy selection within the community for it.
Having said that, to some degree it sounds like you’ll need to follow the funding, and prioritise keeping operations running. If that funding is likely to be conditional on a short-term AI safety focus then you can always shift focus if the world doesn’t end in 2027 - though I would strive to avoid being long-term locked into that particular view.
[ETA] I’m not sure the poll is going to give you very meaningful results. I’m at approximately the opposite end of it from @Chris Leong, but his answer sounds largely consistent with mine, just with a different emotional emphasis.
Thanks for the thoughtful comments, Arepo. A few clarifications:
We’re definitely not abandoning open-ended residencies. We’re trying to find the right balance between open residency and structured programming. That’s exactly why we’re polling: we don’t have the answer yet and want community input as one data point among many.
On AI safety focus: I think there’s a misread here. We’re not narrowing to short-timelines AI doom. Our scope is AI safety, reducing global catastrophic risks, and remaining roughly cause-neutral but EA-aligned. We’re following where both talent and high-impact opportunities are concentrated, not locking ourselves into a single timeline view.
You’re right that we need to align with funding realities to keep operations running, but we’re actively working to avoid being locked into any single cause area. The goal is to remain responsive to what the ecosystem needs as things evolve. That’s why we’re doing very rudimentary market research via the poll above, reaching end users directly.
Oh, I think AI safety is very important; short-term AI safety too though not quite 2027 😂.
Knock-off MATS could produce a good amount of value, I just want the EA hotel to be even more ambitious.