I think this is an interesting idea, but the EA Hotel is currently struggling to obtain sufficient funding, so I wouldn’t see it as beneficial to start up a second long-term residential program while a very similar experiment is already in progress. Anyone who wants to be a student in such a school can already apply to the EA Hotel and learn the skills they need. It’s more self-directed than your idea, but I suspect self-direction is a vital skill for most EA jobs.
That sounds like a much better approach, as it requires far fewer resources to be committed upfront.
Does the ~£265 figure take into account rent from those who are paying it?
“In general it’s probably best not to anonymize applications. Field studies generally show no effect on interview selection, and sometimes even show a negative effect (which has also been seen in the lab)”—It seems strange to mention this and then not even address the obvious implication that one might draw from this.
The other point is that these practices are analysed as though they have no tradeoffs, when they almost always do. I suppose discussing this would make the document even longer than it is, but you have listed these as “recommendations” as opposed to “possible approaches”.
I would distinguish between poor journalism and not taking a very EA perspective. We shouldn’t conflate the two. It’s worth noting that Future Perfect is inspired by EA, rather than an EA publication. I also think it’s important for the journalists writing to be able to share their own perspective even if it disagrees with the EA consensus. That said, some articles I’ve seen have been overly ideological or unnecessarily polarising and that worries me.
How much was the grant and from which organisation?
I have a few other ideas that aren’t listed here:
EA Climate Change Co-ordination—a reasonable number of EAs are interested in climate change, but nothing seems to have happened in this domain partly due to a lack of co-ordination
EA Leadership program—leadership programs are very popular among students and so this could be a good (if high effort) way of spreading EA ideas
Regarding career workshops, instead of an organisation trying to run this itself, some group could create a training course for people who want to give career advice
A group to perform rationality outreach (spreading rationality ideas generally seems good for EA, as the rationality community produces new ideas for EA and also provides a source of potential recruits)
A project to aggregate ideas on different topics. It could get a bunch of EAs together to brainstorm and also allow public submissions. All of the ideas could then be written up into a post like this one
Media outreach—most EAs are quite negative on media outreach, but I can see value in writing articles that specifically aim to correct misperceptions about both EA and AI
A project to produce personal outreach materials. Some groups (like evangelical Christianity) focus heavily on growing through personal outreach. You don’t want people to end up being pushy, but this is probably worth investigating more
Niche outreach was mentioned, but I expect niche outreach to Christians could be especially valuable
Religious organisations often send people to different cities in order to engage in movement building; this might be something EA would want to experiment with as well
I think that a lot of the potential value comes from the connections that people make, though this is probably lessened to an extent by the hotel being so fully booked, as that makes it harder for someone to join someone else’s project after finishing their own.
It actually doesn’t make a difference in terms of expected value
“Where it really used to be the case that nanotechnology was… atomically precise manufacturing was regarded as one of the existential risks. And I think people just converged on thinking that actually that argument was very much overblown.”—Why is it considered overblown?
I’m curious why parts of the old website were removed, instead of being left up while the redevelopment occurred.
Enough of the possibility space remains unexplored that it seems worth at least experimenting to see whether a professional organiser could add value, even if only by better advertising and promoting the opportunities that already exist, especially now that CEA is increasing its focus on one-on-one meetups.
“Also many of the things community builders are doing in other places make much less sense in Berkeley”—Could you clarify?
I’m surprised there isn’t a dedicated organiser in the Bay Area, as that seems to be one of the regions that would be most receptive.
Replace “growing” the rationality community with “developing” the rationality community. But that’s a good point: it is worthwhile keeping in mind that the two are separate. I imagine one of the first tasks of such a group would be figuring out what this actually means.
The size of the rationality community hasn’t been limited so much by quality concerns as by the lack of effort expended on growth.
I was referring specifically to growing the rationality community as a cause area.
“In terms of web-traffic and general intellectual influence among the intellectual elite, the sequences as well as HPMOR and Scott Alexander’s writing have attracted significant attention and readership, and mostly continue doing so”—I was talking more about academia than the blogosphere. Here, only AI safety has had reasonable penetration. EA has had several heavyweights in philosophy, plus FHI for a while and also now GPI.