Thanks for writing this. I think it would be good if there were at least some EA investment in climate change so that a) we gain a better understanding of the issue, b) we are in a better position to shift resources in this direction if we receive evidence that it is likely to be worse than we expect, and c) we gain the opportunity to spread EA ideas into the climate change movement.
I’m not proposing a huge amount of investment, but I’d love to see at least some.
I agree with the commenters that it is worth keeping in mind that some of the political pressure may in fact be correct, but I also feel that this post is valuable because it highlights the kinds of pressures we are subject to.
I think, for good or ill, EA is much more vulnerable to pressure from the left, because the institutions we interface with and the locations where most EAs are based lean that way.
To what extent do you think people are born that way, and to what extent do you think they become that way? If they are mostly born that way, how do we get such people involved? And if they become that way, how do we make that happen?
I think this is an excellent idea and that someone should pursue it. I’m sure plenty of people have considered paying someone else to manage the negotiation for them before, but the risk is always that the fees outweigh the increase in wage you would have gotten negotiating by yourself. Here, since the money is going to charity, this risk is much less of a concern: even if you don’t earn a single extra dollar, the money is still doing something and the world has improved.
I’m actually donating to the Patreon, but here are the arguments against it that I find most persuasive:
One argument I’ve heard raised is that the EA hotel is a rather expensive way of testing the idea of supporting EAs with low-cost living. Perhaps it would have been better to start with a smaller-scale experiment, such as a group house, and funding the EA hotel is too costly a way of learning about the potential of such projects.
Another is that the EA hotel should be more selective about who it admits in order to achieve a sufficient expected return, rather than setting the very low bar it currently does. Some people may believe that the current approach is unlikely to be cost-effective and that the hotel as currently structured is therefore testing the wrong thing. In this case, spending a few hundred thousand pounds on informational value could be seen as wasteful. Worse, we can imagine that after such a failure, funders would be extremely reluctant to fund a similar project that was more selective. In that case, the thing we’d want to test might never actually be tested.
A third possibility is that people might not want to donate because they don’t believe that other people will donate. Suppose you believe the hotel needs to run for at least another year before it can build up the kind of track record needed to be sustainable, and you have the option to donate one month’s worth of funding. Donating one month’s worth of operating expenses should allow the hotel to do one month’s worth of good regardless of whether it later collapses, so perhaps this concern is irrelevant.
However, there are two ways in which you may be trying to leverage your donation to have more than just direct impact. Firstly, if the hotel survives to the point where it builds up a track record that justifies others funding it, counterfactual value is generated to the extent that the hotel is better than the other opportunities available to those funders, and by allowing this opportunity to exist, you get to claim part of that value. Secondly, we can imagine extreme success scenarios where the hotel turns out to be so successful that the EA community decides to copy the concept around the world. Again, you could claim partial responsibility for this.
But the key point is that if you think other funders won’t be forthcoming, you’ll miss out on these highly leveraged scenarios. And if these are the reasons you’d want to fund the hotel, you might decide it’s best to fund something else instead.
This is an interesting idea, but I’m skeptical, as I think it underestimates the difficulties of co-ordination. GiveWell has had difficulty with volunteers due to unreliability. Another datapoint is the shift in .IMPACT (now Rethink Charity) from relying on volunteers to relying on paid staff. Volunteer hierarchical organisations will be hit doubly hard by these issues, as they rely on volunteers for both management and object-level work. I would love to be proven wrong though.
I’m not saying everyone should go into this, just that a portion should.
Impact investing to encourage companies to do more on AI Safety is a particularly fascinating idea. I’m curious how much your influence depends on the number of shares. Obviously if you own 20% of a company you’re likely to be heard, but is there much difference between owning 1 share vs. 100?
I would suggest that the presence of so much talent makes projects like the EA hotel more vital, since they increase the amount of talent that can be deployed.
Yeah, InIn was the main attempt at this. Gleb was able to get a large number of articles published in news sources, but at the cost of quality. Some people felt that this would make people perceive rationality negatively, as well as drawing in people from the wrong demographic. I think he was improving over time, but perhaps too slowly?
PS. Have you seen this? https://www.clearerthinking.org
I would be surprised to see much activity on a comment on a three month old thread. If you want to pursue this, I’d suggest writing a new post. Good luck, I’d love to see someone pursuing this project!
What was the issue with the anti-debate?
I think this is an interesting idea, but the EA hotel is currently struggling to obtain sufficient funding, so I wouldn’t see it as beneficial to spin up a second long-term residential program while a very similar experiment is already in progress. Anyone who would want to be a student in such a school can already apply to the EA hotel and learn the skills they need. It’s more self-directed than your idea, but I suspect self-direction is a vital skill for most EA jobs.
That sounds like a much better approach, as it requires far fewer resources to be committed upfront.
Does the ~£265 figure take into account rent from those who are paying it?
“In general it’s probably best not to anonymize applications. Field studies generally show no effect on interview selection, and sometimes even show a negative effect (which has also been seen in the lab).” It seems strange to mention this and then not address the obvious implication that one might draw from it.
The other point is that these practices are analysed as though they don’t have tradeoffs, when they almost always do. I suppose discussing this would make the document even longer than it already is, but you have listed these as “recommendations” as opposed to “possible approaches”.
I would distinguish between poor journalism and not taking a very EA perspective; we shouldn’t conflate the two. It’s worth noting that Future Perfect is inspired by EA, rather than being an EA publication. I also think it’s important for the journalists to be able to share their own perspectives even when they disagree with the EA consensus. That said, some articles I’ve seen have been overly ideological or unnecessarily polarising, and that worries me.