$100 Prize to Best Argument Against Donating to the EA Hotel
[Full disclosure: I’ve previously booked a stay in the EA Hotel later in the year and had this post reviewed by the organizers before posting, but otherwise I’m not affiliated and do not speak for the Hotel or its residents and seem to have somewhat different priors on some counts. Throughout this post, I’ll be referring to the case they’ve made for themselves so far here: 1, 2, 3. I take their data for granted, but not necessarily their conclusions.]
Summary: Comment below with reasons not to donate to the EA Hotel, and whichever gets the most upvotes earns $100 from me.
The Meta Level
As regular Forum readers know, the EA Hotel was first established and posted about almost a year ago to substantial (mostly positive) reception. Now, it seems to be fully functioning, with its rooms fully booked with ~20 residents working on EA projects. The only issue is, it’s running out of funding, according to the organizers (emphasis theirs):
We are getting critically low on runway. Our current shortfall is ~£5k/month from May onward. We will have to start giving current guests notice in 1 month.
I am personally surprised that the Hotel’s funding stream has been so dry, given the substantial enthusiasm it has received, both on this Forum and on EA social media. Evidently, I’m not the only one who’s confused and curious about this. When I try to model why this could be, one central observation sticks out:
Most of those excited about the Hotel are likely prospective residents. Conditional on someone being excited to work on their own (EA-related) thing for a while without having to worry about rent, chances are they don’t have much runway. This implies they are unlikely to have enough money to be major donors.
Under that assumption, the class of “people excited about the EA Hotel” may be something of a filter bubble. Except also an actual bubble, since the border is hard to see from certain angles.
With that framing, I can think of these plausible reasons for the discrepancy between the Hotel’s funding situation and the level of armchair enthusiasm:
A) There are good reasons to think the Hotel is low expected value (EV), and these reasons are generally understood by those who aren’t starry-eyed about free rent.
B) Outside the bubble, opinions of the Hotel are generally lukewarm. Unlike in (A), there aren’t compelling reasons against it, just not enough compelling reasons for it to warrant funding. Presumably, this also implies some active skepticism about the case the Hotel’s been making for itself (1, 2, 3).
C) The evidence indicates the Hotel is high EV for more or less the reasons that have been laid out by its organizers, but most major donors have not engaged with that very much.
Or, as always, it could be some combination of (A-C). But also, my basic framing could be wrong, and maybe there’s some other reason I’m not thinking of. Either way, I am curious about this, and feel like I would have a better model of how EA funding works in general if I understood this puzzle.
With that in mind, I would like to solicit the best argument(s) against donating to the EA Hotel, so I hereby offer $100 from my pocket to whoever in the comments gives the best such argument.
This will be judged simply by the number of upvotes on any comments posted here within exactly one week of the timestamp on this post. Feel free to use the comments section for other stuff, but only comments that contain an explicit argument against donating to the EA Hotel will be considered for the prize. To verify I’m a real person who will in fact award $100, find me on FB here.
Also, feel free to leave comments from an anonymous account. If you win, you will have to message me from that account to confirm who you are. It might also be necessary to message a trusted third party to verify the transaction went through, but hopefully this will still reduce the social disincentives to posting criticism. For instance, I give my general impression of the current residents below. Opining that they’re worse than that is socially costly, so I want to allow space to air those opinions explicitly if they exist. That said, I think most of the useful criticism I can imagine is not socially costly, so I don’t want to encourage everyone to post anonymously.
The Object Level
Here I’d like to review the skepticisms of the Hotel that I have seen so far, and why I don’t find these completely satisfactory. I only intend this as inspiration for more refined critiques, and I absolutely welcome comments that take a different line of argument than those below.
In large part, there have been general worries about who the Hotel is likely to attract. As one of the top comments on the original Hotel post last year put it:
the hotel could become a hub for everyone who doesn’t study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality. I’m not saying I’m confident this will happen, but I think the chance is non-trivial without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).
Furthermore, people have repeatedly brought up the argument that the first “bad” EA project in each area can do more harm than an additional “good” EA project, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse.
Now, I certainly take the risk of net-negative projects seriously, but I don’t see much reason to think the Hotel will lead to these. Reading over the most comprehensive article the community has on the subject (to my knowledge), most of these risks tend to arise from at least one of: a) unilateralism/lack of feedback, b) unfamiliarity with EA and its norms, c) unfamiliarity with the specific field of research, and d) what I will bluntly call general incompetence/stupidity.
Under the counterfactual of the Hotel’s nonexistence, I’d guess most of the residents would only work on their projects by themselves part-time or not at all. Compared to that, the Hotel seems pretty much neutral on (c) but, I would speculate, actually helps with (a) and (b), since it acts much like an EA org in the way members can get easy feedback from other residents on the potential risks of their project. Obviously, the concern here is with (d), because the residents can be expected to be somewhat less smart/competent than those who’ve cleared the bar at EA orgs. Still, my impression from the profiles of the residents is that they’re competent enough that (a) more than counteracts (d). Allow me to make these intuitions more explicit.
Suppose that, on some level of general competence, Alice is 95th percentile among EAs on the Forum and is working on her own EA project independently, while Bob is of 30th percentile competence and is working on his project while socially immersed in his many in-person EA contacts. I am significantly more worried about downside risk from Alice’s project than Bob’s. The reason is that, in a given field, many of these downside risks are very hard or near-impossible to envision ahead of time, even if you’re really smart and cautious. However, once these domain-specific pitfalls are pointed out to you, it’s not that cognitively taxing to grok them and adjust your thinking/actions accordingly. My guess is that 30th percentile competence is enough to do this without major issue, while 95th percentile is only enough for some of the envisioning (this certainly varies wildly by field). In my estimation, the former is about my lower bound for the general competence of the current residents (most seem to be at least 50th percentile). Hence I see relatively little to worry about regarding downside risks vis-à-vis the Hotel.
However, I look forward to seeing my reasoning here questioned, and updating my model of downside risks.
But the general concern here was not downside risks specifically; it was that the average competence of the residents may make it unlikely that much successful work gets done. Currently, the most well-thought-out Hotel critique I know of is this comment from a couple months ago, which notes that relatively little successful work has (apparently) come out of the Hotel so far:
I don’t take this (apparent) absence of evidence to be a surprising or adverse signal. Among many reasons: the hotel has only been around for 8 months or so, and many projects wouldn’t be expected to be producing promising early results in this time; there are natural incentives that push against offering rough or unpolished work for public scrutiny (e.g. few PhD students—myself included—would be keen on presenting ‘what they’ve done so far’ at the 6m mark for public scrutiny); many ex ante worthwhile projects (e.g. skill building career development) may only have generally noisy and long delayed ex post confirmation.
Yet this also means there isn’t much to shift one’s priors. My own (which I think are often shared, particularly by those in EA in a position to make larger donations) are fairly autumnal: that a lot of ‘EA ideas’ are very hard to accomplish (and for some delicate areas have tricky pitfalls to navigate) even for highly motivated people, and so I’m more excited about signals of exceptional ability than exceptional commitment (cf. selectiveness, talent constraint, etc. etc.)
I understand the thinking behind the hotel takes a different view: that there is a lot of potential energy among committed EAs to make important contributions but cannot afford to devote themselves to it (perhaps due to mistakes among funders like insufficient risk-appetite, too ingroupy, exclusive in ways orthogonal to expected value, or whatever else). Thus a cheap ‘launch pad’ for these people can bring a lot of value.
If this is right, and I am wrong, I’d like to know sooner rather than later. Yet until I am corrected, the hotel doesn’t look really promising in first order terms, and the collective ‘value of information’ budget may not extend into the six figures.
Before commenting further, let me just say this is very well-put.
But still, after the wave of posts/discussions on this forum triggered by:
After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation
I sense there have been some general updates around the topic of “selectiveness”, such that while the priors mentioned in that comment may be as true as ever, I feel they now have to be more explicitly argued for.
At least, I think it’s fair to say that while almost everyone who meets the hiring standards of EA orgs is quite competent, there is a very high false negative rate. So what happens to the relatively large number of committed, highly competent EAs who can’t get EA jobs? I certainly hope most either earn to give or pursue PhDs, but what about those who are best-suited to direct work/research yet, for whatever reason, aren’t suited for (or wouldn’t benefit much from) a PhD?
Let D be this demographic: committed EAs who can’t get an EA job, are best fit for direct work/research, but not a good fit for academia (at least right now). Quite frankly, D certainly contains many EAs who likely aren’t “good enough” to be very impactful. But let E be the subset of D that is quite competent. My intuitions say that E is still a substantial demographic, because of the aforementioned false negative rate (and the fact that PhDs aren’t for everyone, even in research).
But even if that’s true, that doesn’t mean we should necessarily go out of our way to let the members of E work on their projects. By definition, this set is hard to filter for, and so there probably isn’t a way to reach them without also reaching the much larger number of less competent look-alikes in D. And if the inevitable costs associated with this are too high, then we as a community should be able to openly say “No, this isn’t worth it in EV.”
With that said, my intuitions still say the EV for the Hotel seems worth it. Very roughly speaking, the question seems to be whether $1 of research purchased from the Hotel is worth as much as $1 of research purchased from an EA org.
This isn’t actually right: for nuances, see both the Addendum below and the Hotel’s own EV calculation. Worse, I will fabricate a number for the sake of discussion (but please let me know a good estimate for its actual value): the average salary at an EA org.
It costs about £6,000 ($7,900) to fund a resident at the Hotel, so let’s round and suppose it costs £60,000 ($79,000) to hire someone at a random EA org (the Hotel’s residents seem to mostly do research, and research positions get paid more, so hopefully that number isn’t too nutty).
Then the question is (roughly) whether, given £60,000, it makes more sense to fund 1 researcher who’s cleared the EA hiring bar, or 10 who haven’t (and are in D).
(Note: We shouldn’t quite expect residents of the Hotel to just be random members of D. For instance, there’s an extra filter for someone willing to transplant to Blackpool: either they have no major responsibilities where they live or are committed enough to drop them. I think this implicit filter is a modest plus to the Hotel, while the other differences with D don’t add up to much, but there’s certainly room to argue otherwise).
It’s well-known here that top performers do orders of magnitude more to advance their field than the median, and I will almost always take 1 superb researcher over 10 mediocre ones. But the point here is the EV of 10 random members of D: if you think a random EA there has a probability p > 10% of being as competent as an employed EA researcher, and you believe my arguments above that the other 9 are unlikely to be net-negative, then the EV works out in the Hotel’s favor. But if your subjective value of p is much less than 10%, then the other 9 probably won’t add all that much.
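To make the breakeven explicit, here is a toy version of the calculation above. The 10-residents-per-researcher ratio, the all-or-nothing value model (a resident is either as valuable as a vetted researcher or contributes roughly nothing), and the assumption of no net-negative projects are all the post's own simplifications; the numbers are illustrative, not estimates.

```python
# Toy EV comparison: £60,000 funds either 1 researcher who cleared the
# EA hiring bar, or 10 residents drawn from pool D. Value is measured
# in units of "one vetted researcher's output".

def ev_hotel(p, n_residents=10):
    """Expected output of funding n_residents, where each independently
    has probability p of matching a vetted researcher, and the rest are
    assumed to contribute ~0 (and not be net-negative)."""
    return n_residents * p

def ev_org():
    """Expected output of hiring 1 researcher past the EA hiring bar."""
    return 1.0

for p in (0.05, 0.10, 0.20):
    better = "Hotel" if ev_hotel(p) > ev_org() else "EA org"
    print(f"p = {p:.0%}: EV(Hotel) = {ev_hotel(p):.1f} -> favors {better}")
```

Under this model the crux is exactly whether your subjective p clears 10%: below it, concentrating the money in one vetted hire wins; above it, the Hotel's portfolio wins.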
So what’s your p? I feel like this may be an important crux, or maybe I’m modeling this the wrong way. Either way I’d like to know. Also, I emphasize again the above paragraphs are embarrassingly oversimplified, but again that is just intended as a jumping-off point. For a more detailed/rigorous analysis, see the Hotel’s own.
Addendum: What precisely counts as an argument against donating?
When I first wanted to specify this, it seemed natural to say it’s any argument against the proposition:
$1 to the EA Hotel has at least as much EV as $1 to any of the usual EA organizations (e.g. FHI, MIRI, ACE, etc.)
And if you’re less of a pedant than me, read no further.
But this doesn’t quite work. For one, $1 might not be a good number since economies of scale may be involved. The Hotel is asking for £130,000 (~$172,000) to get 18 months runway, and presumably it would be better to have that up-front than on a week-to-week basis, due to the financial security of the residents etc. But I don’t know how much this matters.
The other problem is, this allows an argument of the form “organization X is really effective because of the work on topic Y they are doing”. Since the EA Hotel has a decently well-rounded portfolio of EA projects (albeit with some skew toward AI safety), the more relevant comparison would be more like $1 spread across multiple orgs, or better yet across the major cause-neutral meta-orgs.
But I’m not even sure it’s right to compare with major orgs (even though the Hotel organizers do in their own EV analysis). This is because the mantra “EA isn’t funding constrained” is true in the sense that all the major orgs seem to have little problem reaching their funding targets these days (correct me if this is too sweeping a generalization). But it’s false in the sense that there are plenty of smaller orgs/projects that struggle to get funding, even though some of them seem to be worth it. Since the role of an EA donor is to find and vet these projects, the relevant comparison for the Hotel would seem to be the collection of other small (but credible) projects that OpenPhil skipped over. For this purpose, good reference classes seem to be:
1) The list of grantees for EA Meta Funds, listed at the bottom of this page.
2) The list of grantees for the first round of EA grants, listed here.
With that in mind, I believe the specific proposition I would like to see critiqued is:
$172,000 to the EA Hotel has at least as much EV as $172,000 distributed randomly to grantees from (1) or (2)
Speaking for why I haven’t donated, this is close to the key question:
>Then the question is (roughly) whether, given £60,000, it makes more sense to fund 1 researcher who’s cleared the EA hiring bar, or 10 who haven’t (and are in D).
My intuition has been that if those 10 are chosen at random, then I’m moderately confident that it’s better to fund the 1 well-vetted researcher.
EA is talent-constrained in the sense that it needs more people like Nick Bostrom or Eric Drexler, but much less in the sense of needing more people who are average EAs to do direct EA work.
I’ve done some angel investing in startups. I initially took an approach of trying to fund anyone who had a good idea. But that worked poorly, and I’ve shifted, as good VCs advise, to looking for signs of unusual competence in founders. (Alas, I still don’t have much reason to think I’m good at angel investing.) And evaluating founders’ competence feels harder than evaluating a business idea, so I’m not willing to do it very often.
I use a similar approach with donating to early-stage charities, expecting to see many teams with decent ideas, but expecting the top 5% to be more than 10 times as valuable as the average. And I’m reluctant to evaluate more pre-track-record projects than I’m already doing.
With the hotel, I see a bunch of little hints that it’s not worth my time to attempt an in-depth evaluation of the hotel’s leaders. E.g. the focus on low rent, which seems like a popular meme among average and below-average EAs in the Bay Area, yet the EAs whose judgment I most respect act as if rent is a relatively small issue.
I can imagine that the hotel attracts better than random EAs, but it’s also easy to imagine that it selects mainly for people who aren’t good enough to belong at a top EA organization.
Halffull has produced a better argument for the EA Hotel, but I find it somewhat odd that he starts with arguments that seem weak to me, and only in the middle does he get around to claims that are relevant to whether the hotel is better than a random group of EAs.
Also, if donors fund any charity that has a good idea, I’m a bit concerned that that will attract a larger number of low-quality projects, much like the quality of startups declined near the peak of the dot-com bubble, when investors threw money at startups without much regard for competence.
Startup founders are one possible reference class, but another possible reference class is researchers. People have proposed random funding for research proposals above a certain quality threshold:
People like Nick Bostrom and Eric Drexler are late in their careers, and they’ve had a lot of time to earn your respect and accumulate professional accolades. They find it easy to get funding and paying high rent is not a big issue for them. Given the amount of influence they have, it’s probably worthwhile for them to live in a major intellectual hub and take advantage of the networking opportunities that come with it.
I think a focus on funding established researchers can impede progress. Max Planck said that science advances one funeral at a time. I happen to think Nick Bostrom is wrong about some important stuff, but I’m not nearly as established as Bostrom and I don’t have the stature for people to take me as seriously when I make that claim.
Throwing small amounts of money at loads of startups is Y Combinator’s business model.
I think part of why Y Combinator is so successful is because funding so many startups has allowed them to build a big dataset for what factors do & don’t predict success. Maybe this could become part of the EA Hotel’s mission as well.
I’m unimpressed by the arguments for random funding of research proposals. The problems with research funding are mostly due to poor incentives, rather than people being unable to do much better than random guessing. EA organizations don’t have ideal incentives, and may be on the path to unreasonable risk-aversion, but they still have a fairly sophisticated set of donors setting their incentives, and don’t yet appear to be particularly risk-averse or credential-oriented.
Unless something has changed in the last few years, there are still plenty of startups with plausible ideas that don’t get funded by Y Combinator or anything similar. Y Combinator clearly evaluates a lot more startups than I’m willing or able to evaluate, but it’s not obvious that they’re being less selective than I am about which ones they fund.
I mentioned Nick Bostrom and Eric Drexler because they’re widely recognized as competent. I didn’t mean to imply that we should focus more funding on people who are that well known—they do not seem to be funding constrained now.
Let me add some examples of funding I’ve done that better characterize what I’m aiming for in charitable donations (at the cost of being harder for many people to evaluate):
My largest donations so far have been to CFAR, starting in early 2013, when their track record was rather weak, and almost unknown outside of people who had attended their workshops. That was based largely on impressions of Anna Salamon that I got by interacting with her (for reasons that were only marginally related to EA goals).
Another example is Aubrey de Grey. I donated to the Methuselah Mouse Prize for several years starting in 2003, when Aubrey had approximately no relevant credentials beyond having given a good speech at the Foresight Institute and a similar paper on his little-known website.
Also, I respected Nick Bostrom and Eric Drexler fairly early in their careers. Not enough to donate to their charitable organizations at their very beginning (I wasn’t actively looking for effective charities before I heard of GiveWell). But enough that I bought and read their first books, primarily because I expected them to be thoughtful writers.
I think the EA hotel is trying to do something different from Y Combinator; Y Combinator is much more like EA Grants. Y Combinator basically plays the game of: get status and connections, increase deal flow, and then choose from the cream of the crop.
It’s useful to have something like that, but a game of “use tight feedback loops to find diamonds in the rough” seems to be useful as well. Using both strategies is more effective than just one.
Relevant: When should EAs allocate funding randomly? An inconclusive literature review.
Good idea. It will be somewhat tricky since we don’t have the luxury of measuring success in monetary terms, but we should certainly brainstorm about this at some point.
This seems very wrong to me. I work at Founders Pledge in London, and I doubt a single one of the staff there would disagree with a proposition like ‘the magnitude of London rents has a profound effect on my lifestyle’.
They also pay substantially closer to market rate salaries now than they did for the first 2-3 years of existence, during which people no doubt would have been far more sympathetic to the claim.
Thank you.
Your post suggests that there is some class of EAs that is a lot more competent than everyone else, which means that what everyone else is doing doesn’t matter all that much. While I haven’t met (or recognized) a lot of people who impress me this much, I still give this idea a lot of credence. I’d like to verify it for myself, to get on the same page with you (and perhaps even change my plans). Could you name some examples, besides Drexler and Bostrom, of EAs who are at this level of competence?
I’m not looking for credentials, I’m looking for resources that demonstrate how these people are thinking, or stories about impressive feats, so I can convince my S1 to sit down and be humble (and model their minds so I can copy the good bits).
Podcasts, maybe?
>which means that what everyone else is doing doesn’t matter all that much
Earning to give still matters a moderate amount. That’s mostly what I’m doing. I’m saying that the average EA should start with the outside view that they can’t do better than earning to give, and then attempt some more difficult analysis to figure out how they compare to average.
And it’s presumably possible to matter more than the average earning to give EA, by devoting above-average thought to vetting new charities.
The post is organized by dependency, not by strength of argument. First people have to be convinced that funding projects makes sense at all (given that there’s so much grant money already in EA) before we can talk about the way in which to fund them.
I agree with Brendon that the Hotel should charge the tenants, and the tenants should seek their own funding.
If I was contemplating donating to the Hotel, the decision would hinge almost entirely on who is at the hotel and what they are working on. Moreover, I expect I would almost certainly want to tie my donation to a specific tenant/group of tenants, because I wouldn’t a priori expect all of them to be good donation targets.
At this point, why would I not just fund the specific person directly? Better yet, why would I not donate to the EA Funds/CEA and let professional grant-makers sift through the tenants’ personal applications?
When I look at the current guest list, it’s just very short, general introductory paragraphs. Surely you wouldn’t expect a grant-maker to make a funding decision based on these.
The Hotel itself is a cool idea which makes sense: create a transient EA hub somewhere where land is cheap. I love it. I am in fact one of those people who were excited about the project when it was first announced.
But in the current model, you cannot separate funding the Hotel from funding the specific people who stay there, and potential donors just don’t have enough information about those people to confidently fund them.
I think this gets to the big flaw in the current appeal from a design perspective:
1) The idea of the hotel is too new and cannot demonstrate impact on an aggregate scale (unlike, say, cash transfers) in an easy-to-understand way.
2) Therefore people look for specific examples of what people are doing at the hotel to reassure them of the impact.
3) But as there are numerically few residents so far, and the first residents had little competition to be accepted, many are not seen as competitive with what funders would independently decide to fund, so they don’t make the hotel look good (at least as they have been presented).
4) Therefore the pitch is either that the hotel will continue to attract stronger applicants and/or get better at selecting good funding opportunities in the future, so that the staff play some valuable intermediary role for the ultimate funders. But a) no evidence has been given as to why the hotel staff are likely to be good at that role, and b) trust has already been destroyed by the fact that funders now are looking at the first “lowest quality” residents and not finding them attractive.
The best thing by far imo the hotel could do is present the current and past residents in a way which explains why they have done valuable things at the hotel and why they could not have done them without the hotel. I don’t think they have done this well at all but my knowledge of a few of the residents suggests that there are good stories to tell here. Hopefully the planned post on this will give the hotel an injection of new interest from funders.
For more on our residents to date, and how the hotel has helped them, here are some case studies. See also: outputs.
I think this is unfair. I have not had any potential funders specifically say this. I actually think that the quality of residents is already pretty high (and this is under-appreciated from the outside). I also disagree with the notion of the first residents necessarily being of low quality just by way of not having strong competition to be accepted; there might even be a “debut album effect” in that there is a glut of good people (“songs”) ready for the initial release. I know I’ve personally been pleasantly surprised more than once with guests whom I was a bit unsure of originally but was happy to take a chance on. There is also the consideration that it takes a certain amount of pioneer spirit to join a new project and community early in its life, and that this is positively correlated with agency and, via that, value creation.
See our pitch doc (that we’ve recently circulated to interested people) for more detail on our Staff/Trustees. Ultimately, most of the grant makers in charge of the EA Funds have gained their experience through being actively involved in EA and making funding decisions (donations) over many years. We are not that different. See my record of EA donations here*. Also, as Vipul says, we have skin in the game here.
*”EA Hotel” is the money I’ve put in as of 29th March 2019, not including purchasing the building. Meta note: would be good if it was possible to make pages on app.effectivealtruism.org/dashboard/pledge/donations publicly viewable. Is this feature planned?
I think this view as presented has an overly narrow focus. In terms of thinking of the expected value of the hotel and whether it’s worth funding on the margin, it’s useful to also consider:
The benefits of the in-person community in terms of support, feedback, motivation, productivity, collaboration, networking.
All the potential future value from future guests and iterating, expanding and franchising the model.
The effect that failure from a lack of funding would have on the likelihood of similar initiatives being started in the future.
The notion of Hits-based Giving.
Also note that the counterfactuals—EA Grants and EA Meta Fund—have not had assessments of their outputs performed, or at least made public (apart from the larger more established projects they have funded). Indeed—someone correct me if I’m wrong—we don’t even know who the recipients are of any of the EA Grants made after Fall 2017 (but then again, EA Grants in and of itself does not publicly accept donations, so does not need to be so transparent). Also: our costs per person are significantly lower than those of the average EA Grants or Meta Fund grantee. You need to factor this multiplier in when judging the relative merits of them vs. the EA Hotel.
In terms of picking and choosing people to fund, this is not readily possible without some kind of aggregation of applicants. Such aggregation projects have been proposed here and here, but not without controversy (see comments on those posts). The hotel has the effect of aggregating a selection of people and projects starting out in EA, and in principle would-be funders can offer to pay for the costs of individual people or projects hosted at the hotel (or even offer to further fund them at higher rates to expand their projects elsewhere). But this does require the hotel to continue to exist (or some other kind of aggregator to take its place). Another way of looking at it would be by way of analogy to venture capital. By funding the EA Hotel you are buying the portfolio of a VC firm (one that is taking a hits-based giving approach); by picking individual projects you are doing the work of an angel investor. The latter takes significantly more time and work. (NB this is different to ordinary investing in that we need to be concerned about anti-unicorns whilst on the lookout for unicorns.)
The controversy around aggregators revolves around increasing the risk from considerations relating to the “unilateralist’s curse”, i.e. potentially net-negative projects would have more of a chance of being funded if given a wider audience of potential funders. I think this is one reason why EA Grants don’t widely share their applications. The hotel guards against this somewhat by having a committee of trustees (and soon-to-be-added external advisors) involved with overseeing the vetting process. There is also the commitment filter of having people physically move to Blackpool, rather than just taking the money. And there is the on-hand in-person community to get advice and feedback from.
The following arguments are ideas and have not been thoroughly researched. They may not reflect my actual views. Counterarguments are not mentioned because the OP is “mainly interested in seeing critiques.” I may post counterarguments after the reward deadline has passed.
Claim to argue against: “$172,000 to the EA Hotel has at least as much EV as $172,000 distributed randomly to grantees from EA Meta Fund grantees or EA grants grantees.”
Argument 1: The EA Hotel has a low counterfactually-adjusted impact
In this post, the EA Hotel states:
This datapoint supports the view that most EA Hotel residents would be doing the same work whether or not they stay at the hotel. The claim that “the hotel allows them to do, on average, 2.2 times more EA work” could be incorrect. To gain more certainty about this, the EA Hotel should track what applicants who are not accepted actually end up doing instead.
EA Hotel residents have many options to consider to do the same work while not staying at the hotel. For example, depending on the time and location requirements of the work, they could do some combination of: (1) part-time work to finance their living expenses, (2) living with parents, friends, or another location with near-zero living expenses, or (3) living in very low-cost housing that resembles the cost of the EA Hotel.
If someone pursues option (2), staying at the EA Hotel is negative EV, since they could choose a free option instead of the Hotel, which consumes community funds.
If someone pursues options (1) and (3), they might only have to work a very limited amount of time. For example, I believe I recently heard of someone who was able to find a one-bedroom living arrangement in Berkeley, CA in a large house for $500 a month, although they have to share a bathroom with many people. So someone might only need to do paid work 25% of the time and can do EA work 75% of the time. This suggests that the “2.2 times more EA work” figure greatly overstates the benefit of the EA Hotel in terms of reducing living expenses. Pursuing options (1) and (3) seems to be feasible for the vast majority of people.
If direct funding allows people to pursue option (3) and secure low-cost housing, and if the cost is around the same as the EA Hotel, there may be no need for the EA Hotel itself to exist. The question becomes what is the counterfactually-adjusted impact of funding living expenses at the EA Hotel compared to option (3)? Adjustments should be made for things like missing out on the benefits of living elsewhere than Blackpool as well as relocation time and expenses which would further reduce counterfactual impact. The EA Hotel community certainly provides benefits, although coworking out of REACH may provide similar benefits.
Argument 2: The EA Hotel should charge users directly instead of raising funding
Rather than fundraising from EAs, the hotel should try to directly charge people who are benefiting from their services and community, which is an argument against donating to the hotel.
There doesn’t seem to be a need to fund people who can already afford the hotel. It’s not clear what proportion of people fall under this category, but considering that it only takes about 13 weeks of full-time work at $15/hour to cover the $7,900 cost of a one-year stay at the hotel, it is possible that the majority of residents can already afford to stay there.
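As a quick sanity check on that arithmetic (assuming a 40-hour work week, which the comment doesn’t state explicitly):

```python
# Weeks of $15/hour work needed to cover a one-year stay at the
# EA Hotel, assuming a 40-hour work week.
hourly_wage = 15.0      # $/hour
hours_per_week = 40
annual_cost = 7900.0    # stated cost of a one-year stay

weekly_earnings = hourly_wage * hours_per_week   # $600/week
weeks_needed = annual_cost / weekly_earnings

print(round(weeks_needed, 1))  # -> 13.2
```

So the “13 weeks” figure holds up to within a day or two of work.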
For people who cannot afford the EA Hotel, applicants to funding organizations like EA Grants can include that they are requesting funding for living expenses and indicate EA Hotel expenses as part of their requested grant funding. EA Grants evaluators and other funders may be better equipped to evaluate the EV of projects people are working on as opposed to EA Hotel staff. If EA Grants can already cover this, there is no need to donate to the EA Hotel.
Argument 3: Funding projects has a higher impact than funding living expenses
I assume that EA Grants funds applicants’ project expenses as well as their personal salary and living expenses. This could be higher impact than solely funding living expenses. Working at the EA Hotel with an unfunded project may be quite unproductive, particularly if the project requires funding to get anywhere. Seeking early-stage EA project funding also seems to involve waiting for long periods of time (perhaps months) for funders to get back to you, rather than being something one can accelerate by working on it full-time.
Argument 4: People should not donate to the EA Hotel until they improve their impact metrics and reporting
The EV estimation for the EA Hotel is highly mathematical and commenters have expressed that it is difficult to follow. Actual impact reporting appears to consist of testimonials which are hard to evaluate. It’s even trickier to evaluate the counterfactually-adjusted impact.
1. The same work, yes, but at a much slower rate (factoring in the need to work outside jobs to support themselves; the haste consideration might also come into play), or while burning runway at a faster rate than the EA Hotel is spending on them (if you take the view that conserving EA money in general is a plus, then this is a win). Regarding your point (3) as stated in the original blog post, we are open to funding people’s living expenses in other places if they are comparable to what they are for hotel guests. No one has yet taken us up on this, though. It also requires a high level of trust that people won’t just spend more money somewhere else and therefore work for a shorter amount of time for the same cost.
As you mention, the additional community benefits are likely to be considerable. Co-working out of REACH might give some of them, but is also likely to be a lot more expensive, and there isn’t that much space at REACH for people to do this. Also living together generally allows for a much greater degree of social interaction and a more close-knit supportive community.
2. This cost saving for all involved (EA Grants, EA Grantees) requires the hotel to exist, though. And it won’t if it doesn’t bring in enough early-stage funding to become more established. Also, it’s worth considering the fact that it’s unlikely the hotel would’ve got off the ground at all had we charged everyone from the outset.
Note that we request people who have salaries or >2 years of runway to pay cost price.
3. What project expenses are you imagining that don’t include salary? (I ask because a salary is usually spent on living costs, and if living costs are provided, the need for one is a lot lower.) The hotel allows for the cheap granting—and purchasing—of runway for EA projects.
4. See here and here for a start.
The EV post might look like a lot of scary maths on the face of it, but it’s really not that complicated. It’s just an attempt to explicitly factor out the different considerations that go into estimating the EV of the EA Hotel. However, I understand that gut feelings tend to dominate such estimates. It can be useful to sanity check your gut instincts by doing such an explicit calculation, but equally it’s the mathematical model that often needs adjusting (or throwing out) if it doesn’t give you an answer close to your intuitive estimate.
Crossposting my comment from Facebook. Full disclosure: I work at the EA Hotel.
“Here’s the article Greg mentions, and I think that it’s the best argument for being lukewarm about the EA Hotel: https://www.effectivealtruism.org/articles/ea-neoliberal/
The tldr is that the neoliberals managed to change the world to adopt their ideology chiefly through convincing academia. This isn’t just some random hypothesis: they claimed that this was how to do it, *and then they did it*. Academia is mostly influenced through weird things like prestige and respectability, and therefore the success of the EA movement hinges on the impression that it makes, which hinges not on its total output as much as the *average* quality of its organisations. It seems likely that the EA hotel would push this average down.
(one can argue against this, for example by disagreeing that academia sets the overton window, or by arguing that ideology is just as likely to “trickle up” from the masses as it is likely to trickle down, or generally that there are other ways to set the overton window. The neoliberal story is compelling though, and one could claim that the same thing happened with social justice recently)”
I think this concern would become much less of an issue if the EA Hotel didn’t have “EA” in its name.
We could just start calling it the Athena Hotel. That also disambiguates if additional hotels are opened in the future.
I’m actually donating to the Patreon, but here are the arguments against that are most persuasive to me:
One argument I’ve heard raised is that the EA Hotel is a rather expensive way of testing the idea of supporting EAs with low-cost living. Perhaps it would have been better to start with a smaller-scale experiment, such as a group house, and funding the EA Hotel may be too costly a way of learning about the potential of such projects.
Another is that the EA Hotel should be more selective about who it admits, rather than keeping its current very low bar, in order to achieve a sufficient expected return. Some people may believe that the current approach is unlikely to be cost-effective and that the hotel as it is currently structured is therefore testing the wrong thing. In this case, spending a few hundred thousand pounds on informational value could be seen as wasteful. Worse, we can imagine that after such a failure, funders would be extremely reluctant to fund a similar project that was more selective. In this case, the thing that we’d want to test might never actually be tested.
A third argument is that people might not want to donate because they don’t believe that other people will donate. Let’s suppose that you believe the hotel needs to run for at least another year before it could build up the kind of track record for it to be sustainable, and you have the option to donate one month’s worth of funding. It seems that donating one month’s worth of operating expenses might allow the hotel to do one month’s worth of good regardless of whether it later collapses or not, so perhaps this is irrelevant.
However, there may be two ways in which you may be trying to leverage your donation to have more than just direct impact. Firstly, if the hotel survives to the point where it builds up a track record to justify for others to fund it, counterfactual value is generated to the extent that the hotel is better than the other opportunities available to those funders. And by allowing this opportunity to exist, you would get to claim part of this value. Secondly, we can imagine extreme success scenarios where the hotel turned out to be so successful that the EA community decided to copy the concept around the world. Again, you could claim partial responsibility for this.
But the key point is that if you think other funders won’t be forthcoming, you’ll miss out on these highly leveraged scenarios. And if these are the reasons you’d want to fund the hotel, you might decide it’s best to fund something else instead.
I don’t think a small group house would’ve generated community (or interest) on the same level. What we have seems more valuable on account of its scale than, say, 4 separate group houses with the same number of total residents.
In the scheme of things, I don’t think it’s been that costly (~£60k spent to date by donors).
Depends on what your counterfactual is. My initial thoughts were along the lines of “is the EV of funding this person’s work likely to be greater than that of donating the money to a GiveWell top charity?”. We are currently working on implementing a rating system for projects. I have suggested that, space permitting, we set the bar to clear at “equivalent to donating the money to GiveDirectly”. The bar would be raised in proportion to how little free capacity the hotel has, although in principle the hotel has the potential to expand, given available buildings on our street. Obviously, making such judgements comes with large error bars and a heavy weighting of priors. Also, given this is hits-based giving, I’m hopeful that the long-term mean value of projects will be significantly above the entry bar.
We have considered that perhaps a Kickstarter-like mechanism is needed here. However, given the recent interest of a few people in donating at the 4-figure level, I’m more optimistic that we can get by without it (although it might be useful for other new projects in the EA space that require a significant initial outlay, the added bonus being the social proof).
And it looks like the prize goes to PeterMcCluskey’s comment, which at the current time has 33 votes, the next highest being a tie with 21.
I’m going to break your rules a bit and instead start by critiquing the proposition:
$X to the EA Hotel has at least as much EV as $X to the most promising person at the Hotel.
It may not be easy to fund individuals, but if someone wanted to give, say, $10,000, what’s to stop them from looking at the Hotel’s guest list, picking the best-sounding project, and offering money directly to the person behind it? (Then, if that person doesn’t need/want the money, they move to the next-best person, and so on.)
This may burn time on vetting, but it’s at least easier than vetting everyone at the Hotel to get a sense for its average impact.
--
You could also try to estimate the Hotel’s value as a tool for creating networks—boosting research productivity by giving people an easier way to start conversations and help with one another’s work. If that’s the case, the comparison to EA Meta grantees becomes more apt.
That said, there are a lot of Meta grantees, and trying to find the “best” of them is difficult by any measure. So people may end up wanting to fund organizations with longer histories (like LEAN or The Life You Can Save), or organizations with an extremely good “best-case” scenario (like the Center for Election Science or Sparrow). It’s hard to think of which sub-factor the EA Hotel is “best at” compared to all those other organizations.
Just to give one example: For $10,000, I could fund Giving Games where several hundred people are introduced to EA and make their first “EA-aligned” donation, or pay for 1.3 years of EA Hotel time. Those are very different things, and I could imagine at least 50% of potential meta donors thinking that the first option is better.
If the rest of those donors then compare the Hotel to the next project on the list, and the next… well, there aren’t many people who make large donations to individual meta projects, and it’s not surprising if only a small fraction of that already-small pool lands on the Hotel as their “final answer”.
(This model is too simple, of course, since many donors give to multiple organizations. The most important point is that the Hotel has a lot of competition, and may not stand out enough compared to all the other options.)
--
I work for CEA, but these views are my own.
In answer to your first point, see my reply to Moses here.
--
Regarding your second point—comparing with other Meta opportunities—one might also want to consider that many Meta projects have goals of bringing more people into the community. We need to then cater for these people. The EA Hotel can help do that, and fill an important gap.
I agree being immersed is important because risks are hard to anticipate for a single individual. I would argue that the scenario seems somewhat artificial, as general competence of someone not interacting with EAs is unlikely to be in the 95th percentile.
I agree. However, this is not really about skill or intelligence: Humans in general often don’t take critical feedback nearly as seriously as they should, and often don’t adjust their thinking/actions due to sunk costs, wanting to save face in their peer group, grandiose personality, etc. This also applies to EAs (maybe somewhat but not vastly less so).
From looking at the published list of EA Hotel residents, I tentatively think some people’s work might come with high downside risk, while others have high upside potential and seem worth supporting. I’m not sure how this balances out. Discussing individual projects in public seems difficult, which is maybe part of the reason why people find the arguments against funding the EA Hotel unconvincing. All else equal, I’d probably prefer something like Aaron Gertler’s approach of “looking at the Hotel’s guest list, picking the best-sounding project, and offering money directly to the person behind it.” I have also shared some thoughts for how to design the admission process with EA Hotel staff.
(If one accepted the premise that downside risk is prevalent and significant, one could argue that any donation to the EA Hotel that doesn’t set incentives to reduce downside risk might counterfactually replace a donation that does. I’m not sure this argument works, but it could be worth thinking about.)
(All my personal opinion, not speaking for anyone here.)
Edited to add: In many ways, the EA Hotel acts like a de facto EA grantmaker, so the concerns outlined in my comment here apply:
Agree that this can be tough (from experience). I would add that it can be emotionally draining, especially if the feedback is uncharitable or based on misunderstanding or factual error. It can be further complicated if, after reflection, one still doesn’t fully agree with the feedback and there is a genuine philosophical disagreement. (NB I’m happy to have received feedback from Jonas Vollmer and think it has made, and will make, the EA Hotel project stronger; “uncharitable or based on misunderstanding or factual error” does not apply to his feedback.)
This does require the hotel to exist though (or something like it). See my comment here.
Based on some initial ideas from Jonas, we are working on a rating system for applicants and ongoing hosted projects. Tentatively it might be something like a logarithmic scale of EV, {-5,+5} with +1 = giving the money to GiveDirectly*. Trustees/Manager in one anonymous pool, Advisors in another. Bayesian priors stated in words. 95% confidence intervals given. Another round of scoring after seeing others’ input and discussion (special care taken to discuss when ratings <=-1 are given). Final scores aggregated. Guests accepted if clearing a bar of +1 (to increase with diminishing capacity). If falling below, guests have 3 months to pivot/improve.
*Would be interested in comparing with any numerical schemes other EA grantmakers are using.
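A minimal sketch of how the pooled scoring described above might work. The function name, the simple-mean aggregation rule, and the example scores are all illustrative assumptions on my part, not the actual implementation:

```python
# Hypothetical sketch of the applicant/project rating scheme:
# scores on a -5..+5 log-EV scale, +1 ~= "as good as giving the
# money to GiveDirectly", with two anonymous rating pools.
from statistics import mean

ACCEPT_BAR = 1.0   # accept if aggregate score clears +1
FLAG_BELOW = -1.0  # any rating <= -1 triggers extra discussion

def aggregate(trustee_scores, advisor_scores):
    """Pool the two anonymous rating groups and return a decision."""
    all_scores = list(trustee_scores) + list(advisor_scores)
    score = mean(all_scores)
    return {
        "score": score,
        "accept": score >= ACCEPT_BAR,
        "discuss": any(s <= FLAG_BELOW for s in all_scores),
    }

# Example: one advisor flags possible downside risk, so the project
# is accepted but still marked for discussion.
result = aggregate(trustee_scores=[2, 1, 3], advisor_scores=[-1, 2])
print(result)  # -> {'score': 1.4, 'accept': True, 'discuss': True}
```

A simple mean is just one possible aggregation rule; given the hits-based framing, something that down-weights outliers less (or weights negative tails more, to guard against anti-unicorns) might be preferable.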
I agree. The changes you’re making seem great! I also like the concise description.
(Will get back on some of the details via email, e.g., not sure 95% CIs are worth the effort.)
(Strong upvoted.)
FYI people are allowed/encouraged to defend the Hotel here, but I’m mainly interested in seeing critiques so that is what I’m financially incentivizing. I don’t personally intend to get into the object-level any more than I did above (unless asked to clarify something).
This is awesome – thank you for doing it!
The link appears to be broken.
(my interest here is in finding/popularizing ways for users of this forum to easily prove their identity to other users in case they wish to).
I’m guessing it’s because it links to a post in the EA Hotel Facebook group, which is a closed group. You can join here.
Now, I have both the intelligence and attention span of a door hinge, so forgive me if I’m missing something obvious, but I’m not at all convinced that the counterfactual would be working on their problems in solitude.
What exactly does a hotel in Blackburn provide, that couldn’t be provided much cheaper in other ways?
I assume most EAs would be living in larger cities if not at the EA Hotel. Whichever city they would be living in would gain a lot of value from having their presence. Why couldn’t they move in together with other EAs from the same city and achieve a social situation roughly as good?
For all the problems the EA Hotel sets out to solve, I don’t understand why we couldn’t solve them without the need for a large, expensive hotel in Blackburn of all places.
It’s Blackpool, and it’s a cheap hotel (£130k for 17 bedrooms; £6k/person/year all inclusive—accommodation, food, bills, cooking, cleaning, stipend, management). It would be more expensive for EAs to live together in other cities. Also harder to have such a large community in such close proximity. And a big part of the EA Hotel project is providing free living for people unable to fund themselves to study/research/work on start-ups.
Relatedly: there is already reasonable infrastructure (with more being built) oriented towards getting EAs to live in a few hub cities.
This is good, but it leaves open an alternate path (living in a cheap place, not optimized for being near silicon-valley money or Oxford respectability), that is currently very underexplored.
I wouldn’t be convinced either, but we interviewed our guests, and 15 out of 20 were already doing the same work before taking up residence at the hotel. They were either working part-time or burning through runway.