Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org)
Greg_Colbourn
EA Hotel with free accommodation and board for two years
Funding circle aimed at slowing down AI—looking for participants
This actually goes back further, to OpenPhil funding CEA in 2017, with Nick Beckstead as the grant investigator whilst simultaneously being a Trustee of CEA (note that the history of this is now somewhat obscured, given that he later stepped down, but then stepped back up in 2021). The CoI has never been acknowledged or addressed as far as I know. I was surprised that no one seemed to have noticed this (at least publicly), so I (eventually) raised it with Max Dalton (Executive Director of CEA) in March 2021; or at least, I anonymously sent a message to his Admonymous. In hindsight, it might’ve been better to publicly post (e.g. to the EA Forum), but I was concerned about EA’s reputation being damaged, and possibly lessening the chances of my own org getting funding (perhaps I was a victim of/too under the sway of Ra?). Even now part of me is recognising that this could be seen as “kicking people when they are down”, or a betrayal, or mark me out as a troublemaker, and is causing me to pause [I’ve sat with this comment for hours; if you’re reading it, I must’ve finally pressed “submit”]. Then again, perhaps now is the right time to be airing concerns, lest they never be aired and improvements never made. This is what I sent to Max:
1. CEA: I’m surprised that no one has made anything of this, given it’s public information (or maybe they have, but not publicly?) - OpenPhil’s grants to CEA were based on very limited public reasoning, and in the original grant write-up, it was stated that they would review later, but never did (at least publicly). Also—and this is the kicker—Nick Beckstead was the grantmaker, when he was also a Trustee of CEA (and still is). So obviously a massive conflict of interest! Makes it seem like things are very nepotistic with OpenPhil and CEA. Also the EA committee of OpenPhil (https://www.openphilanthropy.org/committee-effective-altruism-support) being anonymous looks like a blatant cover for this! Seems like a double standard when considering CEA (/ EA Funds / EA in general)’s demand for rigorous justification for deciding on whether to fund things. Which is basically how the world works for most things (insiders and outsiders), but I would’ve hoped EA was better than this.
2. …[stuff about EA Global not having any public cost-effectiveness estimates or justification for spending so much; which was later addressed with a blog post that I can’t currently find]...
In general, it seems that there is very little public justification for CEA and its projects getting funding, which sets a bad example for EA, given that the existence and funding of most EA projects is justified by extensive public write-ups and analyses.
The vast bulk of funds in EA (OpenPhil and, until last week, FTX Future Fund) are controlled by very few people (financial). As is admission to EA Global (social). Intellectual direction is more open with e.g. the EA Forum, but things like big book projects and their promotion (The Precipice, WWOTF) are pretty centralised, as is media engagement in general.
Timelines are short, p(doom) is high: a global stop to frontier AI development until x-safety consensus is our only reasonable hope
AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now
AGI x-risk timelines: 10% chance (by year X) estimates should be the headline, not 50%.
It’s not about the sex in and of itself, it’s about the conflict of interest and favouritism. Romantic love interest is enough for that too. EA could probably learn a lot from how mainstream orgs deal with this.
EA Hotel Fundraiser 2: Current guests and their projects
[Question] If your AGI x-risk estimates are low, what scenarios make up the bulk of your expectations for an OK outcome?
Apply to CEEALAR to do AGI moratorium work
UN Secretary-General recognises existential threat from AI
Are you Holden? (sorry, couldn’t resist)
Why didn’t the FTX Foundation secure its bag?
I bought the hotel next door with my own money, and I’ve not spent any of the EA Hotel’s money on it. Given its relatively low cost, I see it as a decent investment largely independent of the EA Hotel (i.e. even if the EA Hotel fails I think there’s a reasonable chance property prices will go up in Blackpool in the next 5-10 years).
Perhaps in terms of maximising my positive impact it would’ve been best for me to donate the money to the EA Hotel. I think that remains to be seen though. Although I think I probably was a little over-optimistic about the funding prospects for the EA Hotel at the time, in hindsight.
(Note the timing of the purchase wasn’t ideal. It came up for auction. Strategically, I didn’t want to lose the opportunity to enable the EA Hotel to easily expand (i.e. by knocking through the wall and sharing the same kitchen, appliances, stock etc.) in the event it is successful enough to warrant it.)
I think this is sad, and doesn’t bode well for the future of the movement. I understand legal risk is scary, but we should be doing some collective soul searching as a community, with more openness and transparency in our communications (including, or perhaps especially(!), from leaders), and less Ra (arguably it was, ironically, too much worrying about prestige and PR that got us into this mess in the first place!). EDIT to add: staying silent, whilst legally safer in most cases, is also a statement (which can be interpreted in multiple ways).
To be honest, I’m at a point now where I’m putting significant weight on lying. Some evidence here that FTX bailed out Alameda for ~$4B in FTT on Sep 28th. There are the blockchain transactions (disclaimed by SBF at the time), and the resignation of a high-profile figure (President of FTX.US) the day before. (Note that whilst this doesn’t look good, it’s still inconclusive. I’m sure the truth will come out eventually.)
My response (edited from my email to Habryka)
I think I would be in favor of giving a grant that covers the runway of the hotel for the next year.
Wow this is awesome, thanks!
# Thoughts on the EA Hotel:
Thanks for your detailed response.
First, the reasons why I am excited about the EA Hotel:
Providing a safety net …
Acting on historical interest …
Building high-dedication cultures …
All good reasons, eloquently put!
1. I think the people behind the EA Hotel were initially overeager to publicize the EA Hotel via broad media outreach in things like newspapers and other media outlets with broad reach.
I think this is based on an unfortunate misconception. The whole thing with the media interest has been quite surprising to us. We have never courted the media—quite the opposite in fact. It started with The Economist approaching us. This was whilst I was on holiday and out of communication. The first I heard about it was 3 days before they went to press (the piece appeared in print whilst I was still away). The journalist was told not to come to Blackpool. I spoke to them on the phone and said I wanted more time to think about it and discuss it with people. They went ahead anyway and were told “no comment” by a resident when they knocked on the door. They picked up the story from Slate Star Codex originally and decided — whether we liked it or not — to run a piece on it. I don’t think there was anything we could’ve done to prevent it.
After that, The Times (and many other media outlets) picked it up. The Times journalist booked a call with me via my Calendly. At the exact time I was expecting the call, there was a knock on the door instead and she was there with a photographer—we got doorstepped! I had a panicked 5-minute talk “off the record” with her outside, where she explained that they were going to write about us anyway (whether we liked it or not), so I might as well let her in to interview some people so we could have at least some control over the narrative (we had none with the Economist piece).
I thought it went pretty well, and the piece could’ve been worse. However, they printed some errors, despite me sending clarifications—see the “For the record” here—which made me lose more faith in the journalistic process. It seems that even when you send corrections/clarifications they don’t factor them in if it doesn’t fit their narrative. And of course you have no right of reply (or at least no right of reply at the same level of visibility).
After The Times we had another national newspaper showing up at the door unannounced the next day. We gave them nothing, despite them being very persistent.
In the next couple of weeks we were inundated with media requests. We discussed the issue with many people in EA and at CEA and 80k and decided against embracing the media (we could’ve been on prime-time TV and radio with millions of viewers/listeners). The decision was largely based around considerations encapsulated by the fidelity model of spreading ideas and the Awareness/Inclination Model of movement growth. We have turned down something like 20 media requests since The Times. Most were in October following the initial media interest. But we still get some every now and again. The outside view from my friends and family is that I’m completely crazy not to accept any of these offers. I think it’s probably the right call for the EA movement, but I’m still not 100% sure given that there is basically no data on the impact of mass media appearances on movement growth/talent discovery for EA—as far as I can tell, there hasn’t been any since the launch of GWWC and 80k (I’m talking appearances in national/international media with millions of viewers/readers here).
To try and avoid this misconception being perpetuated, I have added a disclaimer to the media page on our website saying that we have never courted the media. Also, journalistic ethics are such that requesting the media cover you is not something you can easily do or be successful with. You can write a press release and send it out, but they don’t generally do requests (note we definitely did not post a press release, nor do anything to publicise the project really, apart from me posting my initial EA Forum piece and sharing it on a couple of EA Facebook groups).
because it communicates the presence of free resources that are for the taking of anyone vaguely associated with the community
I’ve actually been surprised at how few applicants from outside the movement we’ve had, even after the media.
2. I expect the EA Hotel to attract a kind of person who is pretty young, highly dedicated and looking for some guidance on what to do with their life. I think this makes them a particularly easy and promising target for people who tend to abuse that kind of trust relationship and who are looking for social influence.
Yes, one thing I’m wary of is anyone looking to gain too much social influence at the hotel. Note that the average age is actually reasonably high at around 28 though (range 20-40) (i.e. there are a fair few people changing the trajectory of their careers).
the EA Hotel could form a geographically and memetically isolated group that is predisposed for conflict with the rest of the EA community in a way that could result in a lot of negative-sum conflict.
I don’t think we are especially memetically isolated—most of us keep up with the EA Forum and EA Facebook groups etc. There is generally a high level of shared memetic culture/jargon etc that is general to the broader movement. Geographically, many guests have travelled to EA events in continental Europe, and have visited other UK EA hubs like London and Oxford.
3. I don’t have a sense that Greg wants to really take charge on the logistics of running the hotel, and don’t have a great candidate for someone else to run it. Though it seems pretty plausible that we could find someone to run it if we invest some time into finding someone.
Yes, it’s not something that I want to do long term (although I have been doing a lot). And it’s taking a lot longer than I initially thought it would to get things fully set up (especially setting up a charity and fundraising). There are two main aspects to the job really—logistics and guest mentoring/vetting. Currently one of the guests is taking on most of the cooking/dishes/food monitor work, and we have a rota for weekends (this could potentially be outsourced with more funds available). And we have a cleaner doing the cleaning/room changes. (Interim Manager) Toon has been doing check-ins with guests to discuss their work. He’s only been working on the hotel part time though (he also runs RAISE) and is leaving in a couple of months. We haven’t been able to start the process for hiring a full-time manager to take over due to funding insecurity. Would be great if you could help us find someone, thanks!
It made it to the White House Press Briefing. This clip is like something straight out of the film Don’t Look Up. Really hope that the ending is better (i.e. the warning is actually heeded).
Sven Rone should’ve won a prize in the Red Teaming contest[1]:
The Effective Altruism movement is not above conflicts of interest
[published Sep 1st 2022]
(Note that this issue was commented on here a month ago.) This whole thing is now starting to look like the classic “ends justify the means” criticism of Utilitarianism writ large :(
although it looks like it wasn’t actually entered? Edit: it was, but not posted as a top-level post on the EA Forum (see comments below).