Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org)
Greg_Colbourn
This actually goes back further, to OpenPhil funding CEA in 2017, with Nick Beckstead as the grant investigator whilst simultaneously being a Trustee of CEA (note that the history of this is now somewhat obscured, given that he later stepped down, but then stepped back up in 2021). The CoI has never been acknowledged or addressed as far as I know. I was surprised that no one seemed to have noticed this (at least publicly), so I (eventually) raised it with Max Dalton (Executive Director of CEA) in March 2021 - at least I anonymously sent a message to his Admonymous. In hindsight, it might’ve been better to publicly post (e.g. to the EA Forum), but I was concerned about EA’s reputation being damaged, and possibly lessening the chances of my own org getting funding (perhaps I was a victim of/too in sway to Ra?). Even now part of me is recognising that this could be seen as “kicking people when they are down”, or a betrayal, or mark me out as a troublemaker, and is causing me to pause [I’ve sat with this comment for hours; if you’re reading it, I must’ve finally pressed “submit”]. Then again, perhaps now is the right time to be airing concerns, lest they never be aired and improvements never made. This is what I sent to Max:
1. CEA: I’m surprised that no one has made anything of this, given it’s public information (or maybe they have, but not publicly?) - OpenPhil’s grants to CEA were based on very limited public reasoning, and in the original grant write-up, it was stated that they would review later, but never did (at least publicly). Also—and this is the kicker—Nick Beckstead was the grantmaker, when he was also a Trustee of CEA (and still is). So obviously a massive conflict of interest! Makes it seem like things are very nepotistic with OpenPhil and CEA. Also the EA committee of OpenPhil (https://www.openphilanthropy.org/committee-effective-altruism-support) being anonymous looks like a blatant cover for this! Seems like a double standard when considering CEA (/ EA Funds / EA in general)’s demand for rigorous justification for deciding on whether to fund things. Which is basically how the world works for most things (insiders and outsiders), but I would’ve hoped EA was better than this.
2. …[stuff about EA Global not having any public cost-effectiveness estimates or justification for spending so much; which was later addressed with a blog post that I can’t currently find]...
In general, it seems that there is very little public justification for CEA and its projects getting funding. Which is a bad example for EA, given that the existence and funding of most EA projects is justified by extensive public write-ups and analyses.
The vast bulk of funds in EA (OpenPhil and, until last week, FTX Future Fund) are controlled by very few people (financial). As is admission to EA Global (social). Intellectual direction is more open with e.g. the EA Forum, but things like big book projects and their promotion (The Precipice, WWOTF) are pretty centralised, as is media engagement in general.
It’s not about the sex in and of itself, it’s about the conflict of interest and favouritism. Romantic love interest is enough for that too. EA could probably learn a lot from how mainstream orgs deal with this.
Are you Holden? (sorry, couldn’t resist)
I bought the hotel next door with my own money, and I’ve not spent any of the EA Hotel’s money on it. Given its relatively low cost, I see it as a decent investment largely independent of the EA Hotel (i.e. even if the EA Hotel fails I think there’s a reasonable chance property prices will go up in Blackpool in the next 5-10 years).
Perhaps in terms of maximising my positive impact it would’ve been best for me to donate the money to the EA Hotel. I think that remains to be seen though. Although I think I probably was a little over-optimistic about the funding prospects for the EA Hotel at the time, in hindsight.
(Note the timing of the purchase wasn’t ideal. It came up for auction. Strategically, I didn’t want to lose the opportunity to enable the EA Hotel to easily expand (i.e. through knocking through the wall and using all same resources in terms of kitchen, appliances, stock etc) in the event it is successful enough to warrant it.)
I think this is sad, and doesn’t bode well for the future of the movement. I understand legal risk is scary, but we should be doing some collective soul searching as a community, with more openness and transparency in our communications (including—or perhaps especially(!) -- from leaders), and less Ra (arguably it was, ironically, too much worrying about prestige and PR that got us into this mess in the first place!) EDIT to add: staying silent, whilst legally safer in most cases, is also a statement (which can be interpreted in multiple ways).
To be honest, I’m at a point now where I’m putting significant weight on lying. Some evidence here that FTX bailed out Alameda for ~$4B in FTT on Sep 28th. There are the blockchain transactions (disclaimed by SBF at the time), and the resignation of a high-profile figure (President of FTX.US) the day before. (Note that whilst this doesn’t look good, it’s still inconclusive. I’m sure the truth will come out eventually.)
My response (edited from my email to Habryka)
I think I would be in favor of giving a grant that covers the runway of the hotel for the next year.
Wow this is awesome, thanks!
# Thoughts on the EA Hotel:
Thanks for your detailed response.
First, the reasons why I am excited about the EA Hotel:
Providing a safety net …
Acting on historical interest …
Building high-dedication cultures …
All good reasons, eloquently put!
1. I think the people behind the EA Hotel were initially overeager to publicize the EA Hotel via broad media outreach in things like newspapers and other media outlets with broad reach.
I think this is based on an unfortunate misconception. The whole thing with the media interest has been quite surprising to us. We have never courted the media—quite the opposite in fact. It started with The Economist approaching us. This was whilst I was on holiday and out of communication. The first I heard about it was 3 days before they went to press (the piece appeared in print whilst I was still away). The journalist was told not to come to Blackpool. I spoke to them on the phone and said I wanted more time to think about it and discuss it with people. They went ahead anyway and were told “no comment” by a resident when they knocked on the door. They picked up the story from Slate Star Codex originally and decided — whether we liked it or not — to run a piece on it. I don’t think there was anything we could’ve done to prevent it.
After that, The Times (and many other media outlets) picked it up. The Times journalist booked a call with me via my Calendly. At the exact time I was expecting the call, there was a knock on the door instead and she was there with a photographer—we got doorstepped! I had a panicked 5 minute talk “off the record” with her outside, where she explained that they were going to write about us anyway (whether we liked it or not) so I might as well let her in to interview some people so we could have at least some control over the narrative (we had none with the Economist piece).
I thought it went pretty well, and the piece could’ve been worse. However, they printed some errors, despite me sending clarifications—see the “For the record” here—which made me lose more faith in the journalistic process. It seems that even when you send corrections/clarifications they don’t factor them in if it doesn’t fit their narrative. And of course you have no right of reply (or at least no right of reply at the same level of visibility).
After The Times we had another national newspaper showing up at the door unannounced the next day. Gave them nothing despite them being very persistent.
In the next couple of weeks we were inundated with media requests. We discussed the issue with many people in EA and at CEA and 80k and decided against embracing the media (we could’ve been on prime-time TV and radio with millions of viewers/listeners). The decision was largely based around considerations encapsulated by the fidelity model of spreading ideas and the Awareness/Inclination Model of movement growth. We have turned down something like 20 media requests since The Times. Most were in October following the initial media interest. But we still get some every now and again. The outside view from my friends and family is that I’m completely crazy not to accept any of these offers. I think it’s probably the right call for the EA movement, but I’m still not 100% sure given that there is basically no data on the impact of mass media appearances on movement growth/talent discovery for EA—as far as I can tell, there hasn’t been any since the launch of GWWC and 80k (I’m talking appearances in national/international media with millions of viewers/readers here).
To try and avoid this misconception being perpetuated, I have added a disclaimer to the media page on our website saying that we have never courted the media. Also, journalistic ethics are such that requesting the media cover you is not something you can easily do or be successful with. You can write a press release and send it out, but they don’t generally do requests (note we definitely did not post a press release, nor do anything to publicise the project really, apart from me posting my initial EA Forum piece and sharing it on a couple of EA Facebook groups).
because it communicates the presence of free resources that are for the taking of anyone vaguely associated with the community
I’ve actually been surprised at how few applicants from outside the movement we’ve had, even after the media.
2. I expect the EA Hotel to attract a kind of person who is pretty young, highly dedicated and looking for some guidance on what to do with their life. I think this makes them a particularly easy and promising target for people who tend to abuse that kind of trust relationship and who are looking for social influence.
Yes, one thing I’m wary of is anyone looking to gain too much social influence at the hotel. Note that the average age is actually reasonably high at around 28 though (range 20-40) (i.e. there are a fair few people changing the trajectory of their careers).
the EA Hotel could form a geographically and memetically isolated group that is predisposed for conflict with the rest of the EA community in a way that could result in a lot of negative-sum conflict.
I don’t think we are especially memetically isolated—most of us keep up with the EA Forum and EA Facebook groups etc. There is generally a high level of shared memetic culture/jargon etc that is general to the broader movement. Geographically, many guests have travelled to EA events in continental Europe, and have visited other UK EA hubs like London and Oxford.
3. I don’t have a sense that Greg wants to really take charge on the logistics of running the hotel, and don’t have a great candidate for someone else to run it. Though it seems pretty plausible that we could find someone to run it if we invest some time into finding someone.
Yes, it’s not something that I want to do long term (although I have been doing a lot). And it’s taking a lot longer than I initially thought it would take to get things fully set up (especially setting up a charity and fundraising). There are two main aspects to the job really—logistics and guest mentoring/vetting. Currently one of the guests is taking on most of the cooking/dishes/food monitor work, and we have a rota for weekends (this could potentially be outsourced with more funds available). And we have a cleaner doing the cleaning/room changes. (Interim Manager) Toon has been doing checkins with guests to discuss their work. He’s only been working on the hotel part time though (he also runs RAISE) and is leaving in a couple of months. We haven’t been able to start the process for hiring a full time manager to take over due to funding insecurity. Would be great if you could help us find someone, thanks!
It made it to the White House Press Briefing. This clip is like something straight out of the film Don’t Look Up. Really hope that the ending is better (i.e. the warning is actually heeded).
As part of an AMA I put on X, I was asked for my “top five EA hot takes”. If you’ll excuse the more X-suited tone and spiciness, here they are:
1. OpenAI, Anthropic (and to a lesser extent DeepMind) were the worst cases of Unilateralist’s Curse of all time. EAs love to discourage enthusiastic newcomers by warning them not to do “net negative” unilateralist actions (i.e. don’t start new projects in case they crowd out better, more “well thought through” projects in future, with “more competent” people doing them), but nothing will ever top the monumental unilateralist curse fuck up that was supporting Big AGI in its beginnings.
2. AI Safety is nothing without a Pause. Too many EAs are stuck in the pre-GPT-4 paradigm of maxing research, when it’ll all be for nothing unless we get a Pause first. More EAs should switch to Notkilleveryoneism/PauseAI/StopAGI.
3. EA is too elitist. We should be triaging the world’s problems like crazy, and the top 1-2% of people are more than capable of that (most jobs that need doing in EA don’t require top 0.1%).
4. EA is too PR focused—to the point where it actually backfires spectacularly and now there is lots of bad press [big example: SBF’s bad character being known about but not addressed].
5. Despite all its flaws, EA is good (and much better than the alternatives in most cases).
My main question re the Future Fund at the moment is: why does it seem like there weren’t any ring-fenced funds under legal ownership by the Future Fund or the FTX Foundation? Are there any? Were there any when it was founded last year (i.e. presumably when FTX/Alameda was still solvent)? If not, why not? Did this not raise suspicions amongst any of you? I can imagine maybe SBF saying something like the max-EV thing to do is keeping all the funds in the for-profit companies to maximise their growth, and you going along with it because you trusted him (or you just independently agreed and didn’t put any significant weight on FTX/Alameda collapsing or even just becoming less rich). Obviously an error in hindsight. Or maybe you kept asking about getting (more) ring-fenced funds, and kept getting fobbed off? That should’ve raised alarm bells if so! Sorry if this is a bit ranty and speculative, or too soon, or too accusatory, but I’m grasping for answers here. I’m grateful for everything you’ve done for the world and EA in your careers, but can’t help feeling that you might’ve messed up a bit here.
Megastar salaries for AI alignment work
Artificial Intelligence
Aligning future superhuman AI systems is arguably the most difficult problem currently facing humanity; and the most important. In order to solve it, we need all the help we can get from the very best and brightest. To the extent that we can identify the absolute most intelligent, most capable, and most qualified people on the planet – think Fields Medalists, Nobel Prize winners, foremost champions of intellectual competition, the most sought-after engineers – we aim to offer them salaries competitive with top sportspeople, actors and music artists to work on the problem. This is complementary to our AI alignment prizes, in that getting paid is not dependent on results. The pay is for devoting a significant amount of full time work (say a year), and maximum brainpower, to the problem; with the hope that highly promising directions in the pursuit of a full solution will be forthcoming. We will aim to provide access to top AI alignment researchers for guidance, affiliation with top-tier universities, and an exclusive retreat house and office for fellows of this program to use, if so desired.
Very hard hitting and emotional. I’m feeling increasingly like I did in February/March 2020, pre-lockdown. Full on broke down to tears after reading this. Shut it all down.
I think it’s great timing. I’ve been increasingly thinking that now is the time for a global moratorium. In fact, I was up until the early hours drafting a post on why we need such a moratorium! Great to wake up and see this :)
SBF has broken his silence on Twitter.
(continues in a 21 tweet thread)
Eliezer Yudkowsky
…Throwing more money at this problem does not obviously help because it just produces more low-quality work

Maybe you’re not thinking big enough? How about offering the world’s best mathematicians (e.g. Terence Tao) a lot of money to work on AGI Safety. Say $5M to work on the problem for a year. Perhaps have it open to any Fields Medal recipient. (More)
Going to say that personally, I still very much think this is the best use of EA money on the margin, considering the low costs per person-year of work, hits-based giving, community building and network effects, and room for more funding (i.e. the current acute need for funding). Especially in the current situation, I think it’s an outstanding opportunity for small/medium-sized donations to move the needle.
However, I’m at the stage where I’m having to consider losing my own financial independence / potential for investing in the future (including in EA things) if I want to give further financial support to the EA Hotel. And I’m not quite ready to do that.
1. We hope to post a list of outputs soon (within the next week).
2. Those on salaries from Rethink Priorities have been paying cost price (£10/day). AFAIK they have not adjusted their salaries downward because staff are staying at the hotel. RAISE has contributed to the Hotel from the limited funding they have received recently.
3. Depends what your counterfactual use of the money is in terms of what the bar of EV to clear is. Given our low costs, the EV bar could be quite low over a number of comparisons. We aim to adjust the entry bar depending on supply.
4. We have a pitch doc we’ve been circulating to potential funders. Will release it publicly soon (within the next week).
5. The rationalist group house in Manchester was most definitely not my project! I just moved into the original rental house with the organiser and a couple of others (and later bought a house and offered it as a shared space, while they continued to organise the project).
6. This doesn’t seem like something most EAs would be that concerned with, but I could be wrong. If you think having more backers for nominal amounts is good, please donate a nominal amount!
$5-$15M seems very cheap. I’m guessing that the buildings alone are worth more than that (and it must at least own the Oxford one). Has anyone enquired about the actual price?
Sven Rone should’ve won a prize in the Red Teaming contest[1]:
The Effective Altruism movement is not above conflicts of interest
[published Sep 1st 2022]
(Note that this issue was commented on here a month ago.) This whole thing is now starting to look like the classic “ends justify the means” criticism of Utilitarianism writ large :(
although it looks like it wasn’t actually entered? Edit: it was, but not posted as a top-level post on the EA Forum (see comments below).