[Only a weak recommendation.] I last looked at this >5 years ago and never read the whole thing. But FYI that Katja Grace wrote a case study on the Asilomar Conference on Recombinant DNA, which established a bunch of voluntary guidelines that have been influential in biotech. Includes analogy to AI safety. (No need to pay me.) https://intelligence.org/files/TheAsilomarConference.pdf
Lincoln Quirk has joined the EV UK board
Hi, thanks for raising these questions. I wanted to confirm that Effective Ventures has seen this and is looking into it. We take our legal obligations seriously and have started an internal review to make sure we know the relevant facts.
Hi Matt—thanks for the suggestion. I agree that we should have a page like this. I’ve asked someone to take this on but we’ve got a lot of things to update at the moment so it won’t go up immediately. In the meantime, CEA’s team page has links to bios for most of the trustees here.
Regulatory inquiry into Effective Ventures Foundation UK
Thanks for the update on this! I don’t think I’d heard about it.
“In 1993, he obtained a bachelor’s degree in radio from Emerson College in Boston,[4] where one of his professors was the writer David Foster Wallace”
Yes — since the first week of the crisis, Nick and Will have been recused from the relevant discussions / decisions on the boards of both EV entities to avoid any potential conflict of interest. Staff in both EV entities were informed about that decision in mid-November.
The 80k podcast also has some potentially relevant episodes, though they're probably not directly what you most want.
https://80000hours.org/podcast/episodes/phil-trammell-patient-philanthropy/
https://80000hours.org/podcast/episodes/will-macaskill-ambition-longtermism-mental-health/
Maybe especially the section on patient philanthropy.
https://80000hours.org/podcast/episodes/will-macaskill-what-we-owe-the-future/
Some bits of this are relevant, e.g. some of the bits on political donations.
My guess is that Part II (on trajectory changes) will have a bunch of relevant stuff. Maybe also a bit of Part 5. But unfortunately I don’t remember it too clearly.
It’s been a while since I read it but Joe Carlsmith’s series on expected utility might help some.
[My impression. I haven’t worked on grantmaking for a long time.] I think this depends on the topic, size of the grant, technicality of the grant, etc. Some grantmakers are themselves experts. Some grantmakers have experts in house. For technical/complicated grants, I think non-expert grantmakers will usually talk to at least some experts before pulling the trigger but it depends on how clearcut the case for the grant is, how big the grant is, etc.
I think parts of What We Owe the Future by Will MacAskill discuss this approach a bit.
Others, most of which I haven’t fully read and which aren’t always fully on topic:
Richard Posner. Catastrophe: Risk and Response. (Precursor)
Richard A Clarke and RP Eddy. Warnings: Finding Cassandras to Stop Catastrophes
General Leslie Groves. Now It Can Be Told: The Story of the Manhattan Project (nukes)
A much narrower recommendation for nearby problems is Overcoming Perfectionism (roughly a CBT workbook).
I’d recommend it to some EAs who are already struggling with these feelings (and I know some who’ve really benefitted from it). (It’s not precisely aimed at this but I think it can be repurposed for a subset of people.)
I wouldn’t recommend it to students recently exposed to EA who are worried about developing these feelings in the future.
If you haven’t come across it, a lot of EAs have found Nate Soares’ Replacing Guilt series useful for this. (I personally didn’t click with it but have lots of friends who did).
I like the way some of Joe Carlsmith’s essays touch on this.
FYI—subsamples of that survey were asked about this in other ways, which gave some evidence that “extremely bad outcome” was ~equivalent to extinction.
Explicit P(doom) = 5-10%
The levels of badness involved in that last question seemed ambiguous in retrospect, so I added two new questions about human extinction explicitly. The median respondent’s probability of x-risk from humans failing to control AI[1] was 10%, weirdly more than the median chance of human extinction from AI in general,[2] at 5%. This might just be because different people got these questions and the median is quite near the divide between 5% and 10%. The most interesting thing here is probably that these are both very high—it seems the ‘extremely bad outcome’ numbers in the old question were not just catastrophizing merely disastrous AI outcomes.
Thanks for this! It was really useful and will save 80,000 Hours a lot of time.
I think the people responsible for EA Global admissions (including Amy Labenz, Eli Nathan, and others) have added a bunch of value to me over the years by making it more likely that a conversation or meeting with somebody at EA Global who I don’t already know will end up being productive. Making admissions decisions at EAG (and being the public face of an exclusive admissions policy) sounds like a really thankless job, and I know a bunch of the people involved end up having to make decisions that make them pretty sad because they think it’s best for the world. I mostly just wanted to express some appreciation for them and to mention that I’ve benefitted from their work, since it feels uncomfortable to say out loud and so is probably under-expressed.
One positive effect of selective admissions that I don’t often see discussed is that it makes me more likely to take meetings with folks I don’t already know. I’d guess that this increases the accessibility of EA leaders to a bunch of folks in the community.
Fwiw, I’ve sometimes gotten overambitious with the number of meetings I take at EAG and ended up socially exhausted enough to be noticeably less productive for several days afterwards. This is a big enough cost that I’ve skipped some years. So I think in the past I’ve probably been on the margin where, if the people at EAG hadn’t been selected for being folks I could be helpful to, I’d have been less likely to go.
Hey Bob—Howie from EV UK here. Thanks for flagging this! I definitely see why this would look concerning so I just wanted to quickly chime in and let you/others know that we’ve already gotten in touch with relevant regulators about this and I don’t think there’s much to worry about here.
What’s going on is that EV UK has an extended filing deadline (from 30 April to 30 June 2023) for our audited accounts,[1] which are one of the things included in our Annual Return. So back in April, we notified the Charity Commission that we’d be filing our Annual Return by 30 June.
This is due to a COVID-related extension, which the UK government granted to many companies.