Formerly Executive Director at BERI; now Secretary and board member. Current board member at SecureBio and FAR.AI, where I'm also the Treasurer.
Great points, thanks David. I especially like the comparison between personal connections and academic credentials. You're probably more experienced with academia and non-EA philanthropy than I am, so your empirical views are different. But I also think that even if EA is better than these other communities, we should still be thinking about (1) keeping it that way, and (2) maybe becoming even less reliant on personal connections. This is part of what I was saying with:
None of this is unique to EA. While I think EA is particularly guilty of some of these issues, in general I could aim this criticism in any direction and hit someone guilty of it. But "everyone else does it" is not in and of itself a reason to accept it. We claim to be doing something really difficult and important, so we should try to be as good as possible.
I think your observations may be counterevidence to anyone saying that EA should become more reliant on personal connections, since you think (possibly correctly) that other major philanthropy is more reliant on personal connections than EA is, and I assume we agree that EA philanthropy is better than most other major philanthropy.
I think the extent to which "member of the EA community" comes along with a certain way of thinking (i.e. "a lot of useful frames") is exaggerated by many people I've heard talk about this sort of thing. I think ~50% of the perceived similarity is better described as similar ways of speaking and knowledge of jargon. I think there actually aren't that many people who have fully internalized new ways of thinking that are (1) very rare outside of EA, and (2) shared across most EA hiring managers.
Another way to put this would be: I think EA hiring managers often give "membership in the EA community" significantly more weight than it deserves. I think our disagreement is mostly about how much weight this factor should get.
Fair point on the fast-changing thing. I have some thoughts, but they're not very clear, and I think what you said is reasonable. One very rough take: yes, you'd still know the people you know, but you might go from "I know 50% of the people in AI alignment" to "I know 10% of the people in AI alignment" in 3 months, which could be disorienting and demoralizing. So it's more of a relative thing than the absolute number of people you know.
Explicitly asking for a reference the head organizer knows personally.
That feels pretty bad to me! I can imagine some reason that this would be necessary for some programs, but in general requiring this doesn't seem healthy.
I find the request for references on the EA Funds' application to be a good middle ground. There are several sentences to it, but the most relevant one is:
References by people who are directly involved in effective altruism and adjacent communities are particularly useful, especially if we are likely to be familiar with their work and thinking.
It's clearly useful to already be in the fund managers' network, but it's also clearly not required. Of course there's always a difference between the policy and the practice, but this is a pretty good public policy from my perspective.
Thanks Chi, this was definitely a mistake on my part and I will edit the post. I do think that your website's "Get Involved" → "CLR Fund" might not be the clearest path for people looking for funding, but I also think I should have spent more time looking.
Thanks for the thoughtful feedback Chris!
I think that the author undervalues value alignment and how the natural state is towards one of regression to the norm unless specific action is taken to avoid this
I think there is a difference between "value alignment" and "personal connection". I agree that the former is important, and I think the latter is often used (mostly successfully) as a tool to encourage the former. I addressed one aspect of this in the Hiring Managers section.
I agree that as EA scales, we will be less able to rely [on] personal relationships, but I see no reason to impose those costs now
Fair, but I worry that if we're not prepared for this then the costs will be greater, more sudden, and more confusing, e.g. people starting to feel that EA is no longer fun or good and not knowing why. I think it's good to be thinking about these things and to make the tactical choice to do nothing, rather than leaving "overreliance on personal connections can be bad" out of our strategic arsenal completely.
I agree that it may affect our reputation in the outside world, but I don't think it's worth increasing the risk of bad hires to attempt to satisfy our critics.
I don't think my suggestions for hiring managers would increase the risk of bad hires. In fact, I think moving away from "my friend is friends with this person" and towards "this person demonstrates that they care deeply about this mission" would decrease the risk of bad hires. (Sorry if this doesn't make sense, but I don't want to go on for too long in a comment.)
tension between reliance on personal connections and high rates of movement growth. You take this to be a reason for relying on personal connections less, but one may argue it is a reason for growing more slowly.
I completely agree! I think probably some combination is best, and/or it could differ between subcommunities.
Also, thanks for pointing out the FTX Future Fund's experience; I'd forgotten about that. I completely agree that this is evidence against my hypothesis, specifically in the case of grantee-grantor relationships.
Great point about mitigating as opposed to solving. It's possible that my having a "solutions" section wasn't the best framing. I definitely don't think personal connections should be vilified or gotten rid of entirely (if that were even possible), and going too far in this direction would be really bad.
Thanks Stefan! I agree with those strengths of personal connections, and I think there are many others. I mainly tried to argue that there are negative consequences as well, and that the negatives might outweigh the positives at some level of use. Did any of the problems I mentioned in the post strike you as wrong? (Either you think they don't tend to arise from reliance on personal connections, or you think they're not important problems even if they do arise?)
I think this is a good idea as a neutral tracking resource, but I might be against it if it had the effect of heaping additional praise on the billionaires. (I don't like Elliot's Impact List idea.) I think transparency is good.
Will you be taking open applications from organizations looking for funding?
Hi Lucas! If you're still looking, you might consider applying for the Deputy Director position at the Berkeley Existential Risk Initiative. Let me know if you have any questions.
I'm excited to see this happening and I think you're one of the better people to be launching it. I think there's probably some helpful overlap with BERI's world here, so please reach out if you'd like to talk about anything.
The Berkeley Existential Risk Initiative (BERI) is seeking a Deputy Director to help me grow BERI's university collaborations program and create new programs, all with the mission of improving human civilization's long-term prospects for survival and flourishing.
This is BERI's first "core" hire since I was hired 3 years ago; all of our hires since then are embedded at some particular research group, and aren't responsible for running BERI as an organization.
This is a great opportunity for an early- to mid-career person with some experience and interest in operations. I expect that the Deputy Director will contribute substantially to BERIâs strategy and direction moving forward.
I'd prefer the person to be based in New York City (where I am), but remote is also an option. The position is full-time, and I expect the salary range to be $70-100k/year.
Here's a bunch of links to persuade you to work at BERI:
My thoughts on what BERI is and why I think it's important
Results from BERI's collaborator survey in January (tl;dr people like BERI)
Annual reports for 2020 and 2021, which show how much BERI has done and include thoughts on future directions.
But do oracular funders (e.g. OpenPhil, Future Fund) pay taxes at all, or benefit from tax-deductibility? I'm not clear on this.
In theory it would be great to get a lawyer/money manager from one of these orgs to comment on this, but I don't expect that to happen, so I'm going to give my guess as someone who runs a charity that has gotten money from both of these orgs.
I think most of Open Phil's money is stored in a DAF (donor-advised fund) at SVCF. Dustin presumably got a big tax deduction when he donated to that DAF. Open Phil also sometimes distributes money in other ways, which may or may not benefit from tax-deductibility. But those should be thought of as "more expensive", since it costs money (taxes) to get money from Dustin's pocket into those funds. So I think impact certificates would be more expensive for Open Phil than if they'd funded the project directly.
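To make the "more expensive" point concrete, here's a rough back-of-envelope sketch in Python. The 37% rate, the assumption of a full deduction, and the pre-tax-income framing are illustrative simplifications of mine (ignoring deduction limits, capital gains treatment, and state taxes), not anything Open Phil or Dustin has said:

```python
# Back-of-envelope sketch: pre-tax income needed to get a grant to a project,
# via a deductible vehicle (e.g. a DAF) vs. a non-deductible route.
# All numbers are illustrative assumptions, not anyone's real tax situation.

MARGINAL_TAX_RATE = 0.37  # hypothetical flat marginal rate for the donor


def pretax_cost_deductible(grant: float) -> float:
    """Fully deductible donation: the grant is simply excluded from taxable income."""
    return grant


def pretax_cost_nondeductible(grant: float, tax_rate: float = MARGINAL_TAX_RATE) -> float:
    """Handing over after-tax money: you must earn grant / (1 - tax_rate) to keep grant."""
    return grant / (1 - tax_rate)


if __name__ == "__main__":
    grant = 1_000_000
    deductible = pretax_cost_deductible(grant)
    nondeductible = pretax_cost_nondeductible(grant)
    print(f"Deductible route (e.g. DAF): ${deductible:,.0f} of pre-tax income")
    print(f"Non-deductible route:        ${nondeductible:,.0f} of pre-tax income")
    print(f"Ratio: {nondeductible / deductible:.2f}x")  # ~1.59x at a 37% rate
```

Under these assumptions, the same $1M grant uses up roughly $1.59M of pre-tax income through the non-deductible route versus $1M through the deductible one, which is the sense in which the non-DAF channels are "more expensive".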
I don't think the Future Fund has a DAF; instead it benefits from being based in The Bahamas, which doesn't have income or capital gains taxes. I expect that impact certificates would not really be more expensive for them, because... FTX and SBF don't pay taxes at all?
I think most existing charitable funders look more like Open Phil here.
The relative value of taxes vs. donations underlies a lot of EA thinking and doesn't get discussed much, so I'm glad you brought this up. I think it's important how one defines "evading taxes". If we grant the argument that "taxes are not your money" (which is plausible and appeals to me aesthetically), it's pretty critical to identify the "correct amount" of taxes which one owes. I might say the correct amount is whatever the tax authorities say I need to pay, which basically amounts to "whatever I can get away with". Or you might say a bunch of the normal loopholes aren't morally legitimate, and that the correct amount is "whatever your tax bracket says". Or if you're a tax protester, you might say one or more taxes are not morally legitimate, and so the correct amount of taxes you owe is in fact less than the tax authorities say it is.
My point is, establishing how much money I owe in taxes (and therefore how much of my income belongs to the state) is as much a political question as it is a legal or administrative question.
In my opinion (and it seems you agree) Jeff's proposal is sufficiently far away from what most people consider "tax evasion" that it doesn't really run into the problem you're identifying. But I occasionally see other EA proposals that look closer to "steal money to buy bed nets".
Note for readers: At Adam's request I reviewed and approved the section on BERI prior to posting. I feel that it presents BERI accurately, and I can't think of any improvements that would be important enough to include.
I'm excited for FAR's work and I'm glad to see this post!
Today is Asteroid Day. From the website:
Asteroid Day as observed annually on 30 June is the United Nations sanctioned day of public awareness of the risks of asteroid impacts. Our mission is to educate the public about the risks and opportunities of asteroids year-round by hosting events, providing educational resources and regular communications to our global audience on multiple digital platforms.
I didn't know about this until today. Seems like a potential opportunity for more general communication on global catastrophic risks.
Is there any relationship between this project and the Fønix project?
Can you explain the relationship between this project and the upcoming SHELTER Weekend?
Good catch, thanks! I can't find my original quote, so I think this was a recent change. I will edit my post accordingly.