This is a great overview of ops, thanks for writing! I especially like the emphasis on service mindset. For me, the immediate reward of ops has typically been pleasing people: It makes me happy in the moment to help people, and then I get my higher-level abstract satisfaction from my belief that those people are doing stuff to improve the world.
Downvoted because I think this is too harsh and accusatory:
"I cannot believe that some of you delete your posts simply because it ends up being downvoted."
Also because I disagree in the following ways:
Donating anonymously seems precisely opposed to transparency. At the very least, I don’t think it’s obvious that donor anonymity works towards the values you’re expressing in your post. Personally I think being transparent about who is donating to what organizations is pretty important for transparency, and I think this is a common view.
I don’t think FTX’s mistakes are particularly unique to crypto, but rather just normal financial chicanery.
“if the only way we aggregate how “good” red-teaming is is by up-votes, that is flawed”
IIRC the red-teaming contest did not explicitly consider upvotes in their process for granting awards, and the correlation between upvotes and prize-winners was weak.
“What makes EA, EA, what makes EA antifragile, is its ruthless transparency.”
For better or for worse, I don’t think ruthless transparency is a focus or a strength of EA. I agree with your sentence right after that, but I don’t think that’s much related to transparency.
As Nathan Young mentioned in his comment, this argument is also similar to Carl Shulman’s view expressed in this podcast: https://80000hours.org/podcast/episodes/carl-shulman-common-sense-case-existential-risks/
I think a simpler explanation for his bizarre actions is that he is probably the most stressed-out person on the face of the earth right now. Or he’s not seeing the situation clearly, or some combination of the two. Also probably sleep-deprived, struggling to get good advice from people around him, etc.
(This is not meant to excuse any of his actions or words, I think he’s 100% responsible for everything he says and does.)
SBF’s Protect Our Future PAC has put more than $7M towards Flynn’s campaign. I think this is what _pk and others are concerned about, not direct donations. And this is what most people concerned with “buying elections” are concerned about. (This is what the Citizens United controversy is about.)
Do QURI or you (Nuno) generally accept commissions for evaluations like this? I’m potentially interested in commissioning an evaluation of BERI. Of course I totally understand if you don’t have time for this, or if you’re refusing such commissions for some other reason.
The relative value of taxes vs donations underlies a lot of EA thinking and doesn’t get discussed much, so I’m glad you brought this up. I think it’s important how one defines “evading taxes”. If we grant the argument that “taxes are not your money” (which is plausible and appeals to me aesthetically), it’s pretty critical to identify the “correct amount” of taxes that one owes. I might say the correct amount is whatever the tax authorities say I need to pay, which basically amounts to “whatever I can get away with”. Or you might say a bunch of the normal loopholes aren’t morally legitimate, and that the correct amount is “whatever your tax bracket says”. Or if you’re a tax protester, you might say some or all taxes are not morally legitimate, and so the correct amount of taxes you owe is in fact less than the tax authorities say it is.
My point is, establishing how much money I owe in taxes (and therefore how much of my income belongs to the state) is as much a political question as it is a legal or administrative question.
In my opinion (and it seems you agree) Jeff’s proposal is sufficiently far away from what most people consider “tax evasion” that it doesn’t really run into the problem you’re identifying. But I occasionally see other EA proposals that look closer to “steal money to buy bed nets”.
I really enjoyed this post. In addition to being well-written and a nice read, it’s packed full of great links supporting and contextualizing your thoughts. Given how much has been written about related topics recently, I was happy you chose to make those connections explicit. I feel like it helped me understand where you were positioning your own arguments in the wider conversation.
“SFF (website) is a donor advised fund, advised by the people who make up BERI’s Board of Directors”
This is not strictly true. The two fund advisors listed on SFF’s website are Andrew and Eric. BERI’s board is Andrew, Sawyer, and Jess (who replaced Eric earlier this year). I have personally never been involved with SFF’s operations or grant evaluations (and in fact, doing so would be a major conflict of interest since BERI receives a lot of funding through those rounds). You’re not the only person making this mistake, and it seems like an easy one to make given BERI and SFF’s history. I don’t know if this is just a minor gripe from a fussy insider, or if this conflation makes other people worry about conflicts of interest between the two orgs. But I figured I’d bring it up either way.
Amazing post as always, I’m so glad you do this!
One general answer and one specific answer. (Numbering is my own and doesn’t correspond to your questions.)
When implementing a new system that will be used by a variety of people in your org, it’s important to make this as easy as possible for them. As Martin said, if you end up with one person who doesn’t want to use the system, this will greatly increase your work. When you’re choosing a new tool or system for people, you should assume that all of your new users don’t care about it, and that any time they have to spend learning the new system will be time they’d rather spend doing something else. To that end:
Don’t just ask people to read the software’s own published user guide. Even if it’s really great and you couldn’t imagine making anything better! People don’t want to click on links, they don’t want to go somewhere else and see new branding, they just want you to tell them what to do.
Do write specific instructions, in an email, as clearly and concisely as possible. Even if those instructions mirror the software’s user guide, it’s likely you can make them more concise because your instructions are for your org’s specific use case.
Do offer to help anyone who struggles with the new system. It’s unlikely anyone will take you up on this offer, but I think it makes people feel like you know they don’t want to do this thing you’re asking them to do.
Be ready for people to keep trying to use the old system. Keep trying to shepherd them into the new system. Don’t give up, don’t resign yourself to the two-system world. People will fail to see your first email, but happily respond to your follow-up. Seize opportunities to walk someone through a concrete example: Maybe they didn’t set up their Expensify account the first time, but now they’re asking for reimbursement, so you can (nicely, graciously) force them to use Expensify to get it.
BERI uses QuickBooks Online (QBO) as our accounting software. It works well enough, but has a lot of quirks that make certain things much more difficult and confusing than they have to be. Consequently, I’m (slowly, hesitantly) looking into other options, Aplos in particular. Obviously I’ll want to try out all of our normal transactions and reports in Aplos before transitioning. But even if all of that works out really well, one thing I’m very worried about is loss of audit trail: QBO keeps a record of every change to every transaction, including the time, date, and user. I’ve only used this occasionally, but when I have it’s been a lifesaver. I’m not sure how I’ll get around this if we start using Aplos, but it’s definitely something I’ll be keeping in mind. This concern would apply to any important piece of software that stores a complex history of interactions.
Building off of Jason’s comment: Another way to express this is that comparing directly to the $5,500 GiveWell bar is only fair for risk-neutral donors (I think?). Most potential donors are not really risk neutral, and would rather spend $5,001 to definitely save one life than $5,000 to have a 10% chance of saving 10 lives. Risk neutrality is a totally defensible position, but so is risk aversion. It’s good to have the option of paying a “premium” for a higher confidence (but lower risk-neutral EV).
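To make the trade-off concrete, here’s a minimal sketch of the arithmetic using the hypothetical dollar figures from the example above (the numbers are illustrative, not GiveWell’s actual estimates):

```python
# Risk-neutral expected value (EV) for the two hypothetical options.
# Option A: $5,001 for a 100% chance of saving 1 life.
# Option B: $5,000 for a 10% chance of saving 10 lives.
ev_a = 1.00 * 1    # expected lives saved under option A
ev_b = 0.10 * 10   # expected lives saved under option B

cost_per_life_a = 5001 / ev_a  # dollars per expected life, option A
cost_per_life_b = 5000 / ev_b  # dollars per expected life, option B
```

Both options save 1.0 expected life, so a risk-neutral donor takes B (marginally cheaper per expected life), while a risk-averse donor pays the $1 “premium” for certainty.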
Leaving math mode...I love this post. It made me emotional and also made me think, and it feels like a really central example of what EA should be about. I’m very impressed by your resolve here in following through with this plan, and I’m really glad to have people like you in this community.