Re “epistemics and integrity”: I’m glad to see this problem being described. It’s also why I left (angrily!) a few years ago, but I don’t think you’re really getting to the core of the issue. Let me try to point at a few things:
Centralized control and disbursement of funds, with a lot of discretionary power and a very high and unpredictable bar, gives me no incentive to pursue what I think is best, and every incentive to just stick to the popular narrative. Indeed, groupthink. Except training people not to groupthink isn’t going to change their (existential!) incentive to groupthink. People’s careers are on the line, there are only a few opportunities for funding, no guarantee of continued funding after the first round, and no clear way to pivot into a safer option except to start a new career somewhere your heart does not want to be, having thrown years away.
Lack of respect for “normies”. Many EAs seemingly can’t stand interacting with non-EAs. I’ve seen EA meditation, EA bouldering, EA clubbing, EA whatever. Orgs seem to want everyone and the janitor to be “aligned”. Everyone’s dating each other. It seems that we’re even afraid of them. I will never forget that just a week before I arrived at an org I was to be the manager of, they turned away an Economist reporter at their door...
Perhaps in part due to the above, massive hubris. I don’t think we realise how much we don’t know. We started off with a few slam dunks (yeah, wow, 100x more impact than average) and now we seem to think we are better at everything. Clearly the ability to discern good charities does not transfer to the ability to do good management. The truth is: we are attempting something that we don’t even know is possible at all. Of course we’re all terrified! But where is the humility that should go along with that?
It seems that we’re even afraid of them. I will never forget that just a week before I arrived at an org I was to be the manager of, they turned away an Economist reporter at their door...
Fwiw, I think being afraid of journalists is extremely healthy and correct, unless you really know what you’re doing or have very good reason to believe they’re friendly. The Economist is probably better than most, but I think being wary is still very reasonable.
Thanks! Sorry to hear the epistemics stuff was so frustrating for you and caused you to leave EA.
Yes, it’s very plausible that the example interventions don’t really get to the core of the issue. I didn’t spend long creating those, and they’re meant more as examples to help spark ideas than as confident recommendations on the best interventions. Perhaps I should have flagged this in the post.
Re “centralized control and disbursement of funds”: I agree that my example ideas in the epistemics section wouldn’t help with this much. Would the “funding diversification” suggestions below help here?
And I’d be interested to hear, if you’re up for elaborating, why you don’t think the sorts of “What could be done?” suggestions would help with the other two problems you highlight. (They’re not optimised for addressing those two specific concerns, of course, but insofar as they all relate back to bad/weird epistemic practices, things like epistemics training programmes might help?) No worries if you don’t want to or don’t have time, though.
Yes, I imagine funding diversification would help, though I’m not sure if it would go far enough to make EA a good career bet.
My own solution is to work myself up to the point where I’m financially independent of EA, so that my agency is not compromised by someone else’s model of what works.
And you’re right that better epistemics might help address the other two problems, but only insofar as the interventions target “System 1 epistemics”, i.e. the stuff that doesn’t necessarily follow from conscious deliberation. Most of the techniques in this category would fall under the banner of spirituality (the pragmatic type, without metaphysics). This is something the rationalist project has not addressed sufficiently. I think there’s a lot of unexplored potential there.
I’ve seen EA meditation, EA bouldering, EA clubbing, EA whatever. Orgs seem to want everyone and the janitor to be “aligned”. Everyone’s dating each other. It seems that we’re even afraid of them.
I’m not in the Bay Area or London, so perhaps I’m not personally familiar with the full extent of what you’re describing, but there are elements of this that sound mostly positive to me.
Of course, it is possible to overemphasize the importance of culture fit and mission alignment when making hiring decisions. It’s a balance that depends on the circumstances, and I don’t have much to say there.
As far as the extensive EA fraternizing goes, that actually seems mostly good. Like, to the extent that EA is a “community”, it doesn’t seem surprising or bad that people are drawn to hang out. Church groups do that sort of thing all the time for example. People often like hanging out with others with shared values, interests, experiences, outlook, and cultural touchstones. Granted, there are healthy and unhealthy forms of this.
I’m sure there’s potential for things to get messy, and for inappropriate power dynamics to arise in the ambiguous overlap between professional contexts, personal relationships, and shared social circles. At their best, though, social communities can provide people a lot of value and support.
It seems that living in the Bay Area as an EA has a huge impact, and that the dynamics are healthier elsewhere. (That a higher concentration of EAs makes things worse is, of course, itself indicative of a big problem.)
Why is “EA clubbing” a bad thing?
I strongly agree.