I often think of it as EA being too conservative rather than having a culture of fear, and maybe those are different things, but here’s some of what I see happening.
People reason that EA orgs and people representing EA need to be respectable because this will later enable doing more good. And I’d be totally fine with that if every instance of it was clearly instrumental to doing the most good.
However, I think this goal of being respectable doesn’t take long to become fixed in place, and now people are optimizing for doing the most good AND being respectable, which means they’ll trade off doing the most good against respectability along the efficiency frontier. Ditto for other things people might optimize for: being right, growing the movement, gaining power, etc.
To the extent that EA is about doing the most good, we should be very clear when we start trying to optimize for other things. Yes, if we optimize for doing the most good in the short term we’ll likely harm ourselves in the long term, but the movement also harms itself by trading away doing the most good for other things someone thinks might matter, without a solid case that it’s the right trade. You could argue that someone like Will MacAskill put a lot of thought into being respectable and had good reason to do it, rather than immediately doing the short-term thing that would have done the most good for EA but would have been weird and bad for the movement in the long term. But today I don’t think most people are doing that sort of calculation; instead they’re just saying “ah, I think in EA we should be respectable or whatever” and then optimizing for that AND doing the most good, thus probably failing to get the most good. 😞