> But if we already know each other and trust each other’s intentions then it’s different. Most of us have already done extremely costly activities without clear gain as altruists.
That signals altruism, not effectiveness. My main concern is that the EA movement will not be able to maintain the epistemic standards necessary to discover and execute on abnormally effective ways of doing good, not primarily that people won’t donate at all. In this light, concerns about core metrics of the EA movement are very relevant. I think the main risk is compromising standards to grow faster rather than people turning out to have been “evil” all along, and I think that growth at the expense of rigor is mostly bad.
Being at all intellectually dishonest is much worse for an intellectual movement’s prospects than it is for normal groups.
> instead of assuming that it’s actually true to a significant degree
The OP cites particular cases where she thinks this accusation is true. I’m not worried that this is merely likely in the future; I’m worried that it is already happening.
> Plus, it can be defeated/mitigated, just like other kinds of biases and flaws in people’s thinking.
I agree, but I think any credible way of dealing with these issues involves stronger signals than simply asserting that they should be solvable.
> I think the main risk is compromising standards to grow faster rather than people turning out to have been “evil” all along, and I think that growth at the expense of rigor is mostly bad.
Okay, so there’s some optimal balance to be struck (you can always be more rigorous and less growth-oriented, all the way to a very unreasonable extreme). Since we’re trying to find the right point, we can err on either side if we’re not careful. I agree that dishonesty is very bad, but I’m worried that if we treat every error on one side as a large controversy, we’ll miss the occasions where we err on the other side, and drift too far, because the feedback on one side is strong and socially damning while the other side gets none.
> The OP cites particular cases where she thinks this accusation is true. I’m not worried that this is merely likely in the future; I’m worried that it is already happening.
To be perfectly blunt and honest, it’s a blog post with some anecdotes. That’s fine for establishing that there’s a problem worth investigating, but not for drawing conclusions about particular causal mechanisms. We have no idea how these people’s motivations changed (maybe they would have had the exact same plans before coming into their positions; maybe people become more fair and careful the more experience and power they gain).
Anyway, the reason I said that was just to defend the idea that obtaining power can be good overall, not to claim that there are no such problems associated with it.