If your goal is to do X, but you’re not doing as much as you can of X, you are failing (with respect to X).
But your claim is more like "If your goal is to do X, you need to Y; otherwise you will not do as much of X as you can". The Y here is "the project of effective altruism". Hence there needs to be an explanation of why you need to do Y to achieve X. If X and Y are the same thing, we have a tautology ("If you want to do X, but you do not-X, you won't do X").
In short, it seems necessary to say what is distinctive about the project of EA.
Analogy: say I want to be a really good mountain climber. Someone could say, oh, if you want to do that, you need to "train really hard, invest in high-quality gear, and get advice from pros". That would be helpful, specific advice about the right means to achieve my end. Someone who says "if you want to be good at mountain climbing, follow the best advice on how to be good at mountain climbing" hasn't yet told me anything I don't already know.