Yeah, a lot of the interventions/causes/worldviews that hold power in EA already have more than adequate resources to do what they are trying to do. This is why, to some extent, "getting a job at an EA org" may not be a particularly high-EV move: it is not clear that the counterfactual employee would be worse than you (although this reasoning is somewhat weakened by the fact that you could ostensibly free up an aligned person to do other work, and so on).
Lending your abilities and resources to promising causes that do not have power behind them is probably a way that someone of mediocre abilities could have high impact, perhaps much more impact than far more talented people serving well-resourced masters. Of course, the trick is identifying these "promising", neglected areas, especially when the lack of attention from the powers that be may be interpreted as a lack of merit.
This is particularly true to the extent that EA organizations overvalue alignment for certain roles.