But I want to push back on “this frees up other EA researchers to do more important work”. I think you probably mean “this frees up other EA researchers to do work that they’re more uniquely suited for”? I think (and your comment seems to imply you agree?) that there’s not a very strong correlation between importance and difficulty/uniqueness-of-skillset-required—i.e., many low-hanging fruit remain unplucked despite being rather juicy.
I think this is probably true. One thing to flag here is that people’s counterfactuals are not necessarily in research. One belief I recently updated towards but haven’t fully incorporated into my decision-making is that for a non-trivial subset of EAs in prominent org positions (particularly STEM-trained, risk-neutral Americans with elite networks), counterfactuals might be more like expected E2G earnings in the mid-7 figures or so* rather than the low- to mid-6 figures I was previously assuming.
*To be clear, almost all of this EV is in the high-upside outcomes; very few people make 7 figures working jobby jobs.