It was a very quick lower bound. From the LT survey a few years ago, roughly ~50% of influences on quality-adjusted work in longtermism were from EA sources (as opposed to individual interests, idiosyncratic non-EA influences, etc.), and of that slice, maybe half is due to things that look like EA outreach or infrastructure (as opposed to, e.g., people hammering away at object-level priorities and getting noticed).
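The arithmetic behind that lower bound, as a rough sketch (both fractions are the approximate figures from the comment above, not precise survey results):

```python
# Back-of-envelope lower bound: what fraction of quality-adjusted
# longtermist work is attributable to EA outreach/infrastructure?
ea_influence_share = 0.5      # ~50% of influences came from EA sources (rough survey figure)
outreach_share_of_that = 0.5  # maybe half of that slice looks like outreach/infrastructure

lower_bound = ea_influence_share * outreach_share_of_that
print(lower_bound)  # 0.25, i.e. ~25% of quality-adjusted LT work
```

So the quick estimate says on the order of a quarter of quality-adjusted longtermist work traces back to outreach/infrastructure, which is what the comparison below leans on.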
And then I think about whether I’d rather (a) have all EAs except one disappear and gain 4B, or (b) have 4B less but double the quality-adjusted number of people doing EA work. And I don’t think the answer is very close.