Hi Jamie,

I think it’s best to think about the importance of EA as a matter of degree. I briefly mention this in the post:
Moreover, we can say that it’s more of a mistake not to pursue the project of effective altruism the greater the degree to which each of the premises holds. For instance, the greater the degree of spread, the more you’re giving up by not searching (and the same for the other two premises).
I agree that if there were only, say, 2x differences in the impact of actions, EA could still be very worthwhile. But it wouldn’t be as important as in a world where there are 100x differences. I talk about this a little more in the podcast.
I think ideally I’d reframe the whole argument to be about how important EA is rather than whether it’s important or not, but the phrasing gets tricky.