That’s very fair; I should have been a lot more specific in my original comment. I have been a bit disappointed that, within EA, longtermism is so often framed in utilitarian terms—I find the collection of moral arguments for protecting the long-term future presented in The Precipice a lot more compelling, and I wish they would come up more frequently.
I also like the arguments in The Precipice. But per my above comment, I’m not sure they are arguments for longtermism, strictly speaking. As far as I recall, The Precipice argues for something like “preventing existential risk is among our most important moral concerns”. This is consistent with, but neither implied nor required by, longtermism: if you, e.g., thought there were 10 other moral concerns of similar weight and chose to mostly focus on those, I don’t think your view would be ‘longtermist’ even in the weak sense. This is similar to how someone who thinks protecting the environment is somewhat important but doesn’t focus on that concern would not be called an environmentalist.
Yes, I agree with that too—see my comments later in the thread. I think it would be great to be clearer that the arguments for xrisk and longtermism are separate (and neither depends on utilitarianism).
I agree!