This is an interesting point, and I guess it’s important to make, but it doesn’t exactly answer the question I asked in the OP.
In 2013, Nick Bostrom gave a TEDx talk on existential risk in which he argued that it matters so much precisely because of the 10^umpteen future lives at stake. In the talk, Bostrom referenced even older work by Derek Parfit. (From a quick Google, Parfit's discussion of existential risk is in his book Reasons and Persons, published in 1984.)
I feel like people in the EA community only started talking about “longtermism” in the last few years, whereas they had been talking about existential risk many years prior to that.
Suppose I already bought into Bostrom’s argument about existential risk and future people in 2013. Does longtermism have anything new to tell me?
I guess I think of caring about future people as the core of longtermism, so if you're already signed up for that, I'd say you're already a longtermist. I think most people aren't signed up for that, though.
I agree that if you're already bought into moral consideration for 10^umpteen future people, that's longtermism.