If there are 100 nonillion potential people, there is nothing that could happen in your lifetime that could possibly matter compared to ensuring the continued survival of humanity. All resources should be diverted to preventing existential risk, even if we really don’t know whether these risks are real or these efforts are effective, because the value at stake is simply too large.
If true, so much the worse for other causes. But I don’t think it’s true: longtermism can also imply that we should focus on expanding civilization more quickly or putting humanity on a better trajectory of political development. And if directly working on x-risks is ineffective, then indirect steps like creating a more thoughtful and tolerant culture could be the most effective way to reduce x-risk.
But if we introduce the concept of marginal existence, we can make the question of potential future people and longtermism more tractable. In other words, we need to be more specific about what we condition on. For example, there will almost certainly be another generation after me. Therefore, it is sensible (necessary, even) to condition on the existence of that generation, at which point its members are no longer marginal. Conditional on the existence of that generation, it is a moral good to improve their well-being. There will almost certainly be another generation after that, and many more after that. As long as we agree to condition on their existence, we probably agree that it is morally important to improve their well-being by working on climate change, preventing AI risk, and so on.
Well, sure? It sounds like you’re stating the obvious. Like, of course long-term impacts won’t happen if future generations don’t exist; I thought that wouldn’t need to be stated.