We should change the longtermism pitch to make it more relatable for people who would otherwise be deterred. Avoid (at first) mentioning trillions of potential future people or even existential risk as such, and instead emphasise how these risks (pandemics, nuclear war, etc.) can affect actual people within our lifetimes.
Or perhaps pitch a ‘shorter’ version of longtermism (which would also be easier to model):
Your lifetime, plus those of your children, grandchildren, and x generations into the future. That horizon is more or less a given, whereas reaching the much larger numbers requires assumptions (e.g. humanity spreading to the stars) that are more open to dispute.