Reasons to go higher:
We may believe that existential risk is astronomically bad, such that killing 8 billion people is far more than twice as bad as killing 4 billion people (see the sketch below these lists).
Reasons to go lower:
Certain interventions for reducing x-risk may save significantly fewer than 8 billion present lives, for example much of the work on civilizational resilience/recovery, or anything where most of the payoff arrives on a >20-year timeline.
As a practical matter, longtermist EA has substantially less money than implied by these odds.
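As a rough sketch of the non-linearity in the first point (assuming a per-life value $v$ and a value $V_{\text{future}}$ for the long-run future that extinction forecloses; both symbols are introduced here purely for illustration):

$$\text{Badness}(4\text{B deaths}) \approx 4\text{B}\cdot v, \qquad \text{Badness}(8\text{B deaths, i.e. extinction}) \approx 8\text{B}\cdot v + V_{\text{future}},$$

so whenever $V_{\text{future}} \gg 8\text{B}\cdot v$, killing 8 billion people comes out far more than twice as bad as killing 4 billion.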
Agreed on all points