I’m a mathematician working on collective decision making, game theory, formal ethics, international coalition formation, and a lot of stuff related to climate change. Here’s my professional profile.
My definition of value:
I have a wide moral circle (including aliens, as long as they can enjoy or suffer life).
I have a zero time discount rate, i.e., I value the future as much as the present.
I am (utility-) risk-averse: I prefer a sure 1 util to a fair coin toss between 0 and 2 utils.
I am (ex post) inequality-averse: I prefer 2 people each getting 1 util for sure to one getting 0 and the other getting 2, both for sure.
I am (ex ante) fairness-seeking: I prefer 2 people each getting an expected 1 util to one getting an expected 0 and the other an expected 2.
Despite all this, I am morally uncertain.
Conditional on all of the above, I also value beauty, consistency, simplicity, complexity, and symmetry.
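The three attitudes above can be told apart with a toy formalization (my own sketch, not a claim about how I formally model them; the square root stands in for an arbitrary concave function):

```python
from math import sqrt

def risk_averse_value(lottery):
    """Value of a lottery over TOTAL utils: concave transform, then expectation.
    lottery is a list of (probability, total_utils) pairs."""
    return sum(p * sqrt(u) for p, u in lottery)

def ex_post_value(allocation):
    """Value of one certain outcome: concave transform of each PERSON's utils."""
    return sum(sqrt(u) for u in allocation)

def ex_ante_value(expected_utils):
    """Concave transform of each person's EXPECTED utils."""
    return sum(sqrt(e) for e in expected_utils)

# risk aversion: a sure 1 util beats a fair coin toss between 0 and 2 utils
assert risk_averse_value([(1.0, 1)]) > risk_averse_value([(0.5, 0), (0.5, 2)])

# ex post inequality aversion: (1, 1) for sure beats (0, 2) for sure
assert ex_post_value([1, 1]) > ex_post_value([0, 2])

# ex ante fairness: both expecting 1 beats one expecting 0, the other 2
assert ex_ante_value([1, 1]) > ex_ante_value([0, 2])
```

The same concave function drives all three comparisons; what differs is whether it is applied to totals, to ex post individual outcomes, or to ex ante individual expectations.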
Related to that:
Your figure says
but consider a certain baseline trajectory A on which
longterm population = 3 gazillion person life years for sure
average wellbeing = 3 utils per person per life year for sure,
so that their expected product equals 9 gazillion utils, and an uncertain alternative trajectory B on which
if nature’s coin lands heads, longterm population = 7 gazillion person life years but average wellbeing = 1 util per person per life year
if nature’s coin lands tails, longterm population = 1 gazillion person life years but average wellbeing = 7 utils per person per life year,
so that their expected product equals (7 x 1 + 1 x 7) / 2 = 7 gazillion utils.
Then an event that changes the trajectory from A to B is a longtermist regress, since it reduces expected utility from 9 to 7 gazillion utils.
But it is NEITHER a contraction NOR an average wellbeing decrease. In fact, it is BOTH an Expansion, since the expected longterm population increases from 3 to 4 gazillion person life years, AND an average wellbeing increase, since the expected average wellbeing increases from 3 to 4 utils per person per life year. The discrepancy arises because population and wellbeing are perfectly anti-correlated on B, so the expected product (7) falls well short of the product of the expectations (4 x 4 = 16).
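The arithmetic above can be checked with a short Python sketch (my own notation, not from the post; each (population, wellbeing) outcome is equally likely under nature's fair coin):

```python
# Units: population in gazillion person-life-years, wellbeing in utils
# per person per life year.
A = [(3, 3)]            # certain baseline trajectory
B = [(7, 1), (1, 7)]    # heads / tails of nature's fair coin

def expect(trajectory, f):
    """Expected value of f(outcome) under a uniform lottery over outcomes."""
    return sum(f(o) for o in trajectory) / len(trajectory)

for name, traj in [("A", A), ("B", B)]:
    e_pop = expect(traj, lambda o: o[0])           # expected population
    e_well = expect(traj, lambda o: o[1])          # expected average wellbeing
    e_utils = expect(traj, lambda o: o[0] * o[1])  # expected total utils
    print(f"{name}: E[pop]={e_pop}, E[wellbeing]={e_well}, E[utils]={e_utils}")
```

This reproduces the numbers in the example: A gives expected utility 9 while B gives only 7, even though B's expected population and expected average wellbeing are both 4, i.e., higher than A's.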