For example, if future value is roughly binary, the increase in its expected value is directly proportional to the decrease in the likelihood and severity of the worst outcomes, in which case existential risk reduction seems particularly useful. On the other hand, if future value is roughly uniformly distributed, focussing on trajectory changes would make more sense.
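To make the proportionality claim concrete, here is a minimal sketch under the binary assumption (the symbols $p_{\text{worst}}$ and $V_{\max}$ are my own notation, not from the post): if outcomes are roughly either near-zero or near the maximum attainable value $V_{\max}$, and $p_{\text{worst}}$ is the probability of the near-zero outcomes, then

$$\mathbb{E}[V] \approx (1 - p_{\text{worst}})\,V_{\max} \quad\Rightarrow\quad \Delta\mathbb{E}[V] \approx V_{\max}\,\Delta p_{\text{worst}},$$

so the gain in expected value is directly proportional to the reduction in $p_{\text{worst}}$, which is the sense in which existential risk reduction looks particularly useful under the binary assumption.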
That depends on what you mean by “existential risk” and “trajectory change”. Consider a value system that says that future value is roughly binary, but that we would end up near the bottom of our maximum potential value if we failed to colonise space. Proponents of that view could find advocacy for space colonisation useful, and some might find it more natural to view that as a kind of trajectory change. Unfortunately it seems that there’s no complete consensus on how to define these central terms. (For what it’s worth, the linked article on trajectory change seems to define existential risk reduction as a kind of trajectory change.)
FWIW I take issue with that definition, as I just commented in the discussion of that wiki page here.
I would agree existential risk reduction is a type of trajectory change (as I mentioned in this footnote). That being said, depending on the shape of future value, one may want to focus on particular types of trajectory change (e.g. x-risk reduction). To clarify, I have added “multiple types of ” before “trajectory changes”.
I don’t think that change makes much difference.
It could be better to be more specific, e.g. to talk about value changes, human extinction, civilisational collapse, etc. Your framing may make it appear that a binary distribution entails that value change interventions, for example, have a low impact, and I don’t think that’s the case.
In my view, we should not assume that value change interventions are ineffective at reducing existential risk, so they may still be worth pursuing even if future value is roughly binary.