For a given individual, is their probability of making the difference higher for averting extinction or for some other long-term trajectory change? If you discount small enough probabilities of making a difference, or are otherwise difference-making risk averse (as an individual), would one of these come out ahead as a result?
Some thoughts: extinction is a binary event, but there’s a continuum of possible values that future agents could have, including under value lock-in. A small tweak in locked-in values seems more counterfactually achievable than being the difference in whether we go extinct, and such a tweak would still have astronomical impact if those values persist into the far future. So it seems like value change might depend less on very small probabilities of making a difference.
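To make the intuition concrete, here is a minimal illustrative sketch. All numbers are made up, and the discounting rule (ignore any chance of making a difference below a threshold) is just one crude way to model difference-making risk aversion:

```python
# Illustrative only: numbers and the threshold rule are hypothetical.

THRESHOLD = 1e-7  # cutoff below which a difference-making probability is ignored

def discounted_value(p_difference, value_if_difference, threshold=THRESHOLD):
    """Expected value, with very small difference-making probabilities zeroed out."""
    if p_difference < threshold:
        return 0.0
    return p_difference * value_if_difference

# Two hypothetical interventions (long-run value in arbitrary units):
extinction_averted = discounted_value(1e-9, 1e15)  # tiny chance of being *the* difference
value_tweak = discounted_value(1e-4, 1e10)         # larger chance of a small locked-in tweak

print(extinction_averted)  # 0.0   -> discounted away entirely
print(value_tweak)         # 1e6   -> survives the discount
```

On these made-up numbers the value tweak wins simply because its probability of making a difference clears the threshold; the point is only that the comparison turns on where the probabilities sit relative to whatever discounting rule one adopts.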