Thanks for sharing. I think my post covers some different ground from that discussion (e.g. the specific considerations), and it’s valuable to share an independent perspective.
I do agree it touches on many of the same points.
I might not agree with your claim that it’s been a “prominent” part of the discussion; I rarely see it brought up. I also might not agree that “Trajectory Changes” are a slightly cleaner version of “quality risks,” but those points probably aren’t very important.
As to your own comments at the end:
The reason people don’t usually think about trajectory changes (and quality risks) is not that they’ve just overlooked that possibility.
Maybe. Most of the people I’ve spoken with did just overlook the possibility (i.e. they gave it no more than an hour or two of thought, and probably no more than five minutes), but your experience may be different.
It’s that absent some device for fixing them in society, the (expected) impact of most societal changes decays over time.
I’m not sure I agree, although this claim is a bit vague. If society’s value (say, the breadth of its moral circles) is rated on a scale of 1 to 100 at every point in time and currently sits at, say, 20, then even with noise moving it up and down, a one-point shift today increases the expected value at every future time period.
You might mean something different.
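To make my reading concrete, here’s a minimal simulation sketch (everything in it is an illustrative assumption on my part: the 1-to-100 scale, the starting point of 20, and the symmetric noise). It runs two bounded random walks driven by the same noise, one starting one point higher, and checks how much of that gap survives in expectation:

```python
import random

def average_gap(steps=100, trials=10_000, lo=1, hi=100):
    """Run two bounded random walks driven by the SAME symmetric noise,
    one starting at 20 and one at 21, and return the mean gap between
    them after `steps` periods."""
    total_gap = 0.0
    for _ in range(trials):
        a, b = 20, 21  # identical walks apart from a one-point shift
        for _ in range(steps):
            step = random.choice([-1, 0, 1])  # zero-mean noise
            a = min(hi, max(lo, a + step))    # clamp to the 1-100 scale
            b = min(hi, max(lo, b + step))
        total_gap += b - a
    return total_gap / trials

print(average_gap())  # ~0.98: nearly all of the shift persists in expectation
```

Without the clamps the gap would be exactly 1 at every step (the walk is a martingale), so the expected value would be higher at every future period; the boundaries only rarely chip away at it over this horizon.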
However, it is not straightforward to argue that such an ideology would be expected to thrive for millennia when almost all other political and ethical ideologies have not.
I don’t think it’s about having the entire “ideology” survive, just about having it affect future ideologies. If you widen moral circles now, then the next ideology that comes along might have slightly wider circles than it would otherwise.
The challenge that the existential risk community has not yet successfully achieved is to think of ones that are probable and worth moving altruistic resources towards, that could as easily be used to reduce extinction risk.
As a community, I agree. And I’m saying that might be because we haven’t put enough effort into considering them. Although personally, I see at least one of those (widening moral circles) as more promising than any of the extinction risks currently on our radar. But I’m always open to arguments against that.