My impression is that these concerns are in practice largely recognized, which is behind the EA focus on extinction or similar permanent changes, i.e. absorbing states. Forecast precision becomes irrelevant after entering an absorbing state, and so do “diminution” and “washing out” (“option unawareness” still seems relevant).
I think you are right that in practice much of Longtermist work is concerned with extinction risk. However, this paper (and others like it from GPI) is concerned with whether Longtermism could have any wider implications. This is part of the value of their work on Longtermism: either discovering or ruling out areas of practical work which don’t involve avoiding extinction/lock-in. So diminution, washing out, etc. are all pretty relevant to the project of widening the implications of Longtermism (or not).