Pursuing (or influencing others to pursue) larger cardinal numbers of value, e.g. creating or preventing the existence of ℵ₅ possible beings, seems sufficiently neglected relative to extinction risk reduction, and the chance of value lock-in seems high enough, that increasing or decreasing the expected resources used to generate such higher cardinalities of (dis)value, or improving their quality, conditional on an advanced stable civilization, looks at least roughly as promising as extinction risk reduction for a scope-sensitive expected value maximizer. (However, plausibly you should just be indifferent to everything if you aggregate value before taking differences rather than after.)
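To spell out the final parenthetical, here is a toy sketch, assuming totals are measured by cardinality and standard cardinal arithmetic (where addition absorbs: $\kappa + \mu = \max(\kappa, \mu)$ for infinite cardinals), and writing $V$ for aggregate value and $v_i$ for value at location $i$ (my notation, nothing load-bearing):

$$\aleph_5 + \mu = \aleph_5 \quad \text{for every cardinal } \mu \le \aleph_5,$$

so if both acting and not acting produce worlds containing ℵ₅ units of value, aggregating first gives $V(\text{act}) = V(\text{omit}) = \aleph_5$ and hence indifference between them. Taking differences first, location by location, and only then aggregating gives

$$\sum_i \bigl( v_i(\text{act}) - v_i(\text{omit}) \bigr) = \mu > 0$$

when acting improves $\mu$ locations by one unit each, so the intervention can still come out ahead even though the two grand totals are the same cardinal.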