One takeaway, I think, is that these things which already seem good under common sense are much more important in the longtermist view. For example, I think a longtermist would want extinction risk to be much lower than what you’d want from a commonsense view.
Does this apply to things other than existential risk?
Yes. I think your list of commonsense priorities is even more beneficial from a longtermist view. Factors like “would this have happened anyway, just a bit later?” may still apply and reduce the impact of any given intervention. Then again, considerations like “the sooner we start expanding, the more of the universe we can reach” could be an argument that sooner is better for economic growth as well.