To me, the question is “what are the logical conclusions that longtermism leads to?” The fact that, as of today, we have not exhausted every available intervention becomes less relevant when we are reasoning over hundreds of thousands or millions of years.
I suspect that if someone had an idea about an intervention that they thought was super great and cost effective for future generations and awful for people alive today, well they would probably post that idea on EA Forum just like anything else, and then people would have a lively debate about it.
I agree. The debate would be over whether or not to follow the moral reasoning of longtermism. An intervention that is “awful for people alive today” can be completely in line with longtermism—that could well be the situation. Declining to support such an intervention would then constitute a break between theory and practice.
I think it is important to address the implications of this awkward situation sooner rather than later.