Far-future effects are the most important determinant of what we ought to do
“Far future” is an extremely fuzzy concept. We won’t get very far anyway if we don’t solve near-term problems, such as preventing ASI. For me, longtermism really means keeping the space of possible decisions as large as possible for the next generation, rather than determining the future in a way that we today consider “good”. In that sense, it’s not much different from sustainability.