How would you define longtermism so that it isn't pretty much EA by definition? That is, a longtermism that isn't necessarily motivated primarily by consequences for people in the future? I think GPI may have explored some such views, but they seem close enough to EA that we don't need a new term.
If we’re including progress studies, why not international development, global health, AI safety, biosecurity, nuclear security, social movements, animal ethics, vegan studies, conflict and peace studies, transhumanism, futurism, philosophy of mind, etc.? Is progress studies more cause-neutral?
I think EA and Rationality are fine.