I haven’t met anyone who’s working on this stuff and says they’re deferring on the philosophy (while I feel like I’ve often heard that people feel iffy/confused about the empirical claims).
Fair—maybe I feel that people mostly buy ‘future people have non-zero worth and extinction sure is bad’, but may be more uncertain about a totalising view like ‘almost all value is in the far future, stuff today doesn’t really matter, moral worth scales with the total number of future people, which could easily be >=10^20’.
I’m sympathetic to something along these lines. But I think that’s a great case (from longtermists’ lights) for keeping longtermism in the curriculum. If one week of readings has a decent chance of boosting already-impactful people’s impact by, say, 10x (by convincing them to switch to 10x more impactful interventions), that seems like an extremely strong reason for keeping that week in the curriculum.
Agreed! (Well, by the lights of longtermism at least—I’m at least convinced that extinction is 10x worse than a temporary civilisational collapse, but maybe not 10^10x worse.) At this point I feel like we mostly agree—keeping a fraction of the content on longtermism, after x-risks, and making it clear that it’s totally legit to work on x-risk without buying longtermism would make me happy.