‘Longtermism’ is a new term, and one that may well become quite common and influential. The aim of giving the term a precise meaning while we still have the chance is to prevent confusions before they arise. This is particularly important if you’re hoping that a research field will develop around the idea, which I think is really crucial.
Some confusions that arose in part because we were slow to give ‘Effective Altruism’ a precise definition: people unsure about how much sacrifice EA required, sometimes seeing it as extremely demanding; people unsure whether you could focus on preserving nature for its own sake and still count as an EA; people seeing it as no different from applied utilitarianism.
Some confusions that are apt to arise with respect to longtermism:
Are we talking about the strong version or the minimal version? The former is a lot more unintuitive; do we want to push that?
How long is ‘long term’? Are you in the longtermist club if you’re focused on the next hundred years? What if you’re focused on climate change? (That’s a bit of important pedantry I didn’t get into in the post!)
Are you committed to a particular epistemology? Ben Kuhn seemed to think so. But my next post is on what I call ‘boring longtermism’, which separates longtermism from some other claims that long-term-oriented EAs tend to endorse.
Is this just a thing for sci-fi nerds? Is this intellectual movement focused only on existential risk, or on something broader? Etc.
Thanks – I agree that confusions are likely to arise somewhere as a new term permeates the zeitgeist.
I don’t think longtermism is a new term within EA or on the EA Forum, and I haven’t seen any recent debates over its definition.
[Edited: the Forum doesn’t seem like a well-targeted place for clarification efforts intended to address potential confusions around this (which seem likely to arise elsewhere).] Encyclopedia entries, journal articles, and mainstream opinion pieces all seem better targeted to where confusion is likely to arise.
Even if the Forum isn’t a “well-targeted place” for a certain piece of EA content, it still seems good for things to end up here, because “getting feedback from people who are sympathetic to your goals and have useful background knowledge” is generally a really good thing no matter where you aim to publish something eventually.
Perhaps there will come a time when “longtermism” becomes enough of a buzzword to justify clarification in a mainstream opinion piece or journal article. At that point, it seems good to have a history of discussion behind the term, and ideally one meaning that people in EA already broadly agree upon. (“This hasn’t been debated recently” ≠ “we all have roughly the same definition and are happy with it”.)