There’s “longtermism” as a label for the group of people who talk a lot about x-risk, AI safety, and pandemics because they hold some weird beliefs in that area
Interesting—When I think of the group of people “longtermists” I think of the set of people who subscribe to (and self-identify with) some moral view that’s basically “longtermism,” not people who work on reducing existential risks. While there’s a big overlap between these two sets of people, I think referring to e.g. people who reject caring about future people as “longtermists” is pretty absurd, even if such people also hold the weird empirical beliefs about AI (or bioengineered pandemics, etc) posing a huge near-term extinction risk. Caring about AI x-risk or thinking the x-risk from AI is large is simply not the thing that makes a person a “longtermist.”
But maybe people have started using the word “longtermist” in this way, and that’s the reason Yglesias worded his post as he did? (I haven’t observed this, but it sounds like you might have.)
Yeah, this feels like the crux — my read is that “longtermist EA” is a term used to encompass “holy shit, x-risk” EA too