We have two categories (“Moral Philosophy”, “Long-Term Risks and Flourishing”) which capture lots of material relevant to longtermism.
As for the cause area section specifically:
AI is its own cluster because we currently have an enormous number of articles about it. If we only had one article about AI risk, I’d put it under “Global Catastrophic Risks” and that would be that.
The “Global Catastrophic Risks (other)” cluster feels well-defined to me in a way that a “longtermist” cluster wouldn’t. When I look at the “Other” cluster, most of the seemingly “longtermist” causes are still things that many people work on hoping to achieve substantial change within their lifetimes, for the sake of present-day people — anti-aging research, land use reform, climate change...
If you ask me about a cause area in that section, I can fairly confidently say whether or not it counts as a GCR. In many cases, I wouldn’t be able to say whether or not it counted as “longtermist”. (And as you mention, many of the areas could be prioritized for longtermist or non-longtermist reasons.)
I think of longtermism as a common value system in EA. Many causes seem especially valuable to work on given a longtermist value system, but few such causes require a longtermist value system to make sense. (But I spend less time thinking about this kind of thing than you do, so I’m open to counterpoints I might not be considering.)