Sorry, this was unclear. I'm not sure we actually disagree, and I apologize if it seemed like I was implying that you haven't done a tremendous amount for the community, or that you didn't hope for its success. I do worry that there is a perspective (one you seem to share) that if we magically removed all the epistemic issues with knowing the long-term impacts of decisions, longtermists would no longer be aligned with others in the EA community.
I also think, as mentioned in a different comment, that longtermism is plausibly far better as a philosophical position than as a community, but that point is even farther afield and would need its own post and a far more in-depth discussion.