I agree, EA is a movement of different but compatible values, and given its existence, I don’t want to force anything on it, or force anyone to change their values. It’s a great collaboration of a number of people with different perspectives, and I am glad it exists. Indeed the interests of different people in the community are pretty compatible, as evidenced by the many meta interventions that seem to help many causes at the same time.
I don’t think my interests are incompatible with most of EA, and am not sure why you think that? I’ve clearly invested a huge amount of my resources into making the broader EA community better in a wide variety of domains, and generally care a lot about seeing EA broadly get more successful and grow and attract resources, etc.
But I think it’s important to be clear which of these benefits are gains from trade, vs. things I “intrinsically care about” (speaking a bit imprecisely here). If I could somehow get all of these resources and benefits without having to trade things away, and instead just build something of similar scale and success that was more directly aligned with my values, that seems better to me. I think historically this wasn’t really possible, but with longtermist stuff finding more traction, I am now more optimistic about it. But I also still expect EA to provide value for the broad range of perspectives under its tent, and expect that investing in it in some capacity or another will continue to be valuable.
Sorry, this was unclear. I’m not sure that we disagree, and I want to apologize if it seemed like I was implying that you haven’t done a tremendous amount for the community, or didn’t hope for its success, etc. I do worry that there is a perspective (which you seem to agree with) that if we magically removed all the various epistemic issues with knowing the long-term impacts of decisions, longtermists would no longer be aligned with others in the EA community.
I also think that longtermism is plausibly far better as a philosophical position than as a community, as mentioned in a different comment, but that point is even farther afield, and needs a different post and a far more in-depth discussion.
I think we don’t disagree?