Thanks for posting! My current belief is that EA has not become purely about longtermism. In fact, it has recently been argued in the community that longtermism is not necessary to justify the kinds of things we currently work on, since efforts on pandemics or AI Safety can also be justified in terms of preventing global catastrophes.
That being said, I’d very much prefer the EA community’s bottom line to be about doing “the most good” rather than subscribing to longtermism or any other cool idea we might come up with. Those ideas are all subject to change and debate, whereas doing the most good really shouldn’t be.
Additionally, it might be worth highlighting, especially when talking with people unfamiliar with EA, that we deeply care about the suffering of everyone alive today. Quoting Nate Soares:
One day, we may slay the dragons that plague us. One day we, like the villagers in their early days, may have the luxury of going to any length in order to prevent a fellow sentient mind from being condemned to oblivion unwillingly. If we ever make it that far, the worth of a life will be measured not in dollars, but in stars. That is the value of a life. It will be the value of a life then, and it is the value of a life now.