EA != minimize suffering

Most EAs are extraordinarily smart in many ways, but in one way they are naive. The best-known EAs have stated that the goal of EA is to minimize suffering. I can’t articulate why, but I’m certain that minimizing suffering is neither the cause nor the effect of altruism as I understand it.

Consider The Giver, where a community eliminates pain at the cost of depth of feeling. Or consider a world where everyone is high on opiates all the time. There is no suffering, but no beauty either. Would you disturb it?

Given that, my immediate reaction is to restate the goal of EA as maximizing the difference between happiness and suffering. But this still seems naive. Happiness and suffering are so interwoven that I’m not sure the two can be separated. The disappointment of being rejected by a girl may help you come to terms with reality. The empty feeling in the pit of your stomach when your fantasy world crumbles motivates you to find something more fulfilling.

It’s difficult to say. Maybe one of you can restate it more plainly. This isn’t an argument against EA. It’s an argument that, while we probably do agree on which actions are altruistic, the criteria used to explain altruism are oversimplified.

I don’t know if there is much to be gained by having criteria to explain altruism, but I am tired of “reducing suffering.” I prefer to think of it as doing what I can to positively impact the world, and using EA to maximize that positive impact where possible. Because altruism isn’t always as simple as deciding where to send your money.