Well, it’s unfortunate, but the fundamental goal of Effective Altruism is rational investment, and that means, among other things, not investing based on emotions.
This is wrong; it’s a false dilemma (the black-and-white fallacy). Emotions are an important channel of data, and leaving them out of the calculation leads to false conclusions. Look up bilateral amygdala damage or frontotemporal dementia: people who lose emotional processing make measurably worse decisions, not better ones.
EA discourages emotion-only or emotion-dominated decision-making; it doesn’t banish emotion altogether. If emotion played no part in EA, we would simply give every dollar to bednets in Africa and ignore every other cause.
Maybe I’m misreading your argument, but you seem to be saying there is a legitimate case for 100% investment in humans, even at the cost of obliterating the rest of the animal kingdom. The ecosystem we rely on for survival would collapse.
I might even agree that the planet would be better off long term if we went to the opposite extreme: devote 100% to animals and obliterate all the people. There are (at least) two things wrong with that scenario, though:
It values human flourishing at zero, which is antithetical to EA.
We might be the only higher consciousness in the Galaxy, or even the Universe, and there is a non-zero chance that ours is the only one that ever arises. It would be a massive loss to have the next googol years play out with no one there to witness them.