This is a really big problem for EA. When you have people taking seriously such an overarching principle, you end up with stressed, nervous people, people anxious that they are living wrongly. The correct critique of this situation isn’t the one Singer makes: that it prevents them from doing the most good. The critique is that it is the wrong way to live.
It seems natural to me to think of the “most good” as also including taking care of myself effectively. How many resources I need to be happy and effective is something I strive to use more efficiently over time; in a sense, it is learning how to be yourself better, with the caveat that one should be cautious about becoming too optimistically lean. I do agree that it is a common trap that needs to be addressed, and perhaps taught earlier on, but I’m not sure the core tenets of EA can be simplified any further at this stage. Perhaps something like “striving for more good”?
Illegibility: A common argument against EA is that it undervalues illegible activity.
I think it’s clear why legible activities are valued more: science stays close to the empirical, after all, and it’s easier to be convincing with evidence, which also supports your point. I think the real issue here is that the nuance of science should be emphasized more: science respects the unknown and the unmeasurable rather than ignoring them. I think this is a more general human problem rather than one specific to EA, but it should be addressed all the same.
EA as a cult
I think of EA as more of a mental and linguistic framework than a belief system. People adopt EA because of their belief systems rather than believing in EA itself. The cult aspect goes away if you don’t care about belonging to a group; conversely, the more you care about belonging to a group, the more likely you are to find yourself in a cult. However, since this is a general problem that gets in the way of “doing the most good,” it is also something EA folks should be wary of, which goes back to your point about EA ideally being self-correcting.