I’d like to push back a bit on that—it’s so common in the EA world to say, if you don’t believe in malaria nets, you must have an emotional problem. But there are many rational critiques of malaria nets. Malaria nets should not be this symbol where believing in them is a core part of the EA faith.
"it's so common in the EA world to say, if you don't believe in malaria nets, you must have an emotional problem."
I’m not saying that.
The point I was trying to make was actually the opposite: even for "cold and calculating" EAs, it can be emotionally difficult to choose the intervention (in this case, malaria nets) that doesn't give you the "fuzzies" or the feeling of doing good that something else might.
I was trying to say that it's normal to feel that some decisions are emotionally harder than others, and framings that focus on that can come across as dismissive of other people's actions. (Of course, I didn't elaborate on this in the original comment.)
"Malaria nets should not be this symbol where believing in them is a core part of the EA faith."
I don't make this claim in my comment; I'm just using malaria nets as an example since you used them earlier, and they're an accepted shorthand for "commonly recommended effective intervention" (though maybe we should just say that instead of using the shorthand).
I think I sit somewhere between you both: broadly, we think that there shouldn't be "one" road to impact, whether that be bed nets or something else.
Our explicit purpose is to use EA frameworks and thinking to help people reach their own conclusions. We think that common EA causes are very promising and very likely to be highly impactful, but we err on the side of caution and avoid being overly prescriptive.