Timon—the whole point of EA was to get away from the kind of vacuous, feel-good empathy-signaling that animated most charitable giving before EA.
EA focuses on causes that have large scope but are also tractable and neglected. These three criteria are the exact opposite of what one would focus on if one simply wanted to signal being ‘warm’ and ‘empathic’—which works best when focusing on specific identifiable lives (small scope), facing problems that are commonly talked about (not neglected), and that are intractable (so the charity can keep running without the problem ever actually getting solved).
In my view, it’s entirely a good thing that EA has this focus. And it’s inevitable that some people who can’t understand scope-sensitivity would feel like it’s ‘heartless’ and overly quantitative.
It’s helpful to bear in mind psychologist Paul Bloom’s distinction between ‘empathy’ and ‘rational compassion’. EA, as I understand it, tries to do the latter.
I agree with all this, and I also think the OP might be speaking to some experiences in EA that you might not have had, which could result in you talking past each other.
Thanks Geoffrey for raising this point. I agree that emotional empathy as defined by Paul Bloom can lead to bias and poor moral judgement, and I also appreciate the usefulness of the rational EA ideas you describe. I don’t want to throw them out the window, and I agree with Sam Harris when he says “Reason is nothing less than the guardian of love”. I also agree that it is important to focus on effectiveness when judging where to give your money.

I was trying to make a very different point: that we should not dismiss the caring that might still be involved in well-intentioned but poorly executed interventions. I have tried to make the case for being kind and for not dismissing human qualities simply because they do not appear to be efficient, and to show how following these rational ideas too far, or in the wrong way, can lead to negative social consequences, so it is important to keep a balance.
In the context of the less effective charities you describe, the problem I see is not warmth or caring, but bias and naivety. To care is to understand: to understand the cause of suffering and the best way to alleviate it.

I would also like to point out that while Paul Bloom makes a clear case for the problems emotional empathy creates for moral judgement, at the end of the book he emphasises its value in social contexts. Also, I was not trying to argue for this kind of empathy, but rather talking about emotional maturity, compassion and kindness. I think kindness can be made impartial, so that it is consistent with moral values, but also so that other people feel that they are dealing with a human being, not a robot.
I’m not advocating going back to being naive and prejudiced, but rather being careful not to exclude human traits like empathy in everyday social interactions just because they might lead to bias when thinking about charity. Wisdom requires emotional as well as rational maturity.