“However, effective altruism really is warm and calculating.”
I can’t believe I’ve never thought of this! That’s great :)
Great post, too. I think EA has a helpful message for most people who are drawn to it, and for many people that message is overcoming status quo indifference. However, I worry that caring too much, as in overidentifying with or feeling personally responsible for the suffering of the world, is also a major EA failure mode. I have observed that most people assume their natural tendency towards either indifference or overresponsibility is shared by basically everyone else, and this assumption determines what message they think the world needs to hear. For instance, I’m someone who’s naturally overresponsible. I don’t need EA to remind me to care. I need it to remind me that the indiscriminate fucks I’m giving are wasted, because they can take a huge toll on me and aren’t particularly helping anyone else. Hence, within EA, I talk a lot about self-care and the pitfalls of trying to be too morally perfect. When spreading the word about EA, I emphasize the moral value of prioritization and effectiveness because that’s what was missing for me.
EA introduced me to many new things to care about, but the only reason I didn’t care about them before was that I hadn’t realized they were actionable. This might be quibbling, but I wouldn’t say I was indifferent before—I just had limiting assumptions about how I could help. I side more with Aaron’s “unawareness” frame on this.