To me, four important barriers to effective altruism are cognitive dissonance, akrasia, arrogance, and value erosion. More precisely:
cognitive dissonance as deliberately choosing to be nasty so as to gain some small amount of fungible resource which can be spent on effective charity
akrasia as choosing not to give because somehow you don’t feel like it, hoarding your money because you can always spend it later
arrogance as believing that because you have access to and trust in a specific piece of knowledge (in this case charity effectiveness), you will have vastly more effect on the world than an average person
value erosion as future selves deciding they don’t care about animals after all.
I think cognitive dissonance and value erosion work similarly here, and both point in favour of veganism.
Arrogance is a complicated one because it might actually be true that you have a huge positive effect compared to an average person (it’s kind of what we’re striving for). But actually alieving it might be problematic, and it might make sense to just be a vegan and unplug your phone chargers when not in use in order to feel more normal.
Akrasia could work both ways—there’s a possibility that veganism could “use up” your charitableness, which would certainly be a bad thing. But on the other hand veganism might help you integrate socially with other vegan activists, which might be a motivating factor to give.
deliberately choosing to be nasty so as to gain some small amount of fungible resource which can be spent on effective charity
I’m sympathetic to this idea, but I’m not sure when to apply it. For example, if someone comes to my door asking for money for a charity I think is inefficient, am I “deliberately choosing to be nasty” in the way you describe?
The proposed effect is psychological, so presumably the distinction should be psychological—that one shouldn’t do things one feels are nasty?
I don’t think most people really alief that eating meat is nasty; at least, I didn’t until I became vegetarian and internalized those feelings over the course of about a month. Does whether a person aliefs that eating meat is nasty matter to this effect?
Good questions! I guess there are times when our feeling of nastiness can be exploited, and in those cases we have to bypass it. If you always give money to people at the door, they could just turn up the next day asking for more—it may or may not be a “nice feeling” strategy but it wouldn’t be a successful one.
I think that someone’s aliefs about eating meat are relevant to the cognitive dissonance concept. In the case where somebody eats meat and doesn’t alief that eating meat is nasty, I can imagine three subcases:
Person doesn’t care about nonhuman animals, or is unaware of the cruelty issue
Compartmentalization
Eating meat is actually the EA thing to do, and all the for/against arguments have been internalized
In the case where somebody eats meat and does alief that eating meat is nasty, I can imagine:
Cognitive dissonance
Compartmentalization
I think cognitive dissonance and value erosion work similarly here, and both point in favour of veganism.
But they may point against the spreading of veg*nism. By spreading the idea that animals in factory farms suffer, you may cause people to decide that they don’t care about animals after all. (It seems like every person who is unconvinced of your arguments for veg*nism is at risk for this.)
cognitive dissonance as deliberately choosing to be nasty so as to gain some small amount of fungible resource which can be spent on effective charity
Do you mean that choosing to be nasty can cause us to come to prefer nastiness (as a result of cognitive dissonance), and that this is an argument in favor of being nice?
Akrasia could work both ways—there’s a possibility that veganism could “use up” your charitableness, which would certainly be a bad thing. But on the other hand veganism might help you integrate socially with other vegan activists, which might be a motivating factor to give.
Alternatively, behaving altruistically could motivate you to pursue additional altruistic behaviors, creating a positive feedback cycle. This seems most plausible to me.
Do you mean that choosing to be nasty can cause us to come to prefer nastiness
Being nasty in order to achieve some greater good requires complicated reasoning, which can feel wrong. I’d argue that it’s best to limit the amount of that kind of reasoning that we subscribe to—it feels like it could be demotivating, or that we could become desensitized to the feeling of wrongness, or something.
I agree.