I’m not an effective altruist and don’t think I ever will be one. I’m here only out of curiosity and intellectual entertainment. Perhaps this allows me to give you an honest “outside” perspective. My main reasons for not donating 10% of my income or making other similar commitments:
I am instinctively too egoistic, and I don’t like the cognitive dissonance of being a little altruistic but not as much as I reasonably could be. I feel best when I “play for my own side” in life, am productive to get only what I want, and don’t think about the suffering of others. I feel better when I don’t care, and I prefer feeling better over caring. They say giving makes you happy, but I find it brings me no equivalent pleasure.
Society and the law already demand a great deal of altruism and (what they think of as) morality from me. Some of it comes in the form of taxes, some in the form of restrictions on what I can do, some in the form of implicit status attacks. Of course, I get a lot in return, but subjectively it doesn’t feel balanced. Perhaps if I were richer or had higher life satisfaction, I might be more generous beyond what is already demanded.
In many morally relevant domains, there is a discrepancy between what I feel is important and what people in general feel is important. In addition, I have given up on convincing people through value talk. Most people will never value what I value, and vice versa. There are no cost-effective ways to close these discrepancies, and even though EA is a multi-domain endeavor, it is ultimately about empowering humanity to fulfill its preferences, half of which are more or less opposed to mine.
Psychologically, uncertainty cripples my motivation. I am not an “expected utility maximizer”. But in EA, certainty of impact and scope of impact are somewhat negatively correlated. And where positive effects are really certain, I expect the most cost-effective ground will eventually be covered by the EA movement without me, and I’d rather other people pay than me (donor’s dilemma).
These are the core reasons why I have decided against EA in my personal life. This does not preclude small donations or ethical consumption, both of which I make, but it makes me recoil from real commitments, unless I have an unexpected windfall or something.
Thanks for being so honest, Nicholas, really useful to hear your perspective—especially as it sounds like you’ve been paying a fair amount of attention to what’s going on in the EA movement. I can empathise with your point 4 quite a bit, and I think a fair number of others do too—it’s so hard to be motivated if you’re unsure about whether you’re actually making a difference, and it’s so hard to be sure you’re making a difference, especially when we start questioning everything. For me it doesn’t stop me wanting to try, but it does affect my motivation sometimes, and I’d love to know of better ways to deal with uncertainty.
This is a thoughtful post, so thanks for making it. On the other hand, from an EA perspective I hope we don’t waste too much time worrying about people like you.
Put simply, you are weird. (That is not an insult! EA is weird, and the founding communities of philosophers, students, and AI/LW people who primarily make us up are even weirder.)
I suspect that those of us who want to expand and grow the reach of EA should worry about how to spread our ideas to people who cannot use “expected utility maximiser” in a sentence and would never dream of admitting that they are “too egoistic” to prefer helping others over their own feelings. There is much more potential in talking to people who have never thought hard about an issue than in talking to those who have thought hard about it and decided to go another way.
The language is certainly atypical, but I don’t actually think these reasons are weird at all; they’re what I would consider pretty common/core objections to EA and I’ve heard versions of all of these before. I think they’re worth answering, or at least having answers available.
I wouldn’t write it off. These reasons may apply to a lot of people, even if they wouldn’t express them in those words. I found point 2 particularly interesting.
Nicholas, can you please elaborate on point 3? Your thoughts and what you care about might genuinely be things that I and others gathered here should be caring about or taking into account. I’m interested. Please just email me at tomstocker88@gmail.com if they’re so different from humanity’s preferences that they’re problematic to share publicly. Thank you for posting this; even if you are egoistic (I don’t know many people who aren’t, to some extent), you have courage and honesty, which is awesome.