I think the right framing here is the question "should EA praise such people, or get annoyed that they're not giving up more, if it wants to keep a sufficient filter for who it calls true believers?", and the answer there is obviously that both groups are great & true believers, and it seems dumb to get annoyed at either.
The 10% number was notably chosen for these practical reasons (there is nothing magic about that number), and to back-justify that decision with bad moral philosophy about “discharge of moral duty” is absurd.
I’m not going to defend my whole view here, but I want to give a thought experiment as to why I don’t think “shadow donations”—the delta between what you could earn if you were income-maximizing and what you’re actually earning in your direct-work job—are a great measure for the purposes of practical philosophy (though I agree they are both a relevant consideration and a genuine sacrifice).
Imagine two twins, Anna and Belinda. Both have just graduated with identical grades, skills, degrees, etc. Anna goes directly from college to work on AI safety at Safety Org, making $75,000 / year. Belinda goes to work for OpenMind doing safety-neutral work, making $1M per year in total compensation. Belinda learns more marketable skills; she could make at least $1M / year indefinitely. Anna, on the other hand, has studiously plugged away at AI safety work, but since her work is niche, she can’t easily transfer these skills to do something that pays better.
Then imagine that, after three years, Belinda joins Anna at Safety Org. Belinda was not fired; she could have stayed at OpenMind and kept making $1M per year indefinitely. At this point, Anna has gotten a few raises and is making $100,000, and donating 3% of her salary. Belinda gets the same job on the same pay scale, and does equally good work, but donates nothing. Belinda reasons that, because she could still be making $1M per year, she has “really” donated $900,000 of labor to Safety Org, and so has sacrificed roughly 90% of her income.
Anna, on the other hand, thinks that it is an immense privilege to have a comfortable job where she can use her skills to do good while still earning more than 99% of all people in the world. She knows that, if she had made different choices in life, she probably could have had a higher earning potential. But that has never been her goal in life. Anna knows that the average person in her income bracket donates around 3% regardless of their outside job options, so it seems reasonable for her to at least match that.
Is Belinda more altruistic than Anna? Which attitude should EAs aspire to?
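To make the “shadow donation” arithmetic concrete, here is a minimal sketch of the two measures being compared, using only the figures from the thought experiment above (the function names are mine, purely for illustration):

```python
# Two ways of scoring "how much" someone is giving, using the figures
# from the thought experiment above (Anna and Belinda at Safety Org).

def actual_donation_rate(income: float, donated: float) -> float:
    """Fraction of actual income given away."""
    return donated / income

def shadow_donation_rate(counterfactual_income: float, actual_income: float) -> float:
    """Fraction of income-maximizing earnings forgone by taking the direct-work job."""
    return (counterfactual_income - actual_income) / counterfactual_income

# Anna: $100,000 salary at Safety Org, donating 3% of it.
print(actual_donation_rate(100_000, 3_000))        # 0.03

# Belinda: could earn $1M at OpenMind, actually earns $100,000, donates nothing.
print(shadow_donation_rate(1_000_000, 100_000))    # 0.9, the "sacrificed roughly 90%" claim
```

Both quantities are well defined; the dispute below is over which of the two community norms and credit should track.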
To give some more color on my general view:
I don’t really think there’s a first-order fact of the matter as to who of these two (or anyone) is “more altruistic,” or what one’s “obligations” are. At bottom, there are just worlds with more or less value in them.
My position mostly comes from a practical view of how the EA community and project can be most impactful, credible, and healthy. I think the best attitude is closer to Anna’s than Belinda’s.
Donating also has other virtues over salary reductions, since it is concrete, measurable, and helps create a more diversified funding ecosystem.
To be clear, I think it’s great that people like Belinda exist, and they should be welcomed and celebrated in the community. But I don’t think the particular mindset of “well I have really sacrificed a lot because if I was purely selfish I could have made a lot more money” is one that we ought to recognize as particularly good or healthy.
I will note that my comment made no reference to who is “more altruistic”. I don’t know what that term means personally, and I’d rather not get into a semantics argument.
If you give the definition you have in mind, then we can argue over whether it’s smart to advocate that someone ought to be more altruistic in various situations, and whether it gets at intuitive notions of credit assignment.
I will also note that, given the setup, it’s not clear to me that Anna’s proper counterfactual here isn’t also making $1M and getting nice marketable skills, since she and Belinda are twins, and so have the same work capacity & aptitudes.
> To be clear, I think it’s great that people like Belinda exist, and they should be welcomed and celebrated in the community. But I don’t think the particular mindset of “well I have really sacrificed a lot because if I was purely selfish I could have made a lot more money” is one that we ought to recognize as particularly good or healthy.
I think this is the crux personally. This seems very healthy to me, in particular because it creates strong boundaries between the relevant person and EA. Note that burnout & overwork are not uncommon in EA circles! EAs are not healthy, and (imo) already give too much of themselves!
Why do you think it’s unhealthy? This seems to imply negative effects on the person reasoning in the relevant way, which seems pretty unlikely to me.
Suppose they’re triplets, and Charlotte, also initially identical, earns $1M/year just like Belinda, but can’t, or doesn’t want to, switch to safety work. How much of her income should Charlotte donate, on your worldview? What is the best attitude for the EA community?
I didn’t read Cullen’s comment as about 10%, and I think almost all of us would agree that this isn’t a magic number. Most would probably agree that it is too demanding for some and not demanding enough for others. I also don’t see anything in Cullen’s response about whether we should throw shade at people for not being generous enough or label them as not “true believers.”
Rather, Cullen commented on “donation expectations” grounded in “a practical moral philosophy.” They wrote about measuring an “obligation to donate.”
You may think that’s “bad moral philosophy,” but there’s no evidence of it being a post hoc rationalization of a 10% or other community giving norm here.