“It’s the relative self vs. others prioritization. That’s a tension that is always going to exist between the FIRE community and the EA community.”—I agree with this statement. However, do you think the selfish-vs.-altruistic trait follows a bimodal or a normal-ish distribution? My intuition is the latter: most people want to do some good but are also somewhat selfish.
This actually leads into outreach strategy. I’m not a community organizer but I know it’s hard work, so kudos to you for doing the meta-work. I want to challenge the “success metric” for the outreach. It sounds like you’re using “who’s coming to an EA meetup” as a proxy. In my opinion, the real beneficiaries of the EA movement are other people, animals, and perhaps future non-biological sentient beings. So I think a better proxy metric would be something like “how much money is donated to these beneficiaries”—one doesn’t need to attend any EA meetup or know about this forum to donate to, e.g., The Humane League. Anecdotally, I have a few “self-interested libertarian” friends and I was able to convince them to donate to the Shrimp Welfare Project recently.
True/pure altruism is indeed rare, but I believe most people are at least semi-altruistic. They (perhaps including me) may not be a good fit for the core EA community, but they’re open to supporting EA causes.
For what it’s worth, Peter Singer’s organization The Life You Can Save has a donation pledge that adjusts the percentage based on your income. You can type in your income and it will give you a percentage back. At $10,000, it’s 0%. At $50,000, it’s 1%. At $100,000, it’s 1.8%. At $500,000, it’s 10%. And at $1,000,000, it’s 15%.
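For a rough feel of how a sliding-scale pledge like this behaves, here’s a minimal sketch using only the data points quoted above. The actual formula The Life You Can Save uses isn’t given here, so the linear interpolation between those points is purely an illustrative assumption:

```python
# Data points quoted above from The Life You Can Save's pledge calculator:
# (annual income in USD, suggested donation percentage).
# Interpolating linearly between them is an assumption for illustration;
# the real calculator's formula may differ.
POINTS = [
    (10_000, 0.0),
    (50_000, 1.0),
    (100_000, 1.8),
    (500_000, 10.0),
    (1_000_000, 15.0),
]

def suggested_percentage(income: float) -> float:
    """Estimate a suggested donation percentage for a given income."""
    # Clamp to the ends of the quoted range.
    if income <= POINTS[0][0]:
        return POINTS[0][1]
    if income >= POINTS[-1][0]:
        return POINTS[-1][1]
    # Linear interpolation within the bracketing segment.
    for (lo_inc, lo_pct), (hi_inc, hi_pct) in zip(POINTS, POINTS[1:]):
        if lo_inc <= income <= hi_inc:
            frac = (income - lo_inc) / (hi_inc - lo_inc)
            return lo_pct + frac * (hi_pct - lo_pct)
    return POINTS[-1][1]
```

The notable property is how gently the percentage ramps up: even at $1 million a year, the suggestion tops out at 15%, and at median-income levels it stays in the low single digits.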
So this pledge is less demanding than the Giving What We Can pledge, and in any case nobody is saying you have to take either pledge to be part of EA.
Most people on the EA Forum don’t seem to have the little blue or orange diamonds next to their usernames. Probably at least a few have taken the Giving What We Can pledge but just haven’t added a diamond, but as far as I know, a lot of people genuinely haven’t taken it. Maybe even the majority, who knows. When I ran an EA group at my university, I think at least about half of the regular, active members hadn’t taken the GWWC pledge, and I’d guess it was probably more than half. (It was a long time ago, and it’s not something we kept track of.)
In my personal experience with EA, I’ve never seen or heard anyone say anything like, “You should/need to take the pledge!” or “Why haven’t you taken the pledge yet?” I’ve never seen anyone try to give someone the hard sell for the GWWC pledge or, for that matter, even try to convince them to take it at all.
Personally, I’m very much a proponent of not telling people what to do, and not trying to pressure people into doing anything. My approach has always been to respect people’s autonomy and simply talk about why I donate, or why I think donating in general is a good idea, to the extent they’re curious and want to know more about those things.
I think where Matthew’s comments resonate is just that it’s hard to understand how your math checks out. For example, the average lifetime earnings of Americans with a graduate degree (which is significantly higher than for all other educational cohorts, including those with only bachelor’s degrees) from age 20 to 69 is $3.05 million (adjusted for inflation from 2015, when this data was collected, to 2025). If you’re earning around $1 million a year, then within about 3 years at that income level, your lifetime earnings will match the average lifetime earnings of Americans with a graduate degree. It’s hard to square the claim that you only want a frugal lifestyle, comparable to living around the U.S. poverty line or, at most, at the U.S. median income, with the claim that you earn around $1 million a year and that donating 10% of your income is too demanding, even accounting for the fact that you want to retire extremely early.
And retiring before age 30 is itself a sort of luxury good. Even if donating 10% of your income would cause you to overshoot your goal by, say, 2 years and retire at age 31 instead of age 29, is that really a flaw in the concept of donating 10% of your income to help the world’s poorest people or animals in factory farms? If it is correct to think of extremely early retirement as a kind of luxury good, then is it all that different for someone to say the 10% pledge asks too much because it would require them to retire at 31 instead of 29 than it would be for someone to say the pledge asks too much because they want to buy a $600,000 Lamborghini? I’m not passing judgment on anyone’s personal choices, but I am questioning if it’s a valid criticism of the GWWC pledge that it might be incompatible with some people acquiring certain luxury goods reserved for the wealthiest 1% of people in high-income countries. So what if it is? Why is that a problem? Why should people in EA want to change that?
But in any case, it’s up to you to decide what percentage you want to donate out of your current income or your investment income after you retire early. If 10% is too onerous, you can donate less than 10%. You could put whatever you expect your income during retirement to be in The Life You Can Save’s calculator and see if you think that would be an amount you’d be comfortable giving after you retire. Every additional dollar donated is a better outcome than one dollar less than that being donated. So, just think about what you want to donate, and donate that.
People in EA already do tend to think in marginal terms and to wonder what the equivalent of the Laffer curve for effective altruism might be. Nobody has ever gotten this down to an economic science, or anything close, but it’s something people have been thinking about and talking about for a long time. My general impression is that most people in EA have been very open to people coming into EA with various levels of commitment, involvement, or donating.
The only real counterexample to this I can think of is when one person who has since (I believe) disassociated themselves from EA argued in defense of the parent organization of the Centre for Effective Altruism purchasing Wytham Abbey. Their argument was that it’s all the better if normal people find this repugnant, since it signals (or countersignals) that EA has weird ideas and morals, and this helps attract the weird people that EA needs to attract to, I don’t know, solve the problems with technical AI alignment research, save the world from an imminent apocalypse, and usher in a post-scarcity utopia. I find this ridiculous and quite a troubling way to think, and I’m glad most people in EA seem to disagree with this view on the Wytham Abbey purchase, and with this kind of view in general about signaling (or countersignaling) correctly so as to attract only the pure minds EA needs.
Maybe there’s still some of that going around, I don’t know, maybe there’s a lot of it, but somehow I get the impression that most people in EA aren’t into gatekeeping or purity of that kind. On the other hand, I’m only really thinking here about joining the movement at the entry level, and if you want a job at an EA organization or something like that, people will probably start to gatekeep and apply purity tests.