There was a great thread in the Facebook group on whether people making a modest wage (around or below $30k/yr in US terms) should be donating to effective charities or saving money. I’d like to weigh in, but that thread is already pretty crowded and unstructured.
The proposition here is “People with average or below average income should save money rather than donate to effective charities”
One thing that almost nobody mentioned is the opportunity cost of worrying about other people over yourself, and how this affects effective altruistic output. It seemed from Facebook that most EAs were against the proposition, claiming that most people in the developed world are still far better off than X% of the global population and should therefore still be donating some percentage of their wealth. I believe there is a strong case to be made that focusing on optimizing one’s own career capital, not just making smart personal finance decisions, will enable one to earn substantially higher income in the future and thus be a more “E” EA. Any intellectual power devoted to understanding the EA argument (doing the relevant research, picking an EA organization or EA-recommended charity to donate to, and “stretching your EA muscles” by donating a small amount at regular intervals) is a small investment in terms of money but a large investment in terms of intellectual capital, one that I think dedicated EAs tend to discount because they have already made it. This is a CFAR-esque argument that advocates focusing on personal development and improvement until one is in a position to reasonably maximize one’s own output, both in terms of income and effectiveness of donations.
I am still uncertain about my position in this debate, but it seemed that most EAs (at least on Facebook) were strongly against the proposition, so I would like to see more discussion that takes the above points into consideration.
I was involved in the initial Facebook thread on the topic. At the time, I made less than $30k, didn’t ever expect to make much more than $30k (I’m a nanny), and was highly turned off by the conversation.
Two cross-country moves later, I have actually doubled my income, but I still am highly turned off by elitist EA conversations that assume that all the readers are high-potential-earners in their 20s with strong social safety nets.
It would have been much easier to convince me to donate 10% of a $30k income than to upend my life in order to make some kind of career change.
“I still am highly turned off by elitist EA conversations that assume that all the readers are high-potential-earners in their 20s with strong social safety nets.”
Sure, but I think your use of the term elitist is a bit unfair here. I personally know many friends who view my identifying as an EA as itself elitist: by trying to help with things like alleviating global poverty through targeted donation, I am (so the argument goes) putting myself on a pedestal above people living in the developing world. To these friends, it’s less elitist to focus on pursuing your own happiness than to think you can solve other people’s problems better than they can. Maybe this is why I am arguing for this angle of attack; I have friends with different off-putting triggers than you.
I agree that we shouldn’t make broad generalizations about EA demographics, but at the same time we shouldn’t misrepresent them; I would wager that a large number, if not the majority, of prospective EAs fall under the high-potential-earners-in-their-20s demographic, and this is very relevant in the discussion of how to advise people who are just getting into EA. I definitely agree that the same advice wouldn’t work equally well when addressing every person, and sometimes it’s correct to give two different people completely opposite advice. That being said, if I had to give one piece of advice in a generalizing way, I would want to consider the demographic I am giving this advice to rather than assuming it is directed towards the median US citizen, for instance.
I think this is what Ryan is saying, but I want to say it again and say more, because I feel strongly and because Ryan left a lot of inferential distance in his post.
I dislike the idea that EA is mostly attractive or mostly applicable to its current dominant demographic of math/econ/tech-interested people in their 20s. I think the core ideas of EA are compelling to a wide variety of people, and that EA can benefit from skills outside of its current mainstream. It seems likely to me that the current situation is more the result of network effects than that EA is not interesting to people outside of this cluster.
Catering our “general” advice to only one sort of person makes it more likely that other types of people will feel lost or unwelcome and not pursue their interest in EA; I take it Erica has felt this way. While the statement Alex made in his last paragraph is reasonable as stated, we are not in the position of only being able to give one piece of advice.
Do you have any idea how we might go about fixing the situation? It seems to me like math/econ/tech people in their 20s (including me) don’t know what it takes to make other demographics feel welcome. The best thing I can think of is to encourage some people from other demographics to write about and actively discuss EA, and to spread the writings of people who already do.
That’s a really good question, and as another 20-something in tech, I also definitely don’t have all the answers. I have an in-progress draft of a post more generally on outreach, to be posted somewhere (not sure where), but I’ll briefly list some of my thoughts directly related to making a wider variety of people feel welcome.
Expand our models of EA dedication beyond earning to give. This model doesn’t fit most people well, but it’s by far the most prominent idea of what living an EA life looks like.
People want to see people like them in communities they’re part of (I don’t endorse this state of affairs, but I think it’s often true). This may seem discouraging, because it most obviously says “to get more of x type of people, you need to already have x type of people.” I think it’s not totally unactionable though—if cultural minorities make themselves more visible by posting and commenting on the forum, coming to meetups, etc., new people in the same cultural minorities will see them and know they are welcome.
Do your best not to assume that people are in your cluster. The career advice example is a good one. Another example is to explain math or econ jargon when you use it in a post. I think this has an outsize effect. The experience of being in a community but having the content aimed at different sorts of people is a little like going to a social dance and having no one ask you to be their partner—it’s hard to believe that you’re wanted, even when people keep telling you so. And it feels really crummy.
Note that I don’t know anyone who has said that they were interested in EA but felt unwelcome there. I think at least part of it is that EA is something that very few people outside of this cluster have even heard of, much less have taken steps towards getting involved in.
I definitely agree, and as a result I wouldn’t cater my advice to only one sort of person. I think it’s best to take an approach where you change the advice you give based on who you are talking to. Perhaps we should have some sort of portfolio of starting advice to give based on simple diagnostics. I’m sure 80,000 Hours does something like this, so it’s not new ground. I think this is way better than saying “everybody should donate 10% of their income right now if you can afford it or you’re not a real EA.” And yes, some people have said this. I find this to be a huge turn-off personally.
ruthie: “It seems likely to me that the current situation is more the result of network effects than that EA is not interesting to people outside of this cluster.” I’m not sure I agree with this. I know surprisingly few people who are both actively altruistic and who actually think critically and examine evidence in their everyday lives. I wish this were everyone, but realistically it’s not. I do believe there are a ton of people who would be interested in EA who haven’t discovered it yet, but I think that the people who will ultimately be drawn in won’t be totally put off by the fact that a lot of the info is catered to demographics that aren’t exactly like them. Especially since there is such a large range of socioeconomic statuses a person could occupy, and each one might have a totally different EA approach that works best for them (and I’m not even talking about cause selection yet).
What if somebody has no interest in donating, but they are interested in career choice? Or interested in lifestyle change? Or interested in saving, researching, and donating later? Or interested in advocacy? Or interested in personal development? There are a lot of options, and I think telling everyone the blanket advice “just start donating now to GiveWell’s top charities and don’t worry about the meta stuff” will turn off many people in the same way that “focus on yourself until you have more income leverage” might turn people off. I haven’t seen any real evidence either way, just some armchair arguments and half-baked anecdotes, so I don’t understand why everyone is so confident in this.
“I’m sure 80,000 Hours does something like this, so it’s not new ground.”
Are you sure you’re sure? I don’t mean to nitpick, but unless someone from 80,000 Hours has shown or told us, we don’t know for sure. I was going to write something to this effect, but your framing of the idea is even better, so the question should be put to 80,000 Hours directly.
The thing about effective altruism is we don’t need preexisting status to have organizations pay attention to us. They pay attention to our merit, arguments, mettle, and records.
Although that can be self-perpetuating. For example, few would be willing to bite the bullet and say that they should give male-focused advice if 60% of effective altruists were male.
“I personally know many friends who view my identifying as an EA as itself elitist: by trying to help with things like alleviating global poverty through targeted donation, I am (so the argument goes) putting myself on a pedestal above people living in the developing world. To these friends, it’s less elitist to focus on pursuing your own happiness than to think you can solve other people’s problems better than they can.”
[tangent] Have you tried describing GiveDirectly to these friends, and if so how did they react?
I think that if the Standard EA Recommendation for middle- to low-income people is “come back when you make more money”, no middle- to low-income people (to a first approximation) will ever become interested in EA.
I think if I made 30k a year and asked someone what EA-related things I could do and they told me “you don’t make enough to worry about donating, try to optimize your income some more and then we’ll talk,” my reaction would be “Ack! I don’t want to upend my entire life! I just want to help some people! These guys are mean.” And then I would stop paying attention to effective altruism.
My general heuristic for stuff like this is that it’s more important for general recommendations to look reasonable than for them to be optimal (within reason). This is because by the time someone is wondering whether your policy is actually optimal, they care enough to be thinking like an effective altruist already, and are less likely to be scared off by a wrong answer than someone who’s evaluating the surface-reasonableness.
Agreed—and there are plenty of ways for people to contribute to EA besides donating. Writing articles, helping organize EA events, and offering support and encouragement to people who are working on more direct things are just the first three things that come to mind.
Any large group working on something needs both people working directly on things, and people who are in support roles and take care of the day-to-day needs of the organization. The notion that all EAs should be working directly on something (I’m counting earning-to-give as “working directly on something”, here) seems clearly wrong.
I think we can both agree that the way you say things is very important. Saying “come back when you make more money” is very different from saying “if you are interested in helping people as effectively as possible, it may be wise to look out for yourself first before turning your motivations outward.” There are a lot of reasons for people to worry that their lives are too good in comparison to others’, and that they therefore have a moral obligation to help. I think a lot of EAs have felt this way before. When faced with this sentiment, I think it can be a mistake, with regard to actually being effective, to devote significant effort to explicit donation rather than personal development.
I think you are also framing the argument to make “making more money” sound like a bad thing that most people don’t want to do. A lot of people already want to make more money, and they feel a conflict between trying their best to become successful vs. using the resources/leverage they already have to help others. My argument is that focusing on personal goals and development could kill two birds with one stone for a lot of people, and I don’t think it’s as off-putting as you make it sound.
Speaking from my own experience, I have a very high propensity to think about others before myself and I think this can be a flaw and limit productivity in many ways. I think I would ultimately be a more effective altruist if I had spent more of my time pondering “how can I become really good at something / develop valuable skills” rather than “how can I do the most good.”
“A lot of people already want to make more money, and they feel a conflict between trying their best to become successful vs. using the resources/leverage they already have to help others.”
True, but a lot of people are also struggling just to find a job that would be both enjoyable and provide a sufficient wage to pay the bills. Emphasizing making more money could cause them to feel a conflict between finding a job that doesn’t feel soul-crushing vs. feeling guilty about being unable to donate much. (Full disclosure: I feel a bit of this, since the career path I’m currently considering most isn’t one I’d expect to make a lot of money in.)
“True, but a lot of people are also struggling just to find a job that would be both enjoyable and provide a sufficient wage to pay the bills.”
Agreed, so in that context, how does it make more sense to tell somebody that they should care about helping other people as much as they possibly can? I don’t see that train of thought getting through to many people in this situation.
I don’t want to tell anyone that they should care about helping as many people as possible. I want to tell them that they have a fantastic, exciting opportunity to help lots of people and have a big impact on the world, if they want to.
Someone who is struggling to find a meaningful job might also be someone who’s struggling to find some purpose for their life in general. (This has been true for me.) That might make them exceptionally receptive to a cause that does offer such a purpose.
Yes, this seems right. A lot of people who could usefully contribute to effective altruism seem turned off by moralisation. And some effective altruists are demotivated by it. It’s generally pretty easy to make a point about how people can help without using the words ‘should’, ‘ought’, or ‘obligated’. I think it’s better to engage our intuitive and emotional mind with this talk of excitement.
A strong consideration in favour of donating on that sort of modest wage is that it gets you into the habit of doing so, rather than keeping on putting it off until you’re richer. It also makes you better able to influence others to donate.
Also, an admittedly cursory look at Wikipedia suggests that the median adult income in the US is $24k/year, which’d suggest that telling these people not to donate would exclude half of all adults. (I expect that $24k/year is not the most relevant figure to use here, but the point stands that it’s easy to underestimate how wealthy we are even compared to others in the developed world.)
Yeah, this is a problem in gauging the majority opinions of effective altruists on anything. The best real assessment will come with the results of the 2014 effective altruism survey, which are being processed now. Even then, issues like this are too new, specific, and narrow within effective altruism for a reliable record of consensus to exist. The issue is that the people who post or comment regularly in the Facebook group select themselves to be people who have fun discussing conundrums in giant forums. I am like this. Notice how I am commenting on everything, rather than spending my time earning more money and then giving it away.
I would estimate that only 10% of effective altruists regularly discuss it on social media, and I don’t believe the few major perspectives put forward in any one discussion thread can reliably be thought of as representative of all positions effective altruists might take. I believe curbing this issue is, in part, the reason this forum was started.
Context here (fixed)
I’m not sure if this is what Ryan meant to link to or not, but here’s the Facebook thread on donating on $30k/year that Alex refers to in his original comment.