I was involved in the initial Facebook thread on the topic. At the time, I made less than $30k, never expected to make much more than $30k (I’m a nanny), and was highly turned off by the conversation.
Two cross-country moves later, I have actually doubled my income, but I am still highly turned off by elitist EA conversations that assume all readers are high-potential earners in their 20s with strong social safety nets.
It would have been much easier to convince me to donate 10% of a $30k income than to upend my life to make some kind of career change.
“I am still highly turned off by elitist EA conversations that assume all readers are high-potential earners in their 20s with strong social safety nets.”
Sure, but I think your use of the term “elitist” is a bit unfair here. I personally know many friends who view my own identification as an EA as itself elitist, because by trying to help with things like alleviating global poverty through targeted donation, I am putting myself on a pedestal above people living in the developing world (or so the argument goes). To these friends, it’s less elitist to focus on pursuing your own happiness than to assume you can solve other people’s problems better than they can. Maybe this is why I am arguing for this angle of attack; my friends have different off-putting triggers than you do.
I agree that we shouldn’t make broad generalizations about EA demographics, but at the same time we shouldn’t misrepresent them; I would wager that a large number, if not the majority, of prospective EAs fall into the high-potential-earners-in-their-20s demographic, and this is very relevant to the discussion of how to advise people who are just getting into EA. I definitely agree that the same advice wouldn’t work equally well for every person, and sometimes it’s correct to give two different people completely opposite advice. That being said, if I had to give one piece of advice in a generalizing way, I would want to consider the demographics of the people I’m giving it to rather than assume it is directed at, for instance, the median US citizen.
I think this is what Ryan is saying, but I want to say it again and say more, because I feel strongly about it and because Ryan left a lot of inferential distance in his post.
I dislike the idea that EA is mostly attractive or mostly applicable to its current dominant demographic of math/econ/tech-interested people in their 20s. I think the core ideas of EA are compelling to a wide variety of people, and that EA can benefit from skills outside its current mainstream. It seems likely to me that the current situation is more the result of network effects than that EA is not interesting to people outside of this cluster.
Catering our “general” advice to only one sort of person makes it more likely that other types of people will feel lost or unwelcome and not pursue their interest in EA; I take it Erica has felt this way. While the statement Alex made in his last paragraph is reasonable as stated, we are not limited to giving only one piece of advice.
Do you have any idea how we might go about fixing the situation? It seems to me like math/econ/tech people in their 20s (including me) don’t know what it takes to make other demographics feel welcome. The best thing I can think of is to encourage some people from other demographics to write about and actively discuss EA, and to spread the writings of people who already do.
That’s a really good question, and as another 20-something in tech, I definitely don’t have all the answers either. I have an in-progress draft of a post on outreach more generally, to be posted somewhere (not sure where), but I’ll briefly list some of my thoughts directly related to making a wider variety of people feel welcome.
Expand our models of EA dedication beyond earning to give. This model doesn’t fit most people well, but it’s by far the most prominent idea of what living an EA life looks like.
People want to see people like them in communities they’re part of (I don’t endorse this state of affairs, but I think it’s often true). This may seem discouraging, because it most obviously says “to get more of x type of people, you need to already have x type of people.” I think it’s not totally unactionable though—if cultural minorities make themselves more visible by posting and commenting on the forum, coming to meetups, etc., new people in the same cultural minorities will see them and know they are welcome.
Do your best not to assume that people are in your cluster. The career advice example is good. Another example is to explain math or econ jargon when you use it in a post. I think this has an outsized effect. The experience of being in a community but having the content aimed at different sorts of people is a little like going to a social dance and having no one ask you to be their partner—it’s hard to believe that you’re wanted, even when people keep telling you so. And it feels really crummy.
Note that I don’t know anyone who has said that they were interested in EA but felt unwelcome there. I think at least part of it is that EA is something very few people outside of this cluster have even heard of, much less taken steps toward getting involved in.
I definitely agree, and as a result I wouldn’t cater my advice to only one sort of person. I think it’s best to change the advice you give based on who you are talking to. Perhaps we should have some sort of portfolio of starting advice to give based on simple diagnostics. I’m sure 80,000 Hours does something like this, so it’s not new ground. I think this is way better than saying “everybody should donate 10% of their income right now if you can afford it, or you’re not a real EA.” And yes, some people have said this. I find it to be a huge turn-off personally.
ruthie: “It seems likely to me that the current situation is more the result of network effects than that EA is not interesting to people outside of this cluster.” I’m not sure I agree with this. I know surprisingly few people who are both actively altruistic and who actually think critically and examine evidence in their everyday lives. I wish this were everyone, but realistically it’s not. I do believe there are a ton of people who would be interested in EA but haven’t discovered it yet, but I think the people who will ultimately be drawn in won’t be totally put off by the fact that a lot of the info is tailored to demographics that aren’t exactly like them. Especially since there is such a wide range of socioeconomic situations a person could be in, and each one might call for a totally different EA approach (and I’m not even talking about cause selection yet).
What if somebody has no interest in donating, but they are interested in career choice? Or in lifestyle change? Or in saving, researching, and donating later? Or in advocacy? Or in personal development? There are a lot of options, and I think giving everyone the blanket advice “just start donating now to GiveWell’s top charities and don’t worry about the meta stuff” will turn off many people in the same way that “focus on yourself until you have more income leverage” might. I haven’t seen any real evidence either way, just some armchair arguments and half-baked anecdotes, so I don’t understand why everyone is so confident in this.
“I’m sure 80,000 Hours does something like this, so it’s not new ground.”
Are you sure you’re sure? I don’t mean to nitpick, but unless someone from 80,000 Hours has shown us or told us (and they’re reading this), we don’t know for sure. I was going to write something to this effect, but your framing of the idea is even better, so the question should be put to 80,000 Hours directly.
The thing about effective altruism is that we don’t need preexisting status to get organizations to pay attention to us. They pay attention to our merit, arguments, mettle, and track records.
Although that can be self-perpetuating. For example, few would be willing to bite the bullet and say that they should give male-focused advice if 60% of effective altruists were male.
I personally know many friends who view my own identification as an EA as itself elitist, because by trying to help with things like alleviating global poverty through targeted donation, I am putting myself on a pedestal above people living in the developing world (or so the argument goes). To these friends, it’s less elitist to focus on pursuing your own happiness than to assume you can solve other people’s problems better than they can.
[tangent] Have you tried describing GiveDirectly to these friends, and if so, how did they react?