Hi Prabhat!
First things first, I’m also relatively new to EA (about eight months in), and I think it’s very valuable to take into consideration the ideas of new community members who still have something of an ‘outsider view’ on things.
By and large, I agree, and I’ve actually started working on strategies to target people who are involved in relevant cause areas or who might be more open to EA’s concept of expanding the moral circle.
There are a few assumptions that can serve as the basis for this strategy:
Communities that have a moral underpinning:
Might be more inclined to be interested in effective altruism in general.
Might be more open to long-term moral arguments, and possibly more easily convinced with them.
Might already have a relatively ‘expanded moral circle’ (e.g., animal welfare activists, climate change activists). This can make expanding their moral circle further easier than with other people.
Attracting people who are already interested in one of EA’s cause areas, using content related to that cause area, can help build credibility with them and make them feel more comfortable. This, in turn, can increase their openness and willingness to read further about other EA causes.
Existing communities enable us to reach a large number of people semi-organically and at low cost.
Having said that, I think we should be careful with popular causes like climate change and animal welfare. The reason is that a substantial share of the people who support these causes do so for reasons that don’t fit well with EA, don’t really have reasoning behind their views, or are even aggressive toward people who think differently.
It’s completely anecdotal, but yesterday, when I mapped relevant Facebook communities, I noticed that some groups explicitly state that they shame meat-eaters, or are conspiracy-based.
Hi Asaf,
I agree with the points you make. Additionally, I think it is easier to find EAs among altruism-related communities (e.g. climate change, factory farming) than among effectiveness/logic-related communities (e.g. philosophers, engineers, scientists). This is because people willing to devote their careers to altruistic causes are rare, while quite a lot of people think and reason logically.
Also, I’d love to know of any surveys or research that looks for correlations between what EAs were doing before and after learning about EA, or that examines opinions on EA among people from different industries or fields of study.