“someone who is motivated in part by an impartial care for others, who is thinking very carefully about how they can best help others, and who is taking significant actions to help (most likely through their careers). In practice, this might look like selecting a job or degree program, donating a substantial portion of their income, working on EA-related projects, etc.”
Why would you not want >1% of the population to fit this description? I think even prominent EA haters would be in favor, if you left the name “EA” out.
People often argue for ‘Narrow EA’. Here is an example where I suggested this strategy might not be wise and people disagreed.
Although of course, there’s an ‘at the current margin’ thing going on here. I.e., maybe the ideal size is huge, but since we’ve got limited time and resources we should not aim for that and instead focus on keeping it small and high quality.
Perhaps a more informative question would be something like, “For the next 5 years, should the Dutch EA community aim for broad growth or narrow specialisation?” (in other words, something similar to this Q from the MCF survey).
Yeah, I think you ended up asking “would it be good for a lot of people to share our values”, instead of “should we try to actively recruit tons of people to our specific community”.
Gave it a second go.
I asked, “As we plan our future initiatives, it’s useful to understand where our community believes we should focus our efforts. Please share your opinion on which of the following we should prioritise.
Growing the Community: Focus on increasing our membership and raising broader awareness of EA.
Developing Community Depth: Concentrate on deepening understanding and engagement.
Taking a Balanced Approach: Allocate our efforts equally between growing and deepening.
Other (Please specify): If you have a different perspective, we’d love to hear it.
I don’t know”
27 people voted: 16 for ‘taking a balanced approach’, 6 for ‘growing the community’, 1 for ‘developing community depth’, and 4 for ‘I don’t know’.
‘Narrow EA’ and having >1% of the population fitting the above description aren’t opposite strategies.
Maybe it’s similar to someone interested in animal welfare thinking alt protein coordination should focus on scientists, entrepreneurs, funders and policy makers but also thinking it would be good for there to be lots of people interested in veganism.
Aren’t they? Like, if I’m aiming for >1% of the population, I ought to spend a lot of my resources on marketing and building a network of organisers. If I’m aiming for something smaller, I ought to spend my time investing in the community I’ve already got, and maybe on some field building.
To make it more concrete, in Q1 of 2024 I could spend 15% of my time investing in our marketing so that we double the number of intro programme sign-ups; alternatively, I could put that time into developing a Dutch Existential Risk Initiative. One is big EA, one is narrow EA.
I think it depends on how you define ‘narrow EA’. If you focus on getting 1% of the population to give effectively, that’s different to helping 100 people make impactful career switches, but both could be defined as narrow in different ways: one is narrow because it focuses on a small number of people, the other because it spreads only a subset of EA ideas.
Taking the Dutch Existential Risk Initiative example, it would be narrow in terms of cause focus, but the strategy could still vary between focusing on top academics and running a mass media campaign.
I’m pretty sure Narrow EA is usually used to refer to the strategy of influencing a small number of particularly influential people. That’s part of what I’m pushing back against (although we’ve deviated from the original discussion point, which was on organising vs mobilising). [got confused about which quicktake we were discussing]
I think all of the ERIs are narrow (they target talented researchers). A broader project would be the Existential Risk Observatory, which aims to inform the public through mass media outreach. They’ve done a lot of good work in the Netherlands and abroad, but I don’t think they’ve been able to get funding from the biggest EA funds. I don’t know why, but I suspect it’s because their main focus is the general public, and not the decision-makers.