I’m somewhat confused that you list the formation of many groups as a benefit of broad mindset spread, but then say that we should try to achieve the formation of one very large group (that of “low-level EA”). If our goal is many groups, maybe it would be better to just create many groups?
I must have expressed myself badly somehow—I specifically meant that “low-level EA” would be composed of multiple groups. What gave you the opposite impression?
For example, in the current situation, organizations like the Centre for Effective Altruism and the Open Philanthropy Project are high-level organizations: they are devoted to finding the best ways of doing good in general. At the same time, organizations like the Centre for the Study of Existential Risk, Animal Charity Evaluators, and the Center for Applied Rationality are low-level organizations, each devoted to a specific cause area (x-risk, animal welfare, and rationality, respectively). We already have several high- and low-level EA groups, and spreading the ideas would ideally cause even more of both to be formed.
If our goal is to spread particular memes, why not take the naive approach of trying to achieve positions of influence in order to spread those memes directly?
This seems completely compatible with what I said? Speaking for myself, I'm definitely interested in trying to achieve a position of greater influence in order to spread these ideas more effectively.