Perhaps a bit of a ramble after reading your post and others, but I opted to click “comment” rather than delete it.
I pretty reflexively identify with the EA community and would not hesitate to associate myself with it. Part of this is that I see my identity as an EA as resting on its core principles: a commitment to doing good and to using reason to determine how best to do so. Although I believe that much of the cause prioritization and many of the downstream beliefs common in the EA community are wise and well-taken, I don’t consider them essential to being an EA. Even if I thought the community was mistaken on many of its determinations, I would still consider myself an EA.
Of course, we cannot choose the inferences that others make when we say that we are EA, and thus one might risk people making assumptions or just getting unwelcome vibes.
I wonder what could be done to spread the message that EA is about a fundamental commitment to doing good as best one can, rather than a commitment to specific cause areas or downstream beliefs. One obstacle is likely the relative unipolarity of EA funding. Even when members of the EA community disagree with funders’ downstream beliefs and cause prioritizations, that disagreement is unlikely to be visible within EA, because dissenting work is unlikely to get funded and EA tends to be viewed as “what EA does” rather than “what people who identify or affiliate as EA think”. Furthermore, people who identify as EAs will likely be tempted to adopt the beliefs and priorities that EA funds.
I think there is just going to be a tension between the core principles of EA and the demonstrated actions of the EA community, insofar as no individual would be perfectly represented by its collective actions. And action is pretty heavily tethered to funding, which most of the community has little ability to influence, so others may see an EA that reflects priorities and beliefs an individual EA would not endorse. So there is likely a gap between what you might mean when you say “I am EA” and what someone else might hear, and I understand why it might make more sense to be more concerned about the latter.
One of the things that comes to mind is the variety of beliefs that can fall under a religious identity. People might have significant political disagreements, for instance, but come together as Christians under the belief that Jesus is the son of God who died for our sins. Getting EA to be associated more with its essential principles than with downstream beliefs and conclusions might be critical to its expansion and to making people more comfortable identifying or affiliating with it, but that would probably be a difficult project.
Thank you for sharing this!
I think that’s nice.
I agree with this. I think this is one of the major issues, and I’ve mentioned it in the past.
Yes, I’d guess one could say it’s the other side of the token problem, and why we might need to show a greater diversity of people “affiliating”.