I do think that you have a point, but my (admittedly somewhat limited) engagement with community builders over the past year makes me believe that the goal of producing highly engaged EAs creates suboptimal conditions for this to actually happen. This is mostly because community building, particularly on campuses, seems very focused on producing highly involved EAs who are willing to make career choices and major life decisions based on their engagement, which has the unfortunate by-product of not giving less involved EAs a meaningful mechanism to engage with the community. I don’t think this necessarily means these people won’t carry EA principles with them when doing things like voting or making decisions about charitable giving, but I wouldn’t expect to see most of them here on the Forum or engaging with people at an EAGx.
I think there should be some fairly easy fixes to this in terms of how exactly you do community building, but in the meantime I understand how somebody could look at the EA community and feel they don’t really have a place in it, or that they would have to devote an immense amount of time and resources before having a say. I’ve been involved since 2016 and still feel that way sometimes.
Sorry to hear that you feel that way. What kinds of things do you think are missing?
Sorry, I spaced on this comment and am replying quite late.
In terms of what I think is missing, intellectually, the thing that stands out to me the most is a lack of rights-based perspectives. I think this is a good and necessary countervailing normative framework to utilitarianism. I didn’t think it was necessary when EA was mostly about charity evaluation and charitable donations, mostly to global health interventions, but I do think it’s very necessary now that there is a burgeoning tendency towards policy and even political campaigns, particularly in the context of longtermism. EA has traditionally stayed away from rights advocacy, and for good reason, but I think it’s vital if the movement is going to become more involved in policy and politics. I think it’s also important for avoiding some of the more repugnant logical extensions of longtermism when it comes to tradeoffs between current and future people in intervention design and prioritization.
I’d like to see more discussion of philanthropy and of state vs. non-state provision of need satisfiers, particularly in the developing world. I think this is also increasingly important as EA moves in a policy and politics direction and ideas like charter cities and fields like AI governance crop up. It was fair enough to handwave objections about the unintended effects of “Western” charities when EA was mainly focused on directing charitable donations to where they would be most effective, but at this scale and within this new framework, I think it would be extremely helpful to act more deliberately and think seriously about the meta-level, hopefully with the aid of thinkers and studies from outside EA. I’ve seen the same few books and SSC or ACX posts mentioned whenever these issues come up, and there is an enormous wealth of higher-quality, more rigorous extant research on the topic that could help, as well as academics who have devoted their careers to studying these things.
I also think that there’s probably a lot missing in terms of perspectives from the communities EA is trying to target, and from non-EAs who work in charitable giving, impact assessment, and EA-adjacent fields. There seems to have been an uptick in interest in these perspectives lately, but making these voices more visible in the community will probably require direct solicitation, and I’m not sure how much is actually being done in that regard.
On a more meta level, a significant reason I feel that way has to do with discussion norms, particularly on the Forum. A lot of discussions are very jargon-heavy (I’d argue that much of the jargon is pointless and borders on affectation, and I say this as someone involved in academia, which is ridiculously jargon-ified, but that’s obviously a personal observation), and my impression is that comments can be explicitly or implicitly hostile to posts that aren’t presented with an air of impartiality and a certain style of argumentation, with highly upvoted comments addressing tone and style over substance. I understand that every space has its own idiosyncratic tendencies, and I don’t intend to argue that EA needs to change this on the Forum or in person, but as someone with a lot of pressure on my time (as I assume many others have), writing up some of my controversial-ish stances in a post that conforms to the Forum style is too high a cost for too little reward. I really don’t think that my not having the time or urge to write up my ideas is a huge loss for the EA community, but I imagine there are people with great and innovative ideas who are being lost in this dynamic, where the Forum is sometimes treated like an academic journal rather than an online forum. I imagine this effect is particularly strong for people who don’t have direct EA involvement but work in EA-adjacent fields, whose insight could be seriously beneficial.
Again, sorry for the late, unnecessarily long, and not very well written reply.
Thanks, that was interesting!