I’m in favour of the movement continuing its focus on producing highly-engaged EAs, but also iterating on the red-teaming competition.
I guess I’m more skeptical of this becoming a problem than you are. We’ve seen almost non-stop criticism on the EA Forum since the FTX collapse, following a huge increase in criticism due to the red-teaming competition, much of which was highly upvoted.
So I don’t know, I hardly feel like we have a shortage of criticism at the moment. If anything, I’m starting to worry we’re getting too distracted.
All that said, perhaps you feel that I’m focusing too much on the short term rather than the long term?
I would say there’s been a tonne of criticism, but not a lot of indication that the main players are updating their behaviour based on any of it.
I don’t quite know how I feel about this perspective. On one hand, everyone has ways to improve and so if you aren’t finding them you probably aren’t looking hard enough. On the other hand, just because X number of people say something, it doesn’t mean that they are correct.
What are the changes that you think should be made that have the strongest case?
I’ve written a bunch of stuff on this recently, so in that sense I’m biased. But my suggestions have generally been:
More transparency from core EA orgs (GiveWell seems to set a good benchmark here, and even more so the charities they recommend)
More division of responsibility among EA orgs (i.e. more orgs doing fewer things each), especially (but not exclusively) having separate orgs for EA PR and special ops
More carrots and/or more sticks to incentivise staff at EA orgs to perform their jobs adequately
Less long-term reliance on heuristic reasoning (e.g. ITN, orthogonality thesis, existential risk, single-author assessments of particular events)
What are the changes that you think should be made that have the strongest case?
The next red-teaming competition should include a forecasting contest: “What is the worst thing to happen to EA in 2023?” First, two winners would be selected for “best entries ex ante”. Then, in January, we would see whether anyone actually predicted the worst hazard that happened.
Give this person a prize.
If it were my choice, I’d likely give it a small prize, but not a large one, as my perspective is that it is only vaguely in the general vicinity of what happened.
Agreed.
While much of the FTX criticism/discussion is justified, and the red-teaming competition seems like a (mostly) valuable and healthy undertaking, what I find so motivating about EA is the combination of rigorous criticism alongside the examples of hugely positive impact that can be (and already have been!) achieved. If we forget to at least occasionally celebrate the latter, EA will become too miserable for anyone to want to get involved with.
Strong agree. I’ve been part of other communities/projects that withered away in this way.
Do you have examples/links?
I don’t think red teaming is a good replacement for the kind of diversity of perspectives and approaches that having more moderately involved EAs would bring. Being a highly involved EA takes a lot of time and mental resources, and I would expect moderately involved EAs to be able to provide this kind of diversity simply because they will have more engagement with non-EA things. They will also be less enmeshed with EA as an identity, so will presumably have a more disinterested approach, which I think will be valuable.
I also think they would be less affected by the perverse incentives that are in place for highly engaged EAs when it comes to criticism and new ideas. Currently, both the way weighted voting on the Forum works and a funding environment where, based on the writeups for EA Funds, having made a good impression on a grantmaker seems to be a non-trivial factor in getting funded, disincentivize criticisms that might not go over well. Having more people who are less concerned with their reputation and standing in the EA community would be a good way to counteract this.
I guess my perspective is that even if you’re focusing heavily on highly engaged EAs, you tend to automatically create a lot of (probably even more) moderately engaged EAs, so I don’t buy the claim that we’ll have a shortage of such people to produce criticism. This is because creating highly engaged members is hard, so most people you reach will only ever become moderately engaged.
I do think that you have a point, but my (admittedly somewhat limited) engagement with community builders over the past year makes me believe that the goal of highly-engaged EAs creates suboptimal conditions for this to actually happen. This is mostly because community building, particularly on campuses, seems very focused on producing highly involved EAs who are willing to make career choices and major life decisions based on their engagement, which has the unfortunate by-product of not giving less involved EAs a meaningful mechanism to engage with the community. I don’t think this necessarily means these people will not have EA principles somewhere in their minds while doing things like voting or making decisions about charitable giving, but I wouldn’t expect to see most of them here on the Forum or engaging with people at an EAGx.
I think that there should be some very easy fixes to this in terms of how exactly you do community building, but in the meantime I understand how somebody could look at the EA community and feel they don’t really have a place in it, or that they would have to devote an immense amount of time and resources before having a say. I’ve been involved since 2016 and still feel that way sometimes.
Sorry to hear that you feel that way. What kinds of things do you think are missing?
Sorry, I spaced on this comment and am replying quite late.
In terms of what I think is missing, intellectually, the thing that stands out to me the most is a lack of rights-based perspectives. I think this is a good and necessary countervailing normative framework for utilitarianism. I didn’t think this was necessary when EA was mostly about charity evaluation and charitable donations, largely to global health interventions, but I do think it’s very necessary now that there is a burgeoning tendency towards policy and even political campaigns, particularly in the context of longtermism. EA has traditionally stayed away from areas around rights advocacy, and for good reason, but I think it’s vital if the movement is going to become more involved in policy and politics. I think it’s also important for avoiding some of the more repugnant logical extensions of longtermism when it comes to tradeoffs between current and future people in intervention design and prioritization.
I’d like to see more discussions around philanthropy and state vs. non-state provision of need satisfiers, particularly in the developing world. I think this is also increasingly important as EA tends more towards a policy and politics direction and ideas like charter cities and fields like AI governance are cropping up. I think it was fair enough to handwave objections to the unintended effects of “Western” charities when EA was mainly focused on directing charitable donations to where they would be most effective, but at this scale and within this new framework, I think acting more deliberately and thinking seriously about the meta-level will be extremely helpful, hopefully with the aid of thinkers and studies from outside EA. I’ve seen the same few books and SSC or ACX posts mentioned whenever these issues come up, and there is an enormous wealth of higher quality, more rigorous extant research on the topic that can be helpful, as well as academics who have devoted their careers to studying these things.
I also think that there’s probably a lot missing in terms of perspectives from communities that EA is trying to target, and from people who work in the fields of charitable giving, impact assessment, and EA-adjacent areas who are not EAs. There seems to be a bit of an uptick in interest in these perspectives lately, but there’s probably a need for direct solicitation to get these voices more visible in the community, and I’m not sure how much is actually being done in that regard.
On a more meta-level, a significant reason I feel that way is to do with discussion norms, particularly on the Forum. A lot of discussions are very jargon-heavy (and I’d argue that a lot of the jargon is pointless and borders on affectation, and I say this as someone who is involved in academia, which is ridiculously jargon-ified, but that’s obviously a personal observation), and my impression is that comments can be explicitly or implicitly hostile to posts that aren’t presented with an air of impartiality and a certain style of argumentation, with highly upvoted comments addressing the tone and style over the substance.

I understand that every space has its own idiosyncratic tendencies, and I don’t intend to argue that EA needs to change this on the Forum or in person, but as someone who has a lot of pressure on my time (as I assume many others do), writing a post in line with some controversial-ish stances that conforms to the Forum style is too high cost and too little reward for me. I really don’t think that me not having the time or urge to write up my ideas is a huge loss for the EA community, but I imagine there are some people with great and innovative ideas who are being lost in this dynamic, where the Forum is sometimes treated like an academic journal rather than an online forum. I imagine this effect is particularly strong for people who don’t have direct EA involvement but work in EA-adjacent fields, whose insight could be seriously beneficial.
Again, sorry for the late and unnecessarily long and not very well written reply.
Thanks, that was interesting!
Yep, mostly agree that there is a good amount of criticism around at the moment, but this will probably dry up in a few months. I like the idea of iterating on the red-teaming contest.