I have the impression we have not learned from the communicative mistakes of 2022 in that we are again pushing arguments of limited practical import that alienate people and limit our coalitional options.
I disagree that we should avoid discussing topics so as to avoid putting people off this community.[1]
I think some of EA’s greatest contributions come from being willing to voice, discuss, and seriously tackle questions that seemed weird or out of touch at the time (e.g. AI safety). If we couldn’t do that, and instead remained within the Overton window, I think we would lose a lot of the value of taking EA principles seriously.
If someone finds the discussion of extinction or incredibly good/bad futures off-putting, this community likely isn’t for them. That happens a lot!
Perhaps for some distasteful-to-almost-everyone topics, but this topic doesn’t seem like that at all.
This is not what I am saying; my point is about attentional highlighting.
I am all for discussing everything on the Forum, but I do think when we set attentional priorities—as these weeks do—we could reflect on whether we are targeting things that are high-value to discuss; how they land with, and how they affect, the broader world could be a consideration here.
I think messaging to the broader world that we focus our attention on a question that will only have effects for the small set of funders who are hardcore EA-aligned makes us small.
By crude analogy it’s like having a whole Forum week on welfare weights at the opportunity cost of a week focused on how to improve animal funding generally.
We could have discussion weeks right now on key EA priorities in the news, from the future of effective development aid, to great power war and nuclear risk, to how to manage AI risk under new political realities, all of which would seem to affect a much larger pool of resources and, crucially, would signal to the wider world that we are a community engaging with some of the most salient issues of the day.
I think setting a debate week on a topic that has essentially no chance of affecting non-EA funders is a lost opportunity, and I don’t think it would come out on top in a prioritization of debate-week topics in the spirit of “how can we do the most good?”
On a more personal level, but I think this is useful to report here, because I don’t think I am the only one with this reaction: I’ve been part of this community for a decade and have built my professional life around it—and I do find it quite alienating that, at a time when we are close to a constitutional crisis in the US, when USAID is in shambles, and when the post-WW2 order is in question, we are not highlighting how to take better action in those circumstances but instead discussing a cause prioritization question that seems very unlikely to affect major funding. It feeds the critique of EA that I’ve previously seen as bad faith—that we are too much armchair philosophers.
It seems like you’re making a few slightly different points:
1. There are much more pressing things to discuss than this question.
2. This question will alienate people and harm the EA brand because it’s too philosophical/weird.
3. The fact that the EA Forum team chose this question given the circumstances will alienate people (kind of a mix between 1 and 2).
I’m sympathetic to 1, but disagree with 2 and 3 for the reasons I outlined in my first comment.
I think that’s fine—we just have different views on what a desirable potential size of the movement would be.
To clarify—my point is not so much that this discussion is outside the Overton window, but that it is deeply inward-looking / insular. It was good to be early on AI risk and shrimp welfare and all of the other things we have been early on as a community, but I do think those issues had higher tractability in mobilizing larger movements / having an impact outside our community than this debate week does.
I do think it’s a good chance to show that the EA brand is not about short-term interventions but about principles-first thinking, being open to weird topics, and inviting people to think outside of the media bubble. At the same time, I would like to see more stories out there (very generally speaking) about people who have used the EA principles to address current issues (at EA Germany, we have been doing this every month for 2 years and were happy to have you as one of the people in our portraits). It’s great that Founders Pledge and TLYCS are acting on the crisis, and Effektiv Spenden is raising funds for that. But I’m glad they are doing this with their own brands, leaving EA to focus on the narrow target group of impartially altruistic and truth-seeking people who might, in the future, build the next generation of organizations addressing these or other problems.
In my view, the mistakes of 2022 involved not running organizations professionally and not doing outreach strategically. Compared to the broad communication under the EA brand back then, I’m much more positive about how GWWC, 80k, or The School for Moral Ambition are spreading ideas that originated from EA. I hope we can get better at defining our niche target group for the EA brand and working to appeal to them instead of to the broad public.
Thanks for laying out your view in such detail, Patrick!
I find it hard to grasp how the EA Forum can be so narrow—given there are no Fora / equivalents for the other brands you mention.
E.g. I still expect the EA Forum is widely perceived as the main place where community discussion happens, beyond the narrow mandate you outline, so the attentional priorities set here will be seen as a broader reflection of the movement than I think you intend.
I think the main issue is that I was interpreting your point about the public forum’s perception as a fear that people outside could see EA as weird (in a broad sense). I would be fine with this.
But at the same time, I hope that people already interested in EA don’t get the impression from the forum that the topics are limited. On the contrary, I would love to have many discussions here, not restricted by fear of outside perception.