+1 to Jay’s point. I would probably just give up on working with EAs if this sort of reasoning were dominant to that degree? I don’t think EA can have much positive effect on the world if we’re obsessed with reputation-optimizing to that degree; it’s the sort of thing that can sound reasonable to worry about on paper, but that in practice tends to cause more harm than good when fixated on in a big way.
(More reputational harm than reputational benefit, of the sort that matters most for EA’s ability to do the most good; and also more substantive harm than substantive benefit.
Being optics-obsessed is not a good look! I think this is the largest reputational problem EA actually faces right now: we promote too much of a culture of fearing and obsessing over optics and reputational risks.)
I think a movement is shaped to a rather large degree by its optics and culture, because those are what determine who joins and, to a lesser extent, who stays when things go wrong.
It seems plausible to me that a culture of somewhat spartan frugality, which (from my relatively uninformed perspective) seems like it was a larger part of the movement in the past, would do more good for EA conferences than a stimulating venue would. There’s something poetic about working harder in more austere conditions than others would accept, forgoing luxury for extra donations, that I would imagine is at least as animating to the kind of people EA attracts as scenery is.
Beyond that, preserving core cultural aspects of a movement, even if the cost is substantial, is crucial to the story that the movement aims to tell.
Most people who are EAs today were inspired by the story of scrappy people gathering in whatever way is cheapest and most accessible, cheeks flushed with intellectual passion, figuring out how to stretch their dollars for the greater good. The aesthetic of AI researchers in a castle differs substantially from this: it loses the “slumming it out for the world” vibe, and it centers on reducing an existential risk that only a few people can understand, rather than on global development, which everyone can understand.
I’m sure the AI researchers are extremely competent and flushed with intellectual passion for the greatest good too, regardless of where they’re working. Maybe even more so in the castles. I am solely critiquing the optics and their potential cultural effect.
I have little formal evidence for this beyond the interest in, and occasional resistance to, the shift towards longtermism that seems widespread on the forum, and a few external articles on EA. But I strongly suspect that “person with a career relating to longtermism” is an appealing archetype of the ideal EA to far fewer people than “person who argues about the best place to donate, and donates as much as they can”, because the latter is much more relatable and attainable.
Perhaps an EA mostly focused on attracting select candidates for high-impact careers will be more impactful than an EA attempting to make a wide, diffuse cultural impact by including many grassroots supporters. However, this runs the risk of narrowing EA’s target audience from “everyone, because nearly everyone can afford at least 1% with a giving pledge” to 0.1% of the population of developed countries.
To me, it is at least plausible that the sheer cost of losing the grassroots-y story, paid in fewer and perhaps less ideologically committed new recruits and in a generally less positive public view of effective altruism and rationality, could swing the net effect the other way. Influencing the mainstream over time to be more concerned with sentient beings, more committed to rationality, and more inclined to calculate expected values on all sorts of purchases and donations is a major potential positive impact that a more outward-facing EA could have.
If EA loses hold of the narrative and becomes, in the eyes of the public, “sketchy, naive Masonic elites who only care about their own pet projects, future beings, and animals”, I believe the cost to both EA and broader society will be high. Anecdotally, I have seen external articles critiquing EA from these angles, but never from the angle “EA worries too much about its own image”.