“Huh, this person definitely speaks fluent LessWrong. I wonder if they read Project Lawful? Who wrote this post, anyway? I may have heard of them.
...Okay, yeah, fair enough.”
One thing I definitely believe, and have commented on before[1], is that median EAs (i.e., EAs without an unusual amount of influence) are over-optimising for the image of EA as a whole, which sometimes conflicts with actually trying to do effective altruism. Let the PR people and the intellectual leaders of EA handle that—people outside that should be focusing on saying what we sincerely believe to be true, and worrying much less about whether someone, somewhere, might call us bad people for saying it. That ship has sailed—there are people out there, by now, who already have the conclusion of “And therefore, EAs are bad people” written down—refusing to post an opinion won’t stop them from filling in the middle bits with something else, and this was true even before the FTX debacle.
In short—“We should give the money back because it would help EA’s image” is, imo, a bad take. “We should give the money back because it would be the right thing to do” is, imo, a much better take, which I won’t take a stand on myself at this point since I don’t have a horse in this race.
Maybe I should write a post about this at some point...though I recognise that now, in particular, isn’t exactly the right time to do that.
On a deleted post, so I can’t link the comment here, but people can search my comment history for “Personally, I disagree strenuously” if they wish to verify this.
One thing I definitely believe, and have commented on before[1], is that median EAs (i.e., EAs without an unusual amount of influence) are over-optimising for the image of EA as a whole, which sometimes conflicts with actually trying to do effective altruism. Let the PR people and the intellectual leaders of EA handle that—people outside that should be focusing on saying what we sincerely believe to be true
FWIW, I’m directly updating on this (and on the slew of aggressively bad faith criticism from detractors following this event).
I’ll stop trying to think about how we should optimise for and manage PR, and default to honesty and accurate representation (as opposed to strategic presentation of our positions to make them more appealing/easier to accept).
(This is not to imply that I ever condoned lying, but I have thought that it may be better to, e.g., change which parts of EA messaging we highlight based on what people seem most receptive to, rather than our real cruxes: e.g. justifying existential risk mitigation on the grounds that 8 billion people dying is bad, instead of via inaccessible longtermist arguments.)
I strongly endorse this (and also strongly endorse Eliezer’s OP).
A related comment I made just before reading this post (in response to someone suggesting that we ditch the “EA” brand in order to reduce future criticism):
I strongly disagree—first, because this is dishonest and dishonorable. And second, because I don’t think EA should try to have an immaculate brand.
Indeed, I suspect that part of what went wrong in the FTX case is that EA was optimizing too hard for having an immaculate brand, at the expense of optimizing for honesty, integrity, open discussion of what we actually believe, etc. I don’t think this is the only thing that was going on, but it would help explain why people with concerns about SBF/FTX kept quiet about those concerns: either they were worried about sullying EA’s name, or they were worried about social punishment from others who didn’t want EA’s name sullied.
IMO, trying super hard to never have your brand’s name sullied, at the expense of ordinary moral goals like “be honest”, tends to sully one’s brand far more than if you’d just ignored the brand and prioritized other concerns. Especially insofar as the people you’re trying to appeal to are very smart, informed, careful thinkers; you might be able to trick the Median Voter that EA is cool via a shallow PR campaign and attempts to strategically manipulate the narrative, but you’ll have a far harder time tricking the sorts of extremely smart, educated, morally scrupulous people who EA is actually mostly trying to recruit and persuade.
And this is especially especially true insofar as you’re trying to get a huge diffuse community to all play along with your fake narrative! It’s not as though you can send out a memo to all EAs to dissemble in a specific way that furthers your narrative; the memo will just leak and cause more harm than good. EA needs to give up on this “have an immaculate brand” mirage and shift its mindset toward goals more like “exemplify real virtues”, “focus on the object level”, and “specialize in the niche of being The Group That Speaks Truth Even When It’s Uncomfortable”. You won’t win over everyone that way, but you’ll win more of the battles that matter, and that are actually winnable.
I’ve expressed my own concerns in the past about the term “EA”, more in line with “this brand creates too much of a temptation to try to Look Immaculate rather than Be Scrupulously Honest”, and in line with Eliezer’s concern that people are twisting themselves up in knots out of a perceived pressure to be Good, rather than finding a more healthy balancing point.
But I strongly, strongly disendorse this reason to change brand — not just because it’s an obvious overreaction, but because it’s moving EA in the wrong direction.
If we ever rebranded, we should if anything plausibly try for a less immaculate-looking brand — something messier, weirder, and less possessed of a Virtuous Glow. Something that encourages us more to keep our focus on The Work and on personal integrity (in multiple respects), rather than tempting and distracting participants with the possibility of Looking Amazingly Good To The Entire World.
I do think it’s good for us to take pride in what we’re doing, because I think we’re legitimately doing things that warrant pride. I think most of the stuff I’ve done since joining the rationalists and EAs has been goddam awesome, and I find it fun as heck to get to collaborate and chat with other people working on cool ambitious projects to make the world more awesome.
But I think there’s something that breaks when that personal pride and happy camaraderie sublimates into “wow I’m participating in this legibly immaculately good Abstraction called Effective Altruism” sorts of feelings.