At the point you are having to debate the definition of a castle you’ve lost the optics argument even if you’re technically correct.
There’s no debate over the definition of a castle: Wytham Abbey is not a castle (it is not a form of military fortification). Roughly, Wytham Abbey is a castle in the way that an underground eco-house is a nuclear bunker. Which is to say: not at all (it’s not some mere technicality that makes it not a castle; it is radically not a castle). There is no debate about definition to be had here.
So I’m not having a debate about definition; I’m noting a misrepresentation. I agree that the optics issue is already lost. I also think that we should not be misrepresenting things on the forum, and I think this misrepresentation is not totally irrelevant.
To give a comparison: I think calling Wytham Abbey a castle is roughly as big a misrepresentation as claiming that Wytham Abbey cost £80 million rather than £15 million. A castle is a much more expensive, much rarer structure than a mansion (which is basically what Wytham Abbey would accurately be described as: a mansion).
As noted, I agree that the optics battle is lost, but I find it a little odd that people seem to think it’s totally irrelevant that a comment misrepresented things in a way that radically overstates the case (a castle owned by Elizabeth I is more than an order of magnitude more ostentatious than a manor house visited by Elizabeth I). This sort of misrepresentation is not good epistemics, just as it would be bad if a forum comment misstated the price as £80 million, and just as it would be reasonable to correct that misstatement.
My point is the following: let’s just represent things correctly and then have the perfectly reasonable discussion from that starting point. If £15 million is too much to spend, let’s say that (rather than discussing whether £80 million is too much to spend). If a manor house is the wrong thing to buy, let’s say that (rather than discussing whether it would be wrong to buy Elizabeth I’s castle).
It’s entirely reasonable to say, as a normative claim, that people should be accurate in reporting.
But when you are thinking about the reputational impact of a choice, you should be examining not just what the reaction would be to strictly accurate reporting, but how people operating in bad faith could easily misrepresent it, or how people operating in good faith could misinterpret it. Whether they should or not is irrelevant to the predictable consequences.
I want to downvote this comment more strongly than any other comment I’ve downvoted on the EA Forum.
On the EA Forum, we should care about what’s actually true. “Haha, you lose for having to clarify your point!” may be the rule of the game in soundbite politics, but it can’t become the rule of the game in EA’s internal decision-making or in conversations on the EA Forum, or we have little hope of actually doing the most good (as opposed to “doing the thing that’s safest and plays the best, in adversarial soundbite form, to a random consumer of mass media with no context or interest in the topic”).
Truth matters, and the hearts and minds we want to win should heavily skew toward those who care about truth, and not just on what things look like to hypothetical third parties.
See also:
My other comment: https://forum.effectivealtruism.org/posts/xof7iFB3uh8Kc53bG/why-did-cea-buy-wytham-abbey?commentId=irngCnzuDm4JR7ufc
It’s Not What It Looks Like
Common Knowledge and Miasma
Politics is way too meta
“PR” is corrosive; “reputation” is not.
I agree much more with Rob’s principles than with my guess at projectionconfusion’s principles. But looking just at PC’s literal statements: yeah, it is stupid that anyone thought manor house vs. castle was a relevant argument. The question is whether you think it’s a good idea to spend lots of money on a large building at all (I think it can be, but it obviously depends on specifics), and, if so, whether it matters that it’s a nice building whose previous owners were rich. I think it’s obvious the latter doesn’t matter, but for people for whom it does matter, I don’t think it matters exactly what kind of large old rich-person building it is.
So I do view people arguing manor house vs castle as conceding that a castle would be bad optics, and that this is dumb because none of the differences between manor houses and castles are relevant to the question. I just don’t care about the optics of buying large old rich people buildings.
What’s the second strongest time you wanted to downvote a comment on the EA Forum?
I agree that people shouldn’t think that way, but observably they do. And acknowledging human irrationality and working around it was the founding insight of rationalism and EA. I honestly can’t really respond to most of your first two paragraphs, since they seem to be based on the idea that we shouldn’t even be considering the question.
I’m not saying truth doesn’t matter (if it came across that way, I apologise), but that reputational effects are real and also matter. That is very different from the strawman position of “we shouldn’t do anything at all odd or unpopular”.
Truth matters, and the hearts and minds we want to win should heavily skew toward those who care about truth, and not just on what things look like to hypothetical third parties.
I disagree with this fundamentally. It’s short-sighted to narrow down the people we want to persuade to only a certain set of people. The donations and other contributions of everyone are equally valuable. And the general perception of EA affects people’s likelihood to learn more to begin with.
These are not hypothetical people, either. This and FTX are the main stories people are discussing online in relation to EA, and therefore what comes up when people first search for information about the movement. And if someone’s first impression is negative, they are less likely to find out more and more likely to dismiss the movement.
To narrow down our disagreement a bit: is your position (a) this won’t have reputational effects on EA, (b) there will be reputational effects but they won’t decrease recruitment and donations, or (c) even if it does decrease recruitment and donations we shouldn’t care about that?
Cool, thanks for clarifying your view! In turn, here’s a version of your comment that I wouldn’t have objected to at all (filling in details where I left things in square brackets):
‘You’re right that the building is a manor house rather than a castle, and I’m generally in favor of EAs being accurate and precise in our language. That said, I think [demographic D] will mostly not be convinced of EA’s virtue by this argument, because people still think of manor houses as quite fancy. And I think it’s important for EA to convince [demographic D] of our virtue, because [two-sentence summary of why I think we should prioritize appealing to demographic D].’
The main things I found objectionable about “At the point you are having to debate the definition of a castle you’ve lost the optics argument even if you’re technically correct.” were:
The response is snarky, in a way that slightly nudges EA toward a norm of “it’s cringey and low-status to get into nit-picky arguments about what’s true, when what really matters is public perception”. I want to pump hard against moves in that direction, even small ones. I’m already wary of how much EA focuses on public perception; propagating the meme that we should focus on public perception and it’s laughable to care about what’s true (on topics with PR implications) seems outright toxic to me, even if that wasn’t your intent at all.
The response says nothing about whether you agree or disagree with the nitpick about castle terminology, further reinforcing the idea that accuracy is silly and unimportant for EAs to internally think about whenever a topic is PR-adjacent.
The response gives no argument for who you think EA should be trying to court here, or why we should be courting them. I think this is a pretty important step, because it’s quite important (and not trivial) for EAs to carefully mentally distinguish “let’s do PR action X because of specific real-world goal Y” from generalized “we feel socially anxious that we aren’t being socially embraced by enough other monkeys, and will reflexively try to appease them”.
The latter approach doesn’t work in a hostile environment of journalists or Twitter trolls who will strategically drum up outrage in order to push your psychological buttons and compel concessions from you. If you’re going to “play the game” and try to outmaneuver them, it’s very important that you do so in a clear-sighted way. Which requires being unusually explicit about why you think X is a good idea, as opposed to just smirkily shooting down other EAs’ points with an “lol how cringe of you to respond to falsehoods with corrections”.
If it actually matters to avoid some “cringe” behavior, then we should do that in a self-aware way that involves explicitly understanding what we’re doing and why, rather than just parroting whatever the current social gradient is. This helps ensure, among other things, that EA deliberately keeps its cringiness in contexts where it’s actually better to select the cringe option.
More broadly, I objected to what struck me as an attempt to import into the EA Forum the norms of “play the politics game rather than trying to figure out what’s true”, as opposed to merely describing those norms here and explicitly proposing some policy response.
If you’re going to take on the epistemic risk of playing the Game, you need to be sure that you’re explicitly simulating the Game’s moving parts in your world-model, as opposed to steering toward options based primarily on inchoate feelings about what’s popular or unpopular. Otherwise, you’re liable to over-weight near-term status risks, because human brains hyperbolically discount, and were mostly built by evolution to handle coalitional politics in ~200-person local communities where rejection meant literal death.
To narrow down our disagreement a bit: is your position (a) this won’t have reputational effects on EA, (b) there will be reputational effects but they won’t decrease recruitment and donations, or (c) even if it does decrease recruitment and donations we shouldn’t care about that?
None of the above! It’s: this will plausibly decrease recruitment and donations a nonzero amount, and that’s a real cost, but news-cycle-obsessed, status-anxious people will tend to fixate on this more than makes sense, in ways that:
exaggerate the long-term importance of specific news-cycle ups and downs, resulting in panicky and untethered-from-reality reactions;
distract from more useful stuff we’d otherwise be doing;
provide bad incentives for nervous EAs to dissemble-in-the-name-of-EA or rush-to-concede-too-much;
signal weakness and blood-in-the-water in the political game;
and incentivize adversaries to try to push your buttons more.
Something can be a real cost/problem, and yet the natural reaction to that cost/problem can make things worse rather than better.
I think a few EAs should have a day job of thinking about hit-piece writers (and related topics), and other EAs should mostly ignore the topic, except insofar as they see locally false things and (candidly, unstrategically) chime in to say the true thing in response. (In particular, we should chime in with the true thing whether doing so makes EA look better or worse.)
It’s short-sighted to narrow down the people we want to persuade to only a certain set of people. The donations and other contributions of everyone are equally valuable. And the general perception of EA affects people’s likelihood to learn more to begin with.
I agree that we shouldn’t completely write off anyone. But we should very much prioritize reaching some groups over others, and it’s rarely the case that there’s any action available to EA that will please everyone equally. E.g., research of a given quality level is equally useful regardless of who it comes from, but not all human beings are equally likely to do useful research, and pretending that they are doesn’t help anyone.
We shouldn’t put equal effort into outreach to theologians and to biosecurity specialists. Generalizing this principle, we shouldn’t put equal effort into outreach to “people who care a lot about truth” and “people who don’t care a lot about truth”. (Though yes, we should put nonzero effort into reaching the latter group, insofar as we can do so without compromising our core principles or neglecting more-tractable and more-important opportunities.)