I can see this point, but I’m curious—how would you feel about the reverse? Let’s say that CEA chose not to buy it, and instead did conferences the normal way. A few months later, you’re talking to someone from CEA, and they say something like:
Yeah, we were thinking of buying a nice place for these retreats, which would have been cheaper in the long run, but we realised that would probably make us look bad. So we decided to eat the extra cost and use conference halls instead, in order to help EA’s reputation.
Would you be at all concerned by this statement, or would that be a totally reasonable tradeoff to make?
+1 to Jay’s point. I would probably just give up on working with EAs if this sort of reasoning were dominant to that degree? I don’t think EA can have much positive effect on the world if we’re obsessed with reputation-optimizing to that degree; it’s the sort of thing that can sound reasonable to worry about on paper, but in practice tends to cause more harm than good to fixate on in a big way.
(More reputational harm than reputational benefit, of the sort that matters most for EA’s ability to do the most good; and also more substantive harm than substantive benefit.
Being optics-obsessed is not a good look! I think this is currently the largest reputational problem EA actually faces: we promote too much of a culture of fearing and obsessing over optics and reputational risks.)
I think a movement is shaped to a rather large degree by its optics/culture, because that is what will determine who joins and to a lesser extent, who stays when things go wrong.
It seems plausible to me that a culture of somewhat spartan frugality, which seems (from my relatively uninformed perspective) like it was a larger part of the movement in the past, would have a larger positive impact on EA conferences than how stimulating the venue is. There’s something poetic about working harder in more onerous conditions than others would accept, forgoing luxury for extra donations, that I would imagine is at least as animating to the types of people in EA as scenery.
Beyond that, preserving core cultural aspects of a movement, even if the cost is substantial, is crucial to the story that the movement aims to tell.
Most people who are EAs today were inspired by the story of scrappy people gathering in whatever way is cheapest and most accessible, cheeks flushed with intellectual passion, figuring out how to stretch their dollars for the greater good. I think this aesthetic differs substantially from that of AI researchers in a castle, in terms of both losing the “slumming it out for the world” vibe and focusing on the reduction of an existential risk in a way that only a few people can understand rather than global development in a way that everyone can understand.
I’m sure the AI researchers are extremely competent and flushed with intellectual passion for the greatest good too, regardless of where they’re working. Maybe even more so in the castles. I am solely critiquing the optics and their potential cultural effect.
I have little formal evidence for this except the interest in, and occasional resistance to, the shift towards longtermism that seems widespread on the forum, plus a few external articles on EA. But I strongly suspect that “person with a career relating to longtermism” is an attractive archetypal representation of EA to far fewer people than “person who argues about the best place to donate, and donates as much as they can”, because the latter is much more relatable and attainable.
Perhaps an EA mostly focused on attracting select candidates for high-impact careers will be more impactful than an EA attempting to make a wide, diffuse cultural impact by including many grassroots supporters. However, this seems to run the risk of narrowing the target audience of EA from “everyone, because nearly everyone can afford at least 1% with a giving pledge” to 0.1% of the population of developed countries.
To me, it is at least plausible that the sheer cost of losing the grassroots-y story, paid in fewer, perhaps less-ideologically-committed new recruits, and a generally less positive public view of things related to effective altruism and rationality, could swing the net effect in the other direction. I think the mainstream being influenced over time to be more concerned with sentient beings, with rationality, and with calculating expected values on all sorts of purchases/donations is a major potential positive impact that a more outward-facing EA could make.
If EA loses hold of the narrative and becomes, in the eye of the public, “sketchy, naive Masonic elites who only care about their own pet projects, future beings and animals”, I believe the cost to both EA and broader society will be high. Anecdotally, I have seen external articles critiquing EA from these angles, but never from the angle “EA worries too much about its own image”.
I refuse to believe that renting out a conference hall would actually have cost more.
Investing £15,000,000 would yield roughly £1,000,000 a year on the stock market. If you are spending a million pounds on the venue alone for a 1,000-person conference, you are not doing it right. A convention hall typically runs in the tens of thousands of pounds, not the millions. This is a 100x markup.
This comment suggests that renting conference venues in Oxford can be pretty expensive:
https://forum.effectivealtruism.org/posts/xof7iFB3uh8Kc53bG/why-did-cea-buy-wytham-abbey?commentId=3yeffQWcRFvmeteqc
Your cost estimates seem to be in the wrong order of magnitude.
The calculations there are completely correct under the assumption that the space is being used 365 days a year, which strikes me as wildly implausible. I was working on the assumption that the space is used a few days each year. If this space is actually being occupied 100% of the time, I’d gladly retract my criticism.
The actual usage of the abbey is very likely to be somewhere between these two numbers. Definitely I would expect it to be used far more than for one major conference per year, but I wouldn’t expect 100% usage either.
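For concreteness, the disagreement above comes down to a simple break-even calculation. Here is a rough sketch: the £15m price comes from the thread, but the 7% return rate and the £25k-per-event-day venue hire cost are illustrative assumptions, not actual figures.

```python
# Break-even arithmetic for "buy vs. rent" (illustrative assumptions).

purchase_price = 15_000_000   # £, ballpark figure discussed in the thread
annual_return = 0.07          # assumed long-run market return (assumption)
rental_per_day = 25_000       # £ per event day to hire a venue (assumption)

# Capital tied up in the building forgoes roughly this much per year:
opportunity_cost = purchase_price * annual_return

# Days of use per year at which buying matches renting:
break_even_days = opportunity_cost / rental_per_day

print(f"Opportunity cost: £{opportunity_cost:,.0f} per year")
print(f"Break-even usage: {break_even_days:.0f} event days per year")
```

On these assumptions, the purchase only beats renting if the venue hosts something like 40+ event days a year, which is why the utilisation question is the crux here: a few days of use per year and renting wins easily; near-constant use and buying wins.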
It depends. In isolation, that statement does seem concerning to me, like they may have been overestimating the potential negative optics.
What matters to me here is whether sufficient thought was put into all the different aspects. Clearly, they thought a lot about the non-optics stuff. I have no way of easily evaluating those kinds of statements, as I have very little experience organizing conferences. But I’m concerned that maybe there wasn’t sufficient thought given to just how bad the optics can get with this sort of thing.
My career has been in communications, so I’m used to thinking about PR risks and advocating for thinking about those aspects. Perhaps I’m posting here with a bias from that point of view. If I were in a room with decision-makers, I’d expect my comments here to be balanced by arguments on the other side.
Even so, my suspicion is that, if you write something like “do what really is good rather than just what seems good”, you’re more likely to be underestimating rather than overestimating PR risks.
FWIW, as someone who also works in communications, I strongly disagree here and think EA spends massively too much of its mental energy thinking about optics.
More specifically:
I tend to criticize virtue ethics and deontology a lot more than I praise them—IMO these are approaches that often go badly wrong. But I think PR (for a community like EA) is an area where deontology-like adherence to “behave honestly and with integrity” and virtue-ethics-like focus on “be the sort of person internally who you would find most admirable and virtuous” tends to have far better consequences than “select the action that naively looks as though it will make others like you the most”.
If you’re an EA and you want to improve EA’s reputation, my main advice to you is going to look very virtue-ethics-flavored: be brave, be thoughtful, be discerning, be honest, be honorable, be fair, be compassionate, be trustworthy; and insofar as you’re not those things, be honest about it (because honesty is on the list, and is a precondition for trusting everything else about your apparent virtues); and let your reputation organically follow from the visible signs of those internal traits of yours, rather than being a thing you work hard on optimizing separately from optimizing whether you’re actually an awesome person.
Have integrity, and speak truth even when you’re scared to, and be the sort of person you’d have found inspiring to run into in your early days at EA, if someone could read your mind and see the generators of your behavior.
Do stuff that you feel really and deeply proud of, rather than stuff that you’d be embarrassed by if someone fully understood what you were doing and why, context and all.
I think that for all or nearly-all EAs, that should pretty much be the entire focus of their thoughts about EA’s reputation.
My take is about 90% in agreement with this.
The other 10% is something like: “But sometimes adding time and care to how, when, and whether you say something can be a big deal. It could have real effects on the first impressions you, and the ideas and communities and memes you care about, make on people who (a) could have a lot to contribute on goals you care about; (b) are the sort of folks for whom first impressions matter.”
10% is maybe an average. I think it should be lower (5%?) for an early-career person who’s prioritizing exploration, experimentation and learning. I think it should be higher (20%?) for someone who’s in a high-stakes position, has a lot of people scrutinizing what they say, and would lose the opportunity to do a lot of valuable things if they substantially increased the time they spent clearing up misunderstandings.
I wish it could be 0% instead of 5-20%, and this emphatically includes what I wish for myself. I deeply wish I could constantly express myself in exploratory, incautious ways—including saying things colorfully and vividly, saying things I’m not even sure I believe, and generally ‘trying on’ all kinds of ideas and messages. This is my natural way of being; but I feel like I’ve got pretty unambiguous reasons to think it’s a bad idea.
If you want to defend 0%, can you give me something here beyond your intuition? The stakes are high (and I think “Heuristics are almost never >90% right” is a pretty good prior).
Frankly I would think that there was finally someone with a modicum of sense and understanding of basic PR working in the area. And upgrade my views of the competency of the organisation accordingly.
Also, I’d note that “this will save money in the long run” is a fairly big claim that has not been justified. There are literally hundreds of conference venues within a reasonable distance of Oxford, all of which are run by professional event managers who are able to take advantage of specialisation and economies of scale, which makes this claim difficult to believe.
Optics is real. We live in the real world. Optics factor into QALYs or any other metric. Why would we ignore reputation-related effects when they are fully real?
I feel a bit awkward quoting the Bible, but there’s one part that’s super relevant to this discussion from a secular perspective. It’s 1 Corinthians 8:6–13, and it basically says, “hey, we know doing X isn’t bad, but anyone seeing us do X would think we’re casting away our principles, which would cause them to do wrong, so we’re not going to do X.” Here’s the quote:
yet for us there is one God, the Father, from whom are all things and for whom we exist, and one Lord, Jesus Christ, through whom are all things and through whom we exist. However, not all possess this knowledge. But some, through former association with idols, eat food as really offered to an idol, and their conscience, being weak, is defiled. Food will not commend us to God. We are no worse off if we do not eat, and no better off if we do. But take care that this right of yours does not somehow become a stumbling block to the weak. For if anyone sees you who have knowledge eating in an idol’s temple, will he not be encouraged, if his conscience is weak, to eat food offered to idols? And so by your knowledge this weak person is destroyed, the brother for whom Christ died. Thus, sinning against your brothers and wounding their conscience when it is weak, you sin against Christ. Therefore, if food makes my brother stumble, I will never eat meat, lest I make my brother stumble.
It also comes off as quite manipulative and dishonest, which puts people off. There are many people who’ll respect you if you disagree with them but state your opinion plainly and clearly, without trying to hide the weird or objectionable parts of your view. There are relatively few who will respect you if they find out you tried to manipulate their opinion of you, prioritizing optics over substance.
And this seems especially harmful for EA, whose central selling point is “we’re the people who try to actually do the most good, not just signal goodness or go through the motions”. Most public conversations about EA optics are extremely naive on this point, treating it as a free action for EAs to spend half their time publicly hand-wringing about their reputations.
What sort of message do you think that sends to people who come to the EA Forum for the first time, interested in EA, and find the conversations dominated by reputation obsession, panicky glances at the news cycle, complicated strategies to toss first-order utility out the window for the sake of massaging outsiders’ views of EA, etc.? Is that the best possible public face you could pick for EA?
In fact, I don’t think that we should adopt the stance “be so terrified of PR risks that you refuse to talk about PR”. I think EA should blurt far more than it currently does, and this will inevitably mean talking at least a little about people’s emotional fears re looking weird to others, being embarrassed to do something, etc.
But recognizing the deep PR costs of EA’s long-standing public obsession with reputation management is at least a first step in the direction of unraveling the obsession for some people, I’d hope.
Yeah, I totally agree. I’d agree with the statement “it’s helpful to take optics into account, but not to let them dominate our decision-making process”. My original comment was in response to the idea that “actually doing good is more important than looking like doing good”, which I would argue is an oversimplification of the real world and not a good principle. I don’t think it’s helpful to care entirely about optics or to never care about optics; it’s more nuanced than that.
I also think it could help to break down the term “optics” a bit. I think the purchase is bad for first impressions, which is one particular type of optics.
Anyway, this whole discussion about optics is kind of a red herring. People will be shocked by the purchase because it was made by a charity and was pretty exorbitant, and in fact it was (by that one guy’s admission… I’m on a phone and don’t want to look up his name in the comment above) purchased to make conference participants feel inspired, not as a cost-saving mechanism. Appearance (not being frugal) reflects reality in this case, at least based on that comment I read by that one guy (and if I’m wrong, just let me be wrong at this point; I have work to do and don’t care to debate this further).
But yeah I agree about let’s not wholly concentrate on optics. Of course.
Let’s say we had one charitable person who has a reputation for being charitable, and another charitable person who has a reputation for hurting others. Someone needing charity would avoid the latter, even though the latter is also beneficial.
There’s a big difference between trying to represent yourself in an accurate or an inaccurate way. In either case you’re caring about what people think about you, but if we assume the perceiver is acting in their self interest, then the accurate representation will benefit them, and the inaccurate representation may harm them.
I’m not disagreeing with what you wrote. I’m adding to it that “caring about optics” can actually be more honest. It’s possible to care about optics so that you’re represented honestly, too.
SBF caused damage not because he virtue signaled with his cheap car and lack of style, but because he was misrepresenting himself and being a dick.
It makes sense for people to talk about not wanting to be misrepresented, and if I were a new visitor to the forum and I saw people upset about being misrepresented, I’d probably be sympathetic to them. I also might think they were crybabies who were too obsessed with their image, which is what you’re saying could be the case, and I agree with that.
Also just by the way, I guess the ideal would be to care what other people think but be strong enough to do what one thinks is right. I think there’s a psychological element to all this. I’ve lived in some places where I was looked down on, even though I was working hard for their benefit, and it did suck psychologically. It would’ve been better for everyone if people had known how much I cared about them, but yeah it can be important to not worry too much about what other people think, as you wrote.
I do think it’s great when EAs see a false thing being said about EA and they speak up to say what they think the true thing is.
Which can look like “optics”, and is not exactly the same as “just be internally virtuous and go about your business”.
Ideally, though, I think this would apply to false positive claims about EA, as well as false negative claims about EA.
Otherwise, discerning people will be rightly skeptical when you object to criticisms of EA, or will generally suspect you of cherry-picking evidence when you want to push a specific conclusion.
I don’t think EA should be indifferent to “other human beings have false beliefs about something that seems very important to me”.
And plausibly a few EAs should specialize in thinking about EA’s reputation, and go further still. Though I think EA PR specialists should still ground their activities in “first and foremost, be actually-virtuous and honest”.
The stronger claim I want to make is less “never ever think about EA’s reputation”, more “make it your goal to give people accurate models of EA, rather than to improve EA’s reputation per se”.
Treat people (including non-EAs) more like agents, and more like peers.
Even when you don’t actually think someone is your peer, I think that on the whole it turns out to be a really good behavioral policy to share information with people like you would with a (new-to-this-topic) peer.
Try to empower them by giving them more accurate models, even if you’re quietly worried they’ll screw it up somehow. Even when it’s tempting to try to steer people by feeding them subtly-unrepresentative data, resist the temptation and just tell them what your actual model is.
You’ll lose some winnable battles that way, but I think the net effect of EA adopting this approach is that we’re a lot likelier to win the war and cause things to go well in the future.
EA is typically an (honest) virtue signal to sufficiently savvy, discerning assessors of virtue.
This isn’t the only reason people do EA, but it’s a good and important reason, because virtue is a good thing and therefore it’s good when there are incentives to honestly signal it.
Like, the central reason EA is a good idea is that helping people is good, and EA is a good approach / community / body of knowledge for achieving that.
But if something is a way to help others, then it’s also typically a way to show that you care about others (and are smart, discerning, etc.)
And that’s good too. Because it’s socially valuable to have ways to accurately signal those characteristics to others, and because (insofar as visibly having those characteristics is socially rewarded) this increases the incentive to actually do good.
This also suggests that some level of insularity and elitism is prosocial, as long as you don’t fuck up the process of picking which people to trust and value the opinions of.
“EA is a signal of goodness” incentivizes actual goodness insofar as either (a) EA’s good in a way that’s maximally legible, impressive, and emotionally salient to a Random Human Being, or (b) you care more about signaling goodness to the most discerning and knowledgeable people.
It’s the same mechanism as, e.g., an academic field: the quality of academic discourse depends on the fact that physicists care more about impressing their colleagues (plus visibly smart and discerning outsiders), than about impressing random CNN viewing audiences.
If physicists cared more about impressing CNN watchers, then there would be a lot more wannabe Deepak Chopras, people engaged in hyperbole and fuzzy thinking, etc. It’s important that physicists care primarily about elite opinion, and it’s important that they choose the right elite.
[...] Embracing the wrong elites can potentially be arbitrarily bad, because any group can be thought of as an “elite” relative to some selection process. For some people, Infowars is elite opinion, and Infowars fans are the specialists you try to impress. For others, theologians are the elite. Or nutrition scientists, or psi researchers, or your favorite political party, or...
Picking the wrong elite to signal to (and defer to, etc.) can be a complete disaster. That, plus general pressures to virtue-signal egalitarianism, causes a lot of people to try to find some way to avoid having to pick an “elite”.
But populism faces the same core problem as elitism: just as elites can be wrong, Mass Opinion can be wrong too.
In the end, there’s no good alternative to putting in the hard work to identify which people have a stronger grasp on reality.
But once you’ve put in that work, insofar as you feel confident about the result, you should indeed care more about the opinion of people who have accurate beliefs than about what a random person on the street thinks of you.
This has failure modes, because every option has failure modes. But the failure mode “the physics community mostly just cares about popular book sales a la Deepak Chopra” is a lot worse than the failure mode “physicists sometimes put too much weight on a researcher’s bad work”.
And it’s a lot worse, at the meta / civilizational level, for humanity to have no true community of rigorous physicists, than for humanity to have rigorous physicists and theologians. Some things aren’t worth giving up in order to prevent the existence of theology.
I can see this point, but I’m curious—how would you feel about the reverse? Let’s say that CEA chose not to buy it, and instead did conferences the normal way. A few months later, you’re talking to someone from CEA, and they say something like:
Yeah, we were thinking of buying a nice place for these retreats, which would have been cheaper in the long run, but we realised that would probably make us look bad. So we decided to eat the extra cost and use conference halls instead, in order to help EA’s reputation.
Would you be at all concerned by this statement, or would that be a totally reasonable tradeoff to make?
+1 to Jay’s point. I would probably just give up on working with EAs if this sort of reasoning were dominant to that degree? I don’t think EA can have much positive effect on the world if we’re obsessed with reputation-optimizing to that degree; it’s the sort of thing that can sound reasonable to worry about on paper, but in practice tends to cause more harm than good to fixate on in a big way.
(More reputational harm than reputational benefit, of the sort that matters most for EA’s ability to do the most good; and also more substantive harm than substantive benefit.
Being optics-obsessed is not a good look! I think this is currently the largest reputational problem EA currently actually faces: we promote too much of a culture of fearing and obsessing over optics and reputational risks.)
I think a movement is shaped to a rather large degree by its optics/culture, because that is what will determine who joins and to a lesser extent, who stays when things go wrong.
It seems plausible to me that a culture of somewhat spartan frugality, which seems (from my relatively uninformed perspective) like it was a larger part of the movement in the past, would have a larger positive impact on EA conferences than the stimulating-ness of the site. There’s something poetic about working harder in less onerous conditions than others would, forgoing luxury for extra donations, that I would imagine is at least as animating to the types of people in EA as scenery.
Beyond that, preserving core cultural aspects of a movement, even if the cost is substantial, is crucial to the story that the movement aims to tell.
Most people who are EAs today were inspired by the story of scrappy people gathering in whatever way is cheapest and most accessible, cheeks flushed with intellectual passion, figuring out how to stretch their dollars for the greater good. I think this aesthetic differs substantially from that of AI researchers in a castle, in terms of both losing the “slumming it out for the world” vibe and focusing on the reduction of an existential risk in a way that only a few people can understand rather than global development in a way that everyone can understand.
I’m sure the AI researchers are extremely competent and flushed with intellectual passion for the greatest good too, regardless of where they’re working. Maybe even more so in the castles. I am solely critiquing the optics and their potential cultural effect.
I have little formal evidence for this except the interest in and occasional resistance to the shift towards longtermism that seems widespread on the forum and a few external articles on EA. But I strongly suspect that “people with a career relating to longtermism” is an attractive archetypal representation of the epitome EA to far fewer people than “person who argues about the best place to donate, and donates as much as they can”, because the latter is much more relatable and attainable.
Perhaps an EA mostly focused on attracting select candidates for high impact careers will be more impactful than an EA attempting to make a wide, diffuse cultural impact by including many grassroots supporters. However, it seems that this runs the risk of modifying the target audience of EA from “everyone, because nearly everyone can afford at least 1% with a giving pledge” to .1% of the population of developed countries.
To me, it is at least plausible that the sheer cost of losing the grassroots-y story, paid in fewer, perhaps less-ideologically-committed new recruits, and a generally less positive public view of things related to effective altruism and rationality, could swing the net effect in the other direction. I think the mainstream being influenced over time to be more concerned with sentient beings, more concerned with rationality and calculating expected values on all sorts of purchases/donations, etc is a major potential positive impact that a more outward-facing EA could make.
If EA loses hold of the narrative and becomes, in the eye of the public, “sketchy, naive Masonic elites who only care about their own pet projects, future beings and animals”, I believe the cost to both EA and broader society will be high. Anecdotally, I have seen external articles critiquing EA from these angles, but never from the angle “EA worries too much about its own image”.
I refuse to believe that renting out a conference hall would actually have cost more.
Investing £15,000,000 a year would yield roughly £1,000,000 a year on the stock market. If you are spending a million pounds on the venue alone for a 1,000 person conference, you are not doing it right. A convention hall typically runs in the tens of thousands of dollars, not the millions. This is a 100x markup.
This comment suggests that renting conference venues in Oxford can be pretty expensive:
https://forum.effectivealtruism.org/posts/xof7iFB3uh8Kc53bG/why-did-cea-buy-wytham-abbey?commentId=3yeffQWcRFvmeteqc
Your cost estimates seems to be in the wrong order of magnitude.
The calculations there are completely correct under the assumption that the space is being used 365 days a year, which strikes me as wildly implausible. I was working on the assumption the is space used a few days each year. If this space is actually being occupied 100% of the time, I’d gladly retract my criticism.
The actual usage of the abbey is very likely to be somewhere between these two numbers. Definitely I would expect it to be used far more than for one major conference per year, but I wouldn’t expect 100% usage either.
It depends. In isolation, that statement does seem concerning to me, like they may have been overestimating the potential negative optics.
What matters to me here is whether sufficient thought was put into all the different aspects. Clearly, they thought a lot about the non-optics stuff. I have no way of easily evaluating those kinds of statements, as I have very little experience organizing conferences. But I’m concerned that maybe there wasn’t sufficient thought given to just how bad the optics can get with this sort of thing.
My career has been in communications, so I’m used to thinking about PR risks and advocating for thinking about those aspects. Perhaps I’m posting here with a bias from that point of view. If I were in a room with decision-makers, I’d expect my comments here to be balanced by arguments on the other side.
Even so, my suspicion is that, if you write something like “do what really is good rather than just what seems good”, you’re more likely to be underestimating rather than overestimating PR risks.
FWIW, as someone who also works in communications, I strongly disagree here and think EA spends massively too much of its mental energy thinking about optics.
More specifically:
I tend to criticize virtue ethics and deontology a lot more than I praise them—IMO these are approaches that often go badly wrong. But I think PR (for a community like EA) is an area where deontology-like adherence to “behave honestly and with integrity” and virtue-ethics-like focus on “be the sort of person internally who you would find most admirable and virtuous” tends to have far better consequences than “select the action that naively looks as though it will make others like you the most”.
If you’re an EA and you want to improve EA’s reputation, my main advice to you is going to look very virtue-ethics-flavored: be brave, be thoughtful, be discerning, be honest, be honorable, be fair, be compassionate, be trustworthy; and insofar as you’re not those things, be honest about it (because honesty is on the list, and is paramount to trusting everything else about your apparent virtues); and let your reputation organically follow from the visible signs of those internal traits of yours, rather than being a thing you work hard on optimizing separately from optimizing whether you’re actually an awesome person.
Have integrity, and speak truth even when you’re scared to, and be the sort of person you’d have found inspiring to run into in your early days at EA, if someone could read your mind and see the generators of your behavior.
Do stuff that you feel really and deeply proud of, rather than stuff that you’d be embarrassed by if someone fully understood what you were doing and why, context and all.
I think that for all or nearly-all EAs, that should pretty much be the entire focus of their thoughts about EA’s reputation.
My take is about 90% in agreement with this.
The other 10% is something like: “But sometimes adding time and care to how, when, and whether you say something can be a big deal. It could have real effects on the first impressions you, and the ideas and communities and memes you care about, make on people who (a) could have a lot to contribute on goals you care about; (b) are the sort of folks for whom first impressions matter.”
10% is maybe an average. I think it should be lower (5%?) for an early-career person who’s prioritizing exploration, experimentation and learning. I think it should be higher (20%?) for someone who’s in a high-stakes position, has a lot of people scrutinizing what they say, and would lose the opportunity to do a lot of valuable things if they substantially increased the time they spent clearing up misunderstandings.
I wish it could be 0% instead of 5-20%, and this emphatically includes what I wish for myself. I deeply wish I could constantly express myself in exploratory, incautious ways—including saying things colorfully and vividly, saying things I’m not even sure I believe, and generally ‘trying on’ all kinds of ideas and messages. This is my natural way of being; but I feel like I’ve got pretty unambiguous reasons to think it’s a bad idea.
If you want to defend 0%, can you give me something here beyond your intuition? The stakes are high (and I think “Heuristics are almost never >90% right” is a pretty good prior).
Frankly, I would think that there was finally someone with a modicum of sense and understanding of basic PR working in the area, and I would upgrade my view of the organisation's competence accordingly.
Also, I’d note that “this will save money in the long run” is a fairly big claim that has not been justified. There are literally hundreds of conference venues within a reasonable distance of Oxford, all of which are run by professional event managers who are able to take advantage of specialisation and economies of scale, which makes the claim difficult to believe.
Optics is real. We live in the real world. Optics factors into QALYs or any other metric. Why would we ignore reputation-related effects when they are fully real?
I feel a bit awkward quoting the Bible, but there’s one part that’s super relevant to this discussion from a secular perspective. It’s 1 Corinthians 8:6–13, and is basically like, “hey, we know doing X isn’t bad, but anyone seeing us doing X would think we’re casting away our principles, which would cause them to do wrong, so we’re not going to do X.” Here’s the quote:
yet for us there is one God, the Father, from whom are all things and for whom we exist, and one Lord, Jesus Christ, through whom are all things and through whom we exist. However, not all possess this knowledge. But some, through former association with idols, eat food as really offered to an idol, and their conscience, being weak, is defiled. Food will not commend us to God. We are no worse off if we do not eat, and no better off if we do. But take care that this right of yours does not somehow become a stumbling block to the weak. For if anyone sees you who have knowledge eating in an idol’s temple, will he not be encouraged, if his conscience is weak, to eat food offered to idols? And so by your knowledge this weak person is destroyed, the brother for whom Christ died. Thus, sinning against your brothers and wounding their conscience when it is weak, you sin against Christ. Therefore, if food makes my brother stumble, I will never eat meat, lest I make my brother stumble.
Here’s an explanation of some of the reasons it’s often harmful for a community to fixate on optics, even though optics is real: https://www.lesswrong.com/posts/Js34Ez9nrDeJCTYQL/politics-is-way-too-meta
It also comes off as quite manipulative and dishonest, which puts people off. There are many people who’ll respect you if you disagree with them but state your opinion plainly and clearly, without trying to hide the weird or objectionable parts of your view. There are relatively few who will respect you if they find out you tried to manipulate their opinion of you, prioritizing optics over substance.
And this seems especially harmful for EA, whose central selling point is “we’re the people who try to actually do the most good, not just signal goodness or go through the motions”. Most public conversations about EA optics are extremely naive on this point, treating it as a free action for EAs to spend half their time publicly hand-wringing about their reputations.
What sort of message do you think that sends to people who come to the EA Forum for the first time, interested in EA, and find the conversations dominated by reputation obsession, panicky glances at the news cycle, complicated strategies to toss first-order utility out the window for the sake of massaging outsiders’ views of EA, etc.? Is that the best possible public face you could pick for EA?
In fact, I don’t think that we should adopt the stance “be so terrified of PR risks that you refuse to talk about PR”. I think EA should blurt far more than it currently does, and this will inevitably mean talking at least a little about people’s emotional fears re looking weird to others, being embarrassed to do something, etc.
But recognizing the deep PR costs of EA’s long-standing public obsession with reputation management is at least a first step in the direction of unraveling the obsession for some people, I’d hope.
Yeah I totally agree. I’d agree with the statement “it’s helpful to take optics into account, but not let it dominate our decision making process”. My original comment was in response to the idea that ‘actually doing good is more important than looking like doing good’ which I would argue is an oversimplification of the real world and not a good principle. I don’t think that it’s helpful to care entirely about optics or never care about optics. It’s more nuanced.
I also think it could help to break down the term “optics” a bit. I think the purchase is bad for first impressions, which is one particular type of optics.
Anyways, this whole discussion about optics is kind of a red herring. People will be shocked by the purchase because it was by a charity and was pretty exorbitant, and in fact it was (by that one guy’s admission… I’m on a phone and don’t want to look up his name in the comment above) purchased to make conference participants feel inspired, not as a cost-saving mechanism. Appearance (not being frugal) reflects reality in this case, at least based on that comment I read by that one guy (and if I’m wrong, just let me be wrong at this point; I have work to do and don’t care to debate this further).
But yeah I agree about let’s not wholly concentrate on optics. Of course.
Let’s say we had one charitable person who has a reputation for being charitable, and another charitable person who has a reputation for hurting others. Someone needing charity would avoid the latter, even though the latter is also beneficial.
There’s a big difference between trying to represent yourself in an accurate or an inaccurate way. In either case you’re caring about what people think about you, but if we assume the perceiver is acting in their self interest, then the accurate representation will benefit them, and the inaccurate representation may harm them.
I’m not disagreeing with what you wrote. I’m adding to it that “caring about optics” can actually be more honest. It’s possible to care about optics so that you’re represented honestly, too.
SBF caused damage not because he virtue signaled with his cheap car and lack of style, but because he was misrepresenting himself and being a dick.
It makes sense for people to talk about not wanting to be misrepresented, and if I were a new visitor to the forum and I saw people upset about being misrepresented, I’d probably be sympathetic to them. I also might think they were crybabies who were too obsessed with their image, which is what you’re saying could be the case, and I agree with that.
Also, just by the way, I guess the ideal would be to care what other people think but be strong enough to do what one thinks is right. I think there’s a psychological element to all this. I’ve lived in some places where I was looked down on, even though I was working hard for the residents’ benefit, and it did suck psychologically. It would’ve been better for everyone if people had known how much I cared about them, but yeah, it can be important not to worry too much about what other people think, as you wrote.
Some related stuff I’ve said:
And: