The problem with considering optics is that it’s chaotic. I think Wytham is a reasonable example. You might want a fancy space so you can have good optics—imagining that you need to convince fancy people of things, otherwise they won’t take you seriously. Or you might imagine that it looks too fancy, and then people won’t take you seriously because it looks like you’re spending too much money.
Pretty much everything in “PR” has weird nonlinear dynamics like this. I won’t say it’s completely unpredictable, but I do think it’s quite hard to predict, subtleties really matter, and most people seem overconfident; I think “bad optics” only looks predictable in hindsight. It also changes quickly, like fashion: what seems like bad optics now could be good countersignaling in a year, and standard practice in three.
It’s a better heuristic to focus on things which are actually good for the world, consistent with your values. I think in most cases, if you can justify your actions as consistent with a set of values, you can survive most short-term optical disasters and even come out of them stronger.
The problem with considering optics is that it’s chaotic.
The world is chaotic, and everything EAs try to do has a largely unpredictable long-term effect because of complex dynamic interactions. We should try to think through the contingencies and make the best guess we can, but completely ignoring chaotic considerations just seems impossible.
It’s a better heuristic to focus on things which are actually good for the world, consistent with your values.
This sounds good in principle, but there are a ton of things that might conceivably be good but for PR reasons, where the PR reasons are decisive. E.g., should EAs engage in personal harassment campaigns against productive ML researchers in order to slow AI capabilities research? Maybe that would be good if it weren’t terrible PR, but I think we very obviously should not do it, because it would be terrible PR.
Holding conferences is not “actually good for the world” in any direct sense. It is good only to the extent that it results in net good outcomes—and you’re quite right that those outcomes can be hard to predict. What I think we have to be careful to avoid is crediting the hoped-for positive aspects while dismissing the negative aspects as “optics” that cannot be adequately predicted.
Also, you could always commission a survey to generate at least some data on how the public would perceive an action. That doesn’t give much confidence about what the actual perception would be . . . but these sorts of things are hard to measure/predict on both the positive and negative ends. If people are just too unpredictable to make EV estimates based on their reactions to anything, then we should just hold all conferences at the local Motel 6 or wherever the cheapest venue is. “Dollars spent” is at least measurable.
I agree with this. It’s also not clear where to draw the boundary. If even well-informed people who shared your worldview and values thought a given purchase was bad, then there’s no need to call it “optics” – it’s just a bad purchase.
So “optics” is about what people think who either don’t have all the info or who have different views and values. There’s a whole range of potential differences here that can affect what people think.
Some people are more averse to spending large amounts of money without some careful process in place to prevent corruption. Some people might be fine with the decision but would’ve liked to see things addressed and explained more proactively. Some people may have uncharitable priors toward EA or toward everyone (including themselves?), so they’d never accept multi-step arguments about why some investment is actually altruistic if it superficially looks like what a selfish rich person would also buy. And maybe some people don’t understand how investments work (the fact that you can sell something again and get money back).
At the extreme, it seems unreasonable to give weight to all the ways a decision could cause backlash – some of the viewpoints I described above are clearly stupid.
At the same time, factoring in that there are parts of EA that would welcome more transparency or some kind of process designed to prevent risk of corruption – that seems fine/good.
Relevant: “PR” is corrosive; “reputation” is not.