What are you referring to when you say "naive consequentialism"?[1] Because I'm not sure it's what others reading might take it to mean.
You seem critical of the current plan to sell Wytham Abbey, but I think many critics view the original purchase as an act of naive consequentialism that ignored the side effects it's had, such as reinforcing negative views of EA. Can both the purchase and the sale be cases of naive consequentialism? Are they the same kind of thing?
So I'm not sure the three respondents from the MCF and you have the same thing in mind when you talk about naive consequentialism, and I'm not quite sure I do either.
[1] Both here and in this other example, for instance.
The issue is that there are degrees of naiveness. Oliver's view, as I understand it, is that there are at least three positions:
Maximally Naive: Buy nice event venues, because we need more places to host events.
Moderately Naive: Don't buy nice event venues, because it's more valuable to convince people that we're frugal and humble than it is to host events.
Non-Naive: Buy nice event venues, because we need more places to host events, and the value of signaling frugality and humility is in any case lower than the value of signaling that we're willing to do weird and unpopular things when the first-order effects are clearly positive. Indeed, trying to look frugal here may even cause more harm than benefit, since:
(a) it nudges EA toward being a home for empty virtue-signalers instead of people trying to actually help others, and
(b) it nudges EA toward being a home for manipulative people who are obsessed with controlling others' perceptions of EA, as opposed to EA being a home for honest, open, and cooperative souls who prize doing good and causing others to have accurate models over having a good reputation.
Optimizing too hard for reputation can get you into hot water, because you've hit the sour spot of being too naive to recognize that many others can see what you're doing and discount your signals accordingly, but not naive enough to just blithely do the obvious right thing without overthinking it.
There are obviously cases where reputation matters for impact, but many people fall into the trap of fixating on reputation when they lack the skill to take into account enough higher-order effects.
(Of course, the above isn't the only reason people might disagree on the utility of event venues. If you think EA is mainly bottlenecked on research and ideas, then you'll want to gather people together to solve problems and share their thoughts. If you instead think EA's big bottleneck is that we aren't drawing in enough people to donate to GiveWell top charities, then you should think events are a lot less useful, unless maybe it's a very large event targeted at drawing in new people to donate.)
I think this captures some of what I mean, though my model is also that the "Maximally Naive" view is not very stable, in that if you are being "maximally naive" you do often end up just lying to people (because the predictable benefits from lying to people outweigh the predictable costs in that moment).
I do think being "maximally naive" combined with strong norms against deception and in favor of honesty can work, though in general people want good reasons for following norms, and arguing for honesty requires some non-naive reasoning.