Is wholesomeness basically consequentialism but with more vibes and less numbers?
I think it’s partially that (where the point of the vibes is often that they’re helpful for tracking things which aren’t directly good/bad themselves, but which have an increased chance of causing good or bad things down the line). It’s also a bit like “consequentialism, but with some extra weight on avoiding negative consequences for the things you’re interacting with” (where the point of this is that it distributes responsibility sensibly across agents).
I think you actually have a really revisionist account of ‘wholesomeness’ - so revisionist that I think you should probably just pick a new word.
I agree that my sense is somewhat revisionist, although I think it’s more grounded in the usual usage than you’re giving it credit for. I did consider choosing a different word, but after talking it through with some folks I felt better about “wholesome” than the alternatives. The main way in which I care about the vibes of the existing word is for people putting it into practice. I think if people ask of an action they’re considering “what are the ways in which this might not be wholesome?”, it will be easy to feel answers to that question. If I instead defined “holistically-good” and people asked “what are the ways in which this might not be holistically-good?”, I think they’d get more caught up in explicit verbal models and miss things they might have caught as answers to the first question.
Put another way: one of my guiding principles here is to give an account of things that I think could lead people to the good, ambitious versions of EA, but such that I find it hard to imagine that SBF, trying to follow it, could have made the same mistakes, even with motivated cognition pushing in that direction. Stuff about side constraints doesn’t feel robust enough to me for this. A new term would be vulnerable to a lot of reinterpretation; anchoring in the existing term helps resist that.
Maybe I’m supposed to separate more explicitly the thing you run felt mental checks against from the thing you overall choose to pursue? I’m worried that would get contrived.