Thank you for this thoughtful reply! I appreciate it, and the disambiguation is helpful. (I would personally like to do as much thinking-in-public about this stuff as seems feasible.)
I mean a combination of (1) and (4).
I didn’t use to believe that (4) was a thing, but then I started to notice (usually unconscious) patterns of (4) behavior arising in me, and as I investigated further I kept noticing more and more of it, so now I think it’s really a thing (since I don’t believe I’m an outlier in this regard).
(4) is the interesting version of this claim, and I think there’s some truth to it. I also think that this problem is much more widespread than just our own community, and fixing it is likely one of the core bottlenecks for civilization as a whole.
I agree with this. I think EA and Bay Area Rationality still have a plausible shot at shifting out of this equilibrium, whereas most communities don’t (they’re not self-reflective enough, too tribal, too angry, etc.).
I think part of the problem is that people get triggered into defensiveness. When they mentally simulate (or emotionally half-simulate) setting up a feedback mechanism that might tell them they’re doing the wrong thing, their anticipations put a lot of weight on the possibility that they’ll be shamed and punished, and not much weight on the possibility that they’ll be able to switch to something else that works better.
Yes, this is a good statement of one of the equilibria that it would be profoundly good to shift out of. Core transformation is one operationalization of how to go about this.