This isn’t a toxic social norm. This is the point of rationality, is it not?

Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can’t justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively and intensely optimising for these things will not actually win (e.g. lead to the formation of accurate beliefs; save the world). Specifically:
Rationality: often a deep integration with your feelings is required to form accurate beliefs—paying attention to a note of confusion, or to something you can’t explain in rational terms yet. Indeed, sometimes it is harmful to impose “rationality” constraints too early, because you will tend to lose the information that can’t immediately comply with those constraints. Another example is defying social norms because one cannot justify them, only to later realise that they served some important function.

EA: Burnout; depression.
Sure, but that isn’t what the quoted text is saying. Trusting your gut or following social norms are not even on the same level as woo, or adopting beliefs with no justification.
If the harmful social norms Sasha actually had in mind were “not trusting your gut” and “violating social norms with no gain”, then I’d agree these are bad, and possibly a result of social norms in the rationality community. Another alternative is that the community is made up of a bunch of socially awkward nerds, who are known for their social ineptness and inability to trust their gut.
But as it stands, this doesn’t seem to be what’s being argued; the quoted text is at best tangential to what you said.
There are a few different ways of interpreting the quote, but there’s a concept of public positions and private guts. Public positions are ones that you can justify in public if pressed on, while private guts are illegible intuitions you hold which may nonetheless be correct—e.g. an expert mathematician may have a strong intuition that a particular proof or claim is correct, which they will then eventually translate to a publicly-verifiable proof.
As far as I can tell, lizards probably don’t have public positions, but they probably do have private guts. That suggests those guts are good for predicting things about the world and achieving desirable world states, as well as being one of the channels by which the desirability of world states is communicated inside a mind. It seems related to many sorts of ‘embodied knowledge’, like how to walk, which is not understood from first principles or in an abstract way, or habits, like adjective order in English. A neural network that ‘knows’ how to classify images of cats, but doesn’t know how it knows (or is ‘uninterpretable’), seems like an example of this. “Why is this image a cat?” → “Well, because when you do lots of multiplication and addition and nonlinear transforms on pixel intensities, it ends up having a higher cat-number than dog-number.” This seems similar to gut senses that are difficult to articulate; “why do you think the election will go this way instead of that way?” → “Well, because when you do lots of multiplication and addition and nonlinear transforms on environmental facts, it ends up having a higher A-number than B-number.” Private guts also seem to capture a category of amorphous visions; a startup can rarely write a formal proof that their project will succeed (generally, if they could, the company would already exist). The postrigorous mathematician’s hunch falls into this category, which I’ll elaborate on later.
As another example, in the recent dialogue on AGI alignment, Yudkowsky frequently referenced having strong intuitions about how minds work that come from studying specific things in detail (and from having “done the homework”), but which he does not know how to straightforwardly translate into a publicly justifiable argument.
Private guts are very important and arguably the thing that mostly guides people’s behavior, but they are often also beliefs that the person can’t justify. If a person felt like they should reject any beliefs they couldn’t justify, they would quickly become incapable of doing anything at all.

Separately, there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world.
If this is what the line was saying, I’d agree. But it’s not: having intuitions plus a track record (or some other reason to believe those intuitions correlate with reality), or having models that are useful despite being known to be not quite true, is a far cry from holding unjustified beliefs and believing in woo, and the lack of the latter is what the post actually claims is the toxic social norm in rationality.
What makes you think it isn’t? To me it seems both like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can’t necessarily justify, and it’s talking about having beliefs you can’t justify) and like a dynamic I recognize as having occasionally been present in the community. Fortunately posts like the one about private guts have helped push back against it.
Even if this interpretation wasn’t actually the author’s intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.
What makes you think it isn’t? To me it seems both like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can’t necessarily justify, and it’s talking about having beliefs you can’t justify) and like a dynamic I recognize as having occasionally been present in the community.
Because it also mentions woo, so I think it’s talking about a broader class of unjustified beliefs than you think.
Even if this interpretation wasn’t actually the author’s intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.
I agree, but in that case you should make it clear how your interpretation differs from the author’s. If you don’t, then it looks like a motte-and-bailey is happening (where the bailey is “rationalists should be more accepting of woo & other unjustified beliefs”, and the motte is “oh no! I/they really just mean you shouldn’t completely ignore gut judgements, and occasionally models can be wrong in known ways but still useful”), or you may miss out on reasons the post-as-is doesn’t require your reformulation to be correct.
Because it also mentions woo, so I think it’s talking about a broader class of unjustified beliefs than you think.
My earlier comment mentioned that “there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world.” That was intended to touch upon “woo”; e.g. meditation used to be, and to some extent still is, considered “woo”, but there nonetheless seem to be reasonable grounds to think that there’s something of value to be found in meditation (despite there also being various crazy claims around it).
My above link mentions a few other examples (out-of-body experiences, folk traditions, “Ki” in martial arts) that have claims around them that are false if taken as the literal truth, but are still pointing to some true aspect of the world. Notably, a policy of “reject all woo things” could easily be taken to imply rejecting all such things as superstition that’s not worth looking at, thus missing out on the parts of the woo that were actually valuable.
IME, the more I look into them, the more I find that “woo” things I’d previously rejected as not worth looking at (because they seemed obviously woo and false) are actually pointing to significantly valuable things. (Even if there is also quite a lot of nonsense floating around those same topics.)
I agree, but in that case you should make it clear how your interpretation differs from the author’s.

That’s fair.