Re: Bayesian thinking helping one to communicate more clearly. I agree that this is a benefit, but I don’t think it’s the fastest route or the one with the highest marginal value. For instance, when you write:
A lot of expressed beliefs are “fake beliefs”: things people say to express solidarity with some group (“America is the greatest country in the world”), to emphasize some value (“We must do this fairly”), to let the listener hear what they want to hear (“Make America great again”), or simply to sound reasonable (“we will balance costs and benefits”) or wise (“I don’t see this issue as black or white”).
I’m immediately reminded of Orwell’s essay “Politics and the English Language.” I would generally expect people to learn more about clear, truth-seeking communication from reading Orwell (and other good books on writing) than from being Bayesian. Indeed, I find many Bayesian rationalists to be highly obscurantist in practice, perhaps more so than the average similarly-educated person, and I feel that rationalist community norms tend to reward rather than punish this, because many people are drawn to deep but difficult-to-understand truths.
I would say that the value of the rationalist project so far has been in generating important hypotheses, rather than in clear communication around those hypotheses.