I think I should have stated more clearly that I don’t see these tendencies as abnormal. I see them as maladaptive given EA’s goals. When thinking about whether fandom is a good feature for epistemic health, I don’t care too much about whether fandom tendencies exist in other communities. I know that they’re the norm (same with hierarchy and homogeneity).
It can be quite effective to have such a community structure in situations where you want to change the minds of many people quickly. You can simply try to change the mind of the person others look up to (e.g. Toby Ord / Y. Bengio) and expect that other members will likely follow (see the models in The Misinformation Age by C. O’Connor & J. Weatherall). A process of belief formation that does not use central stars will converge less quickly, I imagine, but I’d have to look into that. This is the kind of research I hope this article makes palatable to EAs.
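The intuition that central stars speed up convergence can be sketched with a toy DeGroot-style averaging model. This is my own illustrative simulation, not one of O’Connor & Weatherall’s actual models; the network sizes, update rule, and convergence threshold are all assumptions made for the sake of the example.

```python
# Toy DeGroot-style belief averaging: compare how quickly a "star" network
# (every agent listens to one hub) reaches consensus vs. a ring network of
# the same size, where each agent only hears its two immediate neighbours.
# Purely illustrative; parameters are arbitrary assumptions.
import random

def steps_to_consensus(neighbors, n=20, eps=1e-3, seed=0):
    """Each agent repeatedly averages its own belief with its neighbours'.
    Returns the number of rounds until all beliefs lie within eps of each other."""
    rng = random.Random(seed)
    beliefs = [rng.random() for _ in range(n)]
    for step in range(10_000):
        if max(beliefs) - min(beliefs) < eps:
            return step
        beliefs = [
            sum(beliefs[j] for j in [i] + neighbors(i, n)) / (1 + len(neighbors(i, n)))
            for i in range(n)
        ]
    return None  # did not converge within the step budget

# Star: agent 0 is the hub; everyone else listens only to the hub.
star = lambda i, n: list(range(1, n)) if i == 0 else [0]
# Ring: each agent listens to its two immediate neighbours.
ring = lambda i, n: [(i - 1) % n, (i + 1) % n]

print("star:", steps_to_consensus(star), "rounds;",
      "ring:", steps_to_consensus(ring), "rounds")
```

Running this, the star topology reaches consensus in far fewer rounds than the ring, which matches the intuition above: a hub lets information (or misinformation) propagate to everyone in one hop, while in a flat network it has to diffuse slowly.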
My guess is that there is not only a sweet spot of cognitive diversity but also a sweet spot of how much a community should respect its central stars. Too much reverence and you lose feedback mechanisms. Too little and belief formation will be slow and confused, and you lose the reward mechanism of reputation. I expect there will always be individuals in any community who deserve more respect and admiration than others, because they have done more or better work on behalf of everyone else. But I would love for EAs to examine where the effective sweet spot lies and how one can influence the level of fandom culture (e.g. Will’s recent podcast episode on 80k did a good job of this, I think) so that the end result is a healthy epistemic community.
Yep, that all makes sense to me. Another thing we can do, distinct from changing the overall level of respect, is to change the norms around showing respect. For example, whenever people bring up the fact that person X believes Y, we could encourage them to instead say that person X believes Y because of Z, which makes the appeal to authority easier to argue against.
I think in community building it is a good trajectory to start with strong homogeneity and strong reference to ‘stars’ that act as reference points and communication hubs, and then to incrementally soften and expand as time passes. It is much harder, or even impossible, to do this in reverse, as that risks yielding a fuzzy community that lacks the mechanisms to attract talent and converge on anything.
With that in mind, I think some of the rigidity of EA thinking in the past might have been good, but the time has come to re-think how the EA community should evolve from here on out.