The thing I agree with most is the idea that EA is too insular, and that we focus on value alignment too much (compared with excellence). More generally, networking with people outside EA has positive externalities (engaging more people with the movement) whereas networking with people inside EA is more likely to help you personally (since that allows you to get more of EA’s resources). So the former is likely undervalued.
I think the “revered for their intellect” thing is evidence of a genuine problem in EA, namely that we pay more attention to intelligence than we should, compared with achievements. However, the mere fact of having very highly-respected individuals doesn’t seem unusual; e.g. in other fields that I’ve been in (machine learning, philosophy) pioneers are treated with awe, and there are plenty of memes about them.
“Members write articles about him in apparent awe and possibly jest”

Definitely jest.
I think I should have stated more clearly that I don’t see these tendencies as abnormal; I see them as maladaptive given the goals EA has. When thinking about whether fandom is a good feature for epistemic health, I don’t care too much about whether fandom tendencies exist in other communities. I know that they’re the norm (the same goes for hierarchy and homogeneity).
It can be quite effective to have such a community structure in situations where you want to change the minds of many people quickly. You can simply try to change the mind of the person others look up to (e.g. Toby Ord or Y. Bengio) and expect that other members will likely follow (see the models in The Misinformation Age by C. O’Connor & J. Weatherall). I imagine a process of belief formation that does not route through central stars will converge less quickly, but I’d have to look into that. This is the kind of research which I hope this article makes palatable to EAs.
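To make the convergence intuition concrete: this isn’t the bandit-based model O’Connor & Weatherall actually analyse, just a minimal DeGroot-style averaging sketch (the network shapes, weights, and parameters are all my own illustrative choices) comparing how fast beliefs converge when everyone listens to one central star versus when agents only listen to their neighbours.

```python
import numpy as np

def degroot_convergence_time(W, beliefs, tol=1e-3, max_steps=20_000):
    """Iterate x <- W @ x (DeGroot averaging) and return the number of
    steps until all agents' beliefs agree to within tol."""
    x = beliefs.copy()
    for step in range(max_steps):
        if x.max() - x.min() < tol:
            return step
        x = W @ x
    return max_steps

rng = np.random.default_rng(0)
n = 50
beliefs = rng.uniform(0, 1, n)  # initial credences in some claim

# Star network: each agent keeps half their weight on themselves and
# puts the other half on agent 0 (the "central star"); the star spreads
# its own listening weight evenly over everyone else.
star = 0.5 * np.eye(n)
star[1:, 0] = 0.5
star[0, 1:] = 0.5 / (n - 1)

# Ring network: each agent averages equally with themselves and their
# two neighbours; there is no central hub.
ring = np.zeros((n, n))
for i in range(n):
    ring[i, i] = ring[i, (i - 1) % n] = ring[i, (i + 1) % n] = 1 / 3

print("star network:", degroot_convergence_time(star, beliefs), "steps")
print("ring network:", degroot_convergence_time(ring, beliefs), "steps")
```

On these (arbitrary) settings the star network should reach near-consensus in a few dozen steps while the ring needs on the order of a thousand. The trade-off: the star’s consensus is pulled heavily toward agent 0’s starting belief, which is exactly the lost-feedback worry in the next paragraph.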
My guess is there is not only a sweet spot of cognitive diversity but also a sweet spot for how much a community should respect its central stars. Too much reverence and you lose feedback mechanisms; too little and belief formation will be slow and confused, and you lose the reward mechanism of reputation. I expect there will always be individuals who deserve more respect and admiration than others in any community, because they have done more or better work on behalf of everyone else. But I would love for EAs to examine where the effective sweet spot lies and how one can influence the level of fandom culture (e.g. Will’s recent podcast episode on 80k did a good job of this, I think) so that the end result is a healthy epistemic community.
Yepp, that all makes sense to me. Another thing we can do, that’s distinct from changing the overall level of respect, is changing the norms around showing respect. For example, whenever people bring up the fact that person X believes Y, we could encourage them to instead say that person X believes Y because of Z, which makes the appeal to authority easier to argue against.
I think in community building it is a good trajectory to start with strong homogeneity and strong reference to ‘stars’ who act as reference points and communication hubs, and then to incrementally soften and expand as time passes. It is much harder, or even impossible, to do this in reverse, as that risks yielding a fuzzy community that lacks the mechanisms to attract talent and converge on anything.
With that in mind, I think some of the rigidity of EA thinking in the past might have been good, but the time has come to re-think how the EA community should evolve from here on out.