Among many things I agree with, here's the part I agree with most:
EAs give high credence to non-expert investigations written by their peers; they rarely publish in peer-reviewed journals and are becoming increasingly dismissive of academia
I think a fair amount of the discussion of intelligence loses its bite if “intelligence” is replaced with what I take to be its definition: “the ability to succeed at a randomly sampled task” (for some reasonable distribution over tasks). But maybe you’d say that perceptions of intelligence in the EA community are only loosely correlated with intelligence in this sense?
As for cached beliefs that people accept on faith from the writings of perceived-intelligent central figures, I can’t identify any beliefs of mine that I couldn’t defend myself (with the exception that I think many mainstream cultural norms are hard to improve on, so for a particular one, like “universities are the best institutions for producing new ideas”, I can’t necessarily defend it on the object level). But I’m pretty sure there aren’t any beliefs I hold just because a high-status EA holds them. Of course, some high-status EAs have convinced me of some positions, most notably Peter Singer. But that mechanism for belief transmission within EA, i.e. object-level persuasion, doesn’t run afoul of your concerns about echo-chamberism, I don’t think.
But maybe you’ve had a number of conversations with people who appeal to “authority” in defending certain positions, which I agree would be a little dicey.
Getting too little exposure to opposing arguments is a problem. Most arguments are informal, so they’re not necessarily even valid; and even for the ones that are, we can still doubt their premises, because there may be other sets of premises that conflict with them but are at least as plausible. If you disproportionately hear arguments from a given community, you’re more likely than otherwise to be biased toward the views of that community.
Yeah, I think the cost is mostly a lack of exposure to the right ideas, and of the affordance to think them through deeply, rather than being presented with all the object-level arguments in a balanced manner and having groupthink bias you toward a specific view.