I’ll defend the declining-epistemics point of view, even though I don’t think things are quite as bad as some others do.
I recommend first reading about the Eternal September. The basic effect is that when new people join a movement, it takes time for them to get up to speed, and if people join at too fast a rate this acculturation process breaks down. This isn’t necessarily about a lack of intelligence; it’s that a movement needs enough experienced members per newcomer to help new members understand why things work the way that they do.
When a movement grows too fast, this cultural knowledge easily dissipates, and the values of the movement change, not because people deeply understood the old values and consciously decided that different ones would be better, but because newcomers import their assumptions from broader society.
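To make the dynamic concrete, here’s a toy simulation. This is my own sketch, not anything from the Eternal September discussion, and all the numbers are made up: assume each experienced member can mentor at most a couple of newcomers per period, and watch what happens to the fraction of never-acculturated members as the growth rate rises past that mentoring capacity.

```python
# Toy model (illustrative assumptions only): each veteran can mentor at
# most `capacity` newcomers per period. Mentored newcomers become
# veterans in the next period; the rest stay unacculturated and keep the
# defaults they imported from outside the movement.

def simulate(growth_rate, periods=20, capacity=2.0, veterans=100.0):
    unacculturated = 0.0
    for _ in range(periods):
        newcomers = growth_rate * (veterans + unacculturated)
        mentored = min(newcomers, capacity * veterans)
        veterans += mentored
        unacculturated += newcomers - mentored
    return unacculturated / (veterans + unacculturated)

for rate in (0.5, 1.5, 2.5, 3.5):
    print(f"growth {rate:.0%}/period -> "
          f"{simulate(rate):.0%} of members never acculturated")
```

Below the mentoring capacity, everyone gets acculturated no matter how long growth continues; above it, the unacculturated fraction compounds toward 100%. The threshold behaviour, not the specific numbers, is the point.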
Now, some people think it would be arrogant for a group of people to believe they have better access to the truth than society on average. On the other hand, almost no one denies that there are groups with worse access to the truth, or that we have some idea of which groups these are. And the hypothesis that we can identify some groups with worse access to the truth, but can’t identify any groups with better access, would be an extremely strange hypothesis to entertain, the kind of hypothesis that is generally only produced by social processes.
Once we’ve accepted that it is possible to identify at least some groups with better epistemics than average, it becomes reasonable to suggest that EA could be one of those groups. Indeed, if someone didn’t think EA was one of those groups, I’d wonder why they decided to join EA rather than something else.
So once you accept that your group likely has above-average epistemics, it follows quite quickly that you don’t want it regressing to the societal mean. And this is a major challenge, because social forces naturally push a movement toward that mean, so maintaining the current quality requires constantly pushing back against these entropic forces.
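The same point in miniature, again as my own assumption-laden sketch with arbitrary units: if a group’s average epistemic quality drifts some fixed fraction of the way toward the societal mean each period, then with zero maintenance effort it decays geometrically, while a constant counter-push can hold it in place.

```python
# Toy illustration (my own sketch): quality q loses a fixed fraction of
# its gap to the societal mean each period; `maintenance` is the active
# counter-push. Units and parameter values are arbitrary.

SOCIETAL_MEAN = 0.0   # baseline quality
DRIFT = 0.15          # fraction of the gap lost to social pressure per period

def trajectory(q0=1.0, maintenance=0.0, periods=10):
    q = q0
    history = [q]
    for _ in range(periods):
        q += -DRIFT * (q - SOCIETAL_MEAN) + maintenance
        history.append(q)
    return history

print("no effort:  ", [round(q, 2) for q in trajectory()])
print("with effort:", [round(q, 2) for q in trajectory(maintenance=0.15)])
```

With no effort, quality halves in a few periods; with effort exactly matching the drift, it holds steady. Standing still costs constant work, which is the “entropic forces” claim in numbers.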
I’d challenge you to reverse your analysis and consider the “smuggled assumptions” in the “epistemic decline isn’t an issue” hypothesis. As an example, one assumption might be: “We mostly can’t tell who has high-quality epistemics.” I would push back against this by pointing out, as I did above, that people are pretty good at agreeing that certain groups have low-quality epistemics (flat-earthers, creationists, lizard-person conspiracy theorists), so it would be strange if we had no idea which groups have high-quality epistemics, particularly since you can get pretty far toward high-quality epistemics simply by not doing things that clearly degrade your epistemics.
I think that whichever group within EA values these epistemic norms should stop complaining and instead work on exhibiting those traits, explaining clearly why they are useful and how to build them, and proactively pushing for the community to build them. That will be hard, but the alternative, being insular and insulting, will continue to undermine their goals. That alternative might be narrowly epistemically virtuous, but it is counterproductive if we have goals beyond pure epistemology. And EA, as distinct from LessWrong, is about maximizing the good in an impartial welfarist sense, not the art of human rationality.
I guess that sounds a lot like suggesting that people who value epistemics should just surrender the public conversation, which is essentially the same as surrendering the direction that EA takes?
I think that epistemics will ground out in terms of more impact, but explaining this would take a bit of work, so I’ll pass for today, since I already spent too long on my comment above. However, feel free to ping me in a few days if you’d like me to write something up.
I guess that sounds a lot like suggesting that people who value epistemics should just surrender the public conversation, which is essentially the same as surrendering the direction that EA takes?
That’s not what I was saying, and I don’t really understand where you got that idea from. I was saying that the people who value epistemics need to actually do the work and push for investment by EA into community epistemics. In other words, they should “stop complaining and instead work on exhibiting those traits, explaining clearly why they are useful and how to build them, and proactively pushing for the community to build them.” That’s what they should do, instead of complaining and thinking that it will help, when what it actually does is hurt the community while failing to improve epistemic norms.
Perhaps people already are exhibiting those traits =P? It’s not like it would necessarily be super legible if they were.
And it’s hard to make progress on a problem if you want to hide that it exists.