I find writing pretty hard and I imagine it was quite a task to compile all of these thoughts, thanks for doing that.
I only read the very first section (on epistemic health), but I found it pretty confusing. I did try to find explanations in the rest of the epistemics section.
EA’s focus on epistemics is almost exclusively directed towards individualistic issues like minimising the impact of cognitive biases and cultivating a Scout Mindset. The movement strongly emphasises intelligence, both in general and especially that of particular “thought-leaders”. An epistemically healthy community seems to be created by acquiring maximally-rational, intelligent, and knowledgeable individuals, with social considerations given second place. Unfortunately, the science does not bear this out. The quality of an epistemic community does not boil down to the de-biasing and training of individuals;[3] more important factors appear to be the community’s composition, its socio-economic structure, and its cultural norms.[4]
The footnotes and sources that you linked to don’t give me much evidence to update towards your position, and saying “the science says x” (at least to me) implies that there is some kind of consensus view within the literature, which I think you should be able to point to. This reads more like your hot takes than something you have thought about deeply. Superforecasting (footnote 4) does talk a bit about prediction markets, but much more of the book is focused on how a few people with certain traits can beat most people at forecasting, which I think runs counter to the point you are making, so it seems misleading to me to link to it as if it supports your view.
I think it can be fine to give hot takes, but the general vibe of the post felt like it was trying to persuade me rather than explain your view. Things that might have helped: focusing on a smaller set of points and making a more rigorous case for them, or communicating that you are not very confident in many of the key points, if that is the case. I also felt like you were claiming that the ‘science’ supports your view, which, based on the sources you linked to, is really hard to verify.
I don’t think everything you wrote was clearly incorrect, but in my view you made strong claims without demonstrating appropriate epistemic rigour.
Hi Caleb,

Our two main references are the Yang & Sandberg paper and the Critchlow book, both of which serve as accessible summaries of the collective intelligence literature.
They’re linked just a little after the paragraph you quoted.
I think my issues with this response, and with linking to that paper, are better explained by this post from SSC (“Beware the Man of One Study”). To be clear, I think we can learn things from the sources you linked; my issue is with the (IMO) overconfidence and the claims about what “the science” says.