I’m pretty confused here. On the one hand, I think it’s probably good to have less epistemic deference and more independent thinking in EA. On the other, if I take your statements literally and extend them, I think they draw the boundaries of “religious” way too broadly, in mostly unhelpful ways.
> As an example, I’d point to belief in prediction markets as an EA idea that tends towards the religious. Prediction markets may well be a beneficial innovation, but I personally don’t think we have good evidence one way or the other yet. But due to the idea’s connection to rationality and EA community leaders, it has gained many adherents who probably haven’t closely evaluated the supporting data. Again, maybe the idea is correct and this is a good thing. But I think it would be better if EA had fewer of these canonized insider signals, because they make reevaluation of the ideas difficult.
I think people who study forecasting are usually aware of the potential limitations of prediction markets. See e.g. here, here, and here. And to the extent they aren’t, this is because “more research is needed”, not because of an unhealthy deference to authority.
People who don’t study forecasting may well overestimate the value of prediction markets, and some of this might be due to deference. But I don’t know, some amount of deference seems unavoidable as part of a healthy collective epistemic process, and categorizing it as “tends towards the religious” stretches the definition of “religious” way too far.
Analogously, many non-EAs also believe that a) handwashing stops COVID-19, and b) the Earth orbits the Sun. In both cases, the epistemic process probably looks much more like some combination of “people I respect believe this”, “this seems to make sense”, and “the authorities believe this” rather than a deep, principled understanding of the science. And this just seems... broadly not-religious to me? Of course, the main salient difference between a) and b) is that one of them is probably false. But I don’t think it’d be appropriate to frame “having a mistaken belief because the apparent scientific consensus is incorrect” as “religious”.