Imagine a forecasting tournament where virtually no one submitted independent predictions; instead, everyone just copied the forecasts of whoever was thought at the outset to be the best forecaster. Obviously the tournament will generate fairly useless outputs, unless the so-called “best forecaster” happens to actually be really good and dialled-in, and maybe a bit lucky as well.
Epistemic deference is just obviously parasitic, a sort of dreadful tragedy of the commons of the mind. Take a walk on the wild side. Don’t be afraid to be wrong!
I don’t think this is right. One has to defer quite a lot; what we do is, appropriately, mostly deferring, in one way or another. The world is so complicated, there’s so much information to process, and our problems are high-context (that is, require compressing and abstracting from a lot of information). Also coordination is important.
I think a blunt-force “just defer less” is therefore not viable. Instead, having a more detailed understanding of what’s undesirable about specific cases of deference opens the possibility of more specifically deferring less when it’s most undesirable, and alleviating those dangers.
I agree that “defer less” might not be viable advice for the median human, or even bad advice, but for the median EA I think it’s pretty good advice.
Deference should imo be explicitly temporary and provisional. “I will outsource to X until I can develop my own opinion” is not always a bad move, and might well be a good one in some contexts, but you do actually need to develop your own takes on the things that matter if you want to make any useful contributions to anything ever.
I agree that “defer less” is good advice for EAs, but that’s because EAs are especially deferent, and also especially care about getting things right and might actually do something sane about it. I think part of doing something sane about it is to have a detailed model of deference.