I agree with Yarrow’s anti-‘truth-seeking’ sentiment here. That phrase seems to primarily serve as an epistemic deflection device indicating ‘someone whose views I don’t want to take seriously and don’t want to justify not taking seriously’.
I agree we shouldn’t defer to the CEO of PETA, but CEOs aren’t, often by their own admission, subject matter experts so much as people who can move stuff forwards. In my book the set of actual experts is certainly murky, but it includes academics, researchers, sometimes forecasters, sometimes technical workers, and sometimes CEOs in particular cases: in short, anyone who’s spent several years researching the subject in question.
Sometimes, as you say, such experts don’t exist, but in those cases we don’t need to worry about deferring to them. When they do exist, it seems foolish not to upweight their views relative to our own, unless we’ve put in comparable work ourselves, or unless we have very concrete reasons to think they’re inept or systematically biased (and perhaps even then).
Yeah, while I think truth-seeking is a real thing, I agree it’s often hard to judge in practice and vulnerable to being used as a weasel word.
Basically I have two concerns with deferring to experts. The first is that when the world lacks people with true subject matter expertise, whoever has the most prestige (maybe not CEOs, but certainly mainstream researchers on slightly related questions) will be treated as the experts, and we will still need to worry about deferring to them.
Second, because EA topics are selected for being too weird or unpopular to attract mainstream attention and funding, I think a common pattern is that, of the best interventions, some are already funded, some are recommended by mainstream experts yet remain underfunded, and some are too weird for the mainstream. It’s not really possible to find the “too weird” kind without forming an inside view. We can start out deferring to experts, but by the time we’ve spent enough resources investigating the question to be at all confident in what to do, deferral to experts has been partially replaced by understanding the research ourselves, along with the load-bearing assumptions and biases of the experts. The mainstream experts will always get some weight, but it diminishes as our views start to incorporate their models rather than their conclusions. (The example that comes to mind is economists on whether AGI will create explosive growth: good economic models have recently been developed by EA sources, now including some economists, that vary the assumptions and justify where they depart from the mainstream economists’ assumptions.)
Wish I could give more concrete examples but I’m a bit swamped at work right now.