Re the new 2024 Rethink Priorities cause prioritization survey: “The EA community should defer to mainstream experts on most topics, rather than embrace contrarian views. [“Defer to experts”]” 3% strongly agree, 18% somewhat agree, 35% somewhat disagree, 15% strongly disagree.
This seems pretty bad to me, especially for a group that frames itself around intellectual humility and recognizing that we, as an intellectual movement, are (on base rates) so often wrong.
(Charitable interpretation) It’s also just the case that EAs tend to hold lots of contrarian views because they’re trying to maximize the expected value of information (often justified with something like: “usually contrarians are wrong, but when they’re right, they’re often more valuable as a source of information than the average person who just agrees”).
If this is the case, though, I fear that some of us are confusing being contrarian for instrumental reasons with being contrarian for “being correct” reasons.
Though let me know if you disagree.
I think the “most topics” thing is ambiguous. There are some topics on which mainstream experts tend to be correct and some on which they’re wrong, and although expertise is valuable on topics experts think about, they might be wrong on most topics central to EA. [1] Do we really wish we deferred to the CEO of PETA on what animal welfare interventions are best? EAs built that field in the last 15 years far beyond what “experts” knew before.
In the real world, assuming we have more than five minutes to think about a question, we shouldn’t “defer” to experts or immediately “embrace contrarian views”, but rather use their expertise and reject it when appropriate. Since this wasn’t an option in the poll, my guess is many respondents just expressed how much they like being contrarian, and since EAs often have to be contrarian on the topics they think about, the result came out in favor of contrarianism.
[1] Experts can be wrong because they don’t think in probabilities, they lack imagination, they face obvious political incentives to say one thing over another, and probably for other reasons. Also, many of the central EA questions don’t have well-developed scientific fields around them, so many of the “experts” aren’t people who have thought about similar questions in a truth-seeking way for many years.
I agree with Yarrow’s anti-‘truth-seeking’ sentiment here. That phrase seems to primarily serve as an epistemic deflection device indicating ‘someone whose views I don’t want to take seriously and don’t want to justify not taking seriously’.
I agree we shouldn’t defer to the CEO of PETA, but CEOs aren’t—often by their own admission—subject matter experts so much as people who can move stuff forwards. In my book the set of actual experts is certainly murky, but includes academics, researchers, sometimes forecasters, sometimes technical workers—sometimes CEOs but only in particular cases—anyone who’s spent several years researching the subject in question.
Sometimes, as you say, they don’t exist, but in such cases we don’t need to worry about deferring to them. When they do, it seems foolish not to upweight their views relative to our own unless we’ve done the same amount of research, or unless we have very concrete reasons to think they’re inept or systemically biased (and perhaps even then).
Yeah, while I think truth-seeking is a real thing I agree it’s often hard to judge in practice and vulnerable to being a weasel word.
Basically I have two concerns with deferring to experts. The first is that when the world lacks people with true subject matter expertise, whoever has the most prestige—maybe not CEOs, but certainly mainstream researchers on slightly related questions—will be seen as the experts, and we will need to worry about deferring to them.
Second, because EA topics are selected for being too weird or unpopular to attract mainstream attention and funding, I think a common pattern is that, of the best interventions, some are already funded, some are recommended by mainstream experts yet remain underfunded, and some are too weird for the mainstream. It’s not really possible to find the “too weird” kind without forming an inside view. We can start out deferring to experts, but by the time we’ve spent enough resources investigating a question to be at all confident in what to do, deferral to experts has been partially replaced by understanding the research ourselves, along with the load-bearing assumptions and biases of the experts. Mainstream experts will always get some weight, but it diminishes as our views start to incorporate their models rather than their conclusions. (The example that comes to mind is economists on whether AGI will create explosive growth, and how good economic models have recently been developed by EA sources, now including some economists, that vary the assumptions and justify where they differ from mainstream economists’ assumptions.)
Wish I could give more concrete examples but I’m a bit swamped at work right now.
What’s the definition of “truth-seeking”? Not your personal definition, but the pre-existing, canonical definition that’s been written down somewhere and that everyone agrees on.
Not everyone agrees on what “utilitarianism” means either, and it remains a useful word. In context, you can tell I mean someone whose attitude, methods, and incentives allow them to avoid the biases I listed, among others.
If I want to know what “utilitarianism” means, including any disagreements among scholars about the meaning of the term (I have a philosophy degree, I have studied ethics, and I don’t have the impression there are meaningful disagreements among philosophers on the definition of “utilitarianism”), I can find this information in many places, such as:
The Stanford Encyclopedia of Philosophy
Encyclopedia Britannica
Wikipedia
The book Utilitarianism: A Very Short Introduction co-authored by Peter Singer and published by Oxford University Press
A textbook like Normative Ethics or an anthology like Ethical Theory
Philosophy journals
An academic philosophy podcast like Philosophy Bites
Academic lectures on YouTube and Crash Course (a high-quality educational resource)
So, it’s easy for me to find out what “utilitarianism” means. There is no shortage of information about that.
Where do I go to find out what “truth-seeking” means? Even if some people disagree on the definition, can I go somewhere and read about, say, the top 3 most popular definitions of the term and why people prefer one definition over the other?
It seems like an important word. I notice people keep using it. So, what does it mean? Where has it been defined? Is there a source you can cite that attempts to define it?
I have tried to find a definition for “truth-seeking” before, more than once. I’ve asked what the definition is before, more than once. I don’t know if there is a definition. I don’t know if the term means anything definite and specific. I imagine it probably doesn’t have a clear definition or meaning, and that different people who say “truth-seeking” mean different things when they say it — and so people are largely talking past each other when they use this term.
Incidentally, I think what I just said about “truth-seeking” probably also largely applies to “epistemics”. I suspect “epistemics” probably either means epistemic practices or epistemology, but it’s not clear, and there is evidently some confusion on its intended meaning. Looking at the actual use of “epistemics”, I’m not sure different people mean the same thing by it.