I did read it, and I agree it improves the tone of your post (helpfully reduces the strength of its claim). My criticism is partly optical, but I do think you should write what you sincerely think: perhaps not every single thing you think (that’s a tall order alas in our society: “I say 80% of what I think, a hell of a lot more than any politician I know”—Gore Vidal), but sincerely on topics you do choose to opine on.
The main thrusts of my criticism are:
Because of the optical risk, and also just generally because criticizing others merits care, you should have clarified (and still can) which of the significantly different meanings I listed (or others) of “they are not seeking truth” you intended.
If you believe one of the stronger forms, e.g., “before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible,” then I strongly disagree, and I think this is worth discussing further for both optical and substantive reasons. We would probably get lost in definitional hairsplitting at some point, but I believe many, many people (activists, volunteers, missionaries, scientists, philanthropists, community leaders, …) for at least hundreds of years have been both trying hard to make the world a better place and trying hard to be guided by an accurate understanding of reality while doing so. We can certainly argue that any one of them got a lot wrong, but that’s about execution, not intent.
This is, again, partly optical and partly substantive: but it’s worth realizing that to a lot of people who predate EA, or who have read a lot about the world pre-EA, the quoted claim above is just laughable. I care about EA, but I see it as a refinement, a sort of technical advance. Not an amazing invention.
I tried answering your question on the object level a few times but I notice myself either trying to be reconciliatory or defensive, and I don’t think I will endorse either response upon reflection.
All right. Well, I know you’re a good guy, just keep this stuff in mind.
Out of curiosity I ran the following question by our local EA NYC group’s Slack channel and got the following six responses. In hindsight I wish I’d given your wording, not mine, but oh well, maybe it’s better that way. Even if we just reasonably disagree at the object level, these responses are worth considering in terms of optics. And this was an EA crowd; we can only guess how the public would react.
Jacob: what do y’all think about the following claim: “before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible”
Jacob: all takes welcome!
A: I think it’s false 😛 as a lot of people are interested in the truth and trying hard to make the world a better place
B: also think it’s false; wasn’t this basically the premise of the enlightenment?
B: Thinking e.g. legal reforms esp. french revolution and prussian state, mexican cientificos, who were comteans
B: might steelman this by specifying the entire world i.e. a globalist outlook
B: even then, modernist projects c. 1920 onwards seemed to have a pretty strong alliance between proper reasoning on best evidence and genuine charitable impulses, even where ineffective or counterproductive
B: and, of course, before all the shit and social dynamics e.g. lysenkoism, marxism had a reasonably good claim at being scientific and materialist in its revolutionary aims
C: I find it plausible that one can be very concerned about what is true without being very good finding out the truth according to rationalists’ standards. Science and philosophy are hard! (And, in some cases, rationalists probably just have weird standards.)
D: Disagree. Analogy: before evidence-based medicine, physicians were still concerned with what was true and trying to make the world a better place (through medical practice). They just had terrible methodology (e.g., theorizing that led to humors and leeches).
D: Likewise, I think EA is a step up in methodology, but it’s not original in its simultaneous concern for welfare and truth.
E: Sounds crazy hubristic..
F: I think this isn’t right, but not necessarily because I think the intersection is all that common, it might be, I don’t know, but more because EA is small enough that its existence doesn’t provide much evidence of a large change in the number of people in this intersection. It could be a bunch them just talk to each other more now
I can see Jacob’s perspective and how Linch’s statement is very strong. For example, in development econ, at just one or two top schools, the set of professors and their post-docs/staff might be larger and more impressive than the entire staff of Rethink Priorities and Open Phil combined. That is very, very far from PlayPumps. So saying that they are not truth-seeking seems at least somewhat questionable.
At the same time, from another perspective I find reasonable, I can see how academic work can be swayed by incentives and trends and become arcane and wasteful. Separately, the phrasing Linch used originally reduces the aggressive/pejorative tone for me, certainly viewed through “LessWrong”-style culture/norms. I understand and have no trouble with this statement, especially since it seems to be a personal avowal:
I’m probably not phrasing this well, but to give a sense of my priors: I guess my impression is that my interactions with approximately every entity that perceives themself as directly doing good outside of EA* is that they are not seeking truth, and this systematically corrupts them in important ways.
Again, I think there are two different perspectives here, and a reasonable person could take up both or either.
I think a crux is the personal meaning of the statement being made.
Unfortunately, in the last response I’m replying to, it now comes off as if Jacob is pursuing a point. This is less useful. For example, looking at his responses, it seems like people are just responding to “EA is much more truth-seeking than everyone else”, which generates responses like “Sounds crazy hubristic..”.
Instead, I think Jacob could have ended the discussion at Linch’s comment here, or maybe asked for models and examples to get a “gears-level” sense of Linch’s beliefs (e.g., what’s wrong with development econ, can you explain?).
I don’t think impressing everyone into a rigid scout mentality is required, but it would have been useful here.
“Y” is a strictly stronger claim than “If X, then Y”, but many people get more emotional with “If X, then Y.”
Consider “Most people around 2000 years ago had a lot of superstitions and usually believed wrong things” vs “Before Jesus Christ, people had a lot of superstitions and usually believed wrong things.”
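To spell out the sense in which “Y” is stronger (a minimal sketch, reading “If X, then Y” as the material conditional, i.e. “not-X or Y”): probability is monotone under entailment, so any coherent credence function satisfies

$$P(Y) \;\le\; P(\neg X \lor Y) \;=\; P(\text{If } X\text{, then } Y),$$

because every possible world in which Y holds is also one in which “not-X or Y” holds.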
In hindsight I wish I’d given your wording, not mine, but oh well
Well, my point wasn’t to prove you wrong. It was to see what people thought about a strong version of what you wrote: I couldn’t tell if that version was what you meant, which is why I asked for clarification. Larks seemed to think that version was plausible anyway.
I probably shouldn’t resurrect this thread. But I was reminded of it by yet another egregious example of bad reasoning in an EA-adjacent industry (maybe made by EAs. I’m not sure). So I’m going to have one last go.
To be clear, my issue with your phrasing isn’t that you used a stronger version of what I wrote, it’s that you used a weaker version of what I wrote, phrased in a misleading way that’s quite manipulative. Consider the following propositions:
A. “political partisans in the US are often irrational and believe false things” vs B. “Democrats are often irrational and believe false things.”
I claim that A is a strictly stronger claim than B (in the sense that an ideal Bayesian reasoner will assign lower probability to A than to B), but unless it’s said in an epistemically healthy and socially safe context, B will get people much more angry in non-truth-seeking ways than A.
B is similar to using a phrasing like:
before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible
instead of a more neutral (A-like)
the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, is negligible
Note again that the less emotional phrasing is actually a strictly stronger claim than the more emotional one.
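The underlying principle here is just monotonicity of probability (a standard sketch, not a quote from anyone in the thread): if one claim entails another, the worlds where the first holds are a subset of the worlds where the second holds, so

$$A \models B \;\Rightarrow\; \{w : w \models A\} \subseteq \{w : w \models B\} \;\Rightarrow\; P(A) \le P(B).$$

On the natural reading where the general claim about US political partisans covers Democrats in particular, A entails B, which is the sense in which the less emotional phrasing is the stronger claim.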
Similarly, your initial question:
Do you mean:
a) They don’t make truth-seeking as high a priority as they should (relative to, say, hands-on work for change)?
b) They try to understand what’s true, but their feeble non-EA efforts go nowhere?
c) They make zero effort to seek the truth? (“Not seeking truth”)
d) They don’t care in the slightest what the truth is?
was very clearly optimized to push me toward answering “oh no, I just meant a)” (unwritten: since that’s the socially safest thing to answer). Maybe this was unintentional, but this is how it came across to me.
A better person than me would have been able to answer you accurately and directly despite that initial framing, but alas I was/am not mature enough.
(I’m not optimistic that this will update you since I’m basically saying the same thing 3 times, but occasionally this has worked in the past. I do appreciate your attempts to defuse the situation at a personal level. Also I think it bears mentioning that I don’t think this argument is particularly important, and I don’t really think less of you or your work because of it; I like barely know you).