(Meta: I am afraid that I am strawmanning your position because I do not understand it correctly, so please let me know if that is the case.)
> Personally, I am a pretty strong believer that the unique thinking style of effective altruism has been essential for its success so far, and that this thinking style is very closely related to certain skills & virtues common in STEM fields. So I am skeptical that there is much substance behind claims #1 or #2 in general.
I agree with you that it seems plausible that the unique thinking style of EA has been essential to a lot of EA's successes so far, and that this style is closely related to skills common in STEM fields.
> 1. The “core” thinking tools of EA need to be improved by an infusion of humanities-ish thinking. Right now, the thinking style of EA is on the whole too STEM-ish, and this impairment is preventing EA from achieving its fundamental mission of doing the most good.
But it is unclear to me why this should imply that #1 is wrong. EA wants to achieve the massive goal of doing the most good, which makes it very important to build a highly accurate map of the territory we are operating in. Given that, it is a very strong claim to be confident that the “core” thinking tools we have used so far are the best we could be using, and that we do not need to look at the tools other fields use before concluding that ours are actually the best. This is especially true since a whole range of academic disciplines is barely represented in EA. Most EA ideas and thinking tools come from Western analytic philosophy and STEM research. That does not mean they are wrong (they could all turn out to be correct), but they encompass only a small portion of all the knowledge out there. I dare you to chat with a philosopher who researches non-Western epistemology; your mind will be blown by how different it is.
More generally: the fact that it is sometimes hard to understand people from very different fields is exactly why it is so incredibly important and valuable to try to get those people into EA. They usually view the world through a very different lens and can check whether they see aspects of the territory that we miss and should incorporate into EA.
I am afraid that we are so confident in the tools we already have that we do not spend enough time trying to understand how other fields think, and that we therefore miss out on an important part of reality.
To be clear: I think that a big chunk of what makes EA special is related to STEM-style reasoning, and we should probably try hard to hold on to it.
> 2. The “core” thinking tools of EA are great and don’t need to change, but STEM style is only weakly correlated with those core thinking tools. We’re letting great potential EAs slip through the cracks because we’re stereotyping too hard on an easily-observed surface variable, thus getting lots of false positives and false negatives when we try to detect who really has the potential to be great at the “core” skills. STEM style is more like an incidental cultural difference than a reliable indicator of “core” EA mindset.
Small thing: It is unclear to me whether we actually get a lot of false positives, and if I understand the post correctly, that was not its claim either.