I think there are deeper comments here that basically get into EA's theories of change and the heart of EA. I think the below is a stronger argument than I would normally write, but it is what came out quickly:
In short, the tools of Bayesian reasoning and STEM are often merely cultural signals, or simulacra, for thought. They are naively and wrongly implied to provide far more utility than they really have in analysis and meta-thought. They don't have that much utility because the people who are breaking trail in EA already take the underlying issues into account; they are rarely helped or hindered by a population of people fluent in running Guesstimates, control theory, or applying "EV calculations." Instead, the crux of knowledge and decisions lies with other information. To get a sense of this, and to tie it back to EA discourse, it seems related to at least the top paragraphs of this post (though I haven't gone through the fairly deep examples in it).
Instead, these cultural signals provide cohesion for the movement, which is valuable. They might hinder growth too. I don't know what the net value is; answering this probably requires some full, deep model of EA.
But an aggressive, strong version of why we should be concerned is that, in addition to filtering out promising young people, we might be harming the acquisition of extremely valuable talent: people who don't want to read 1,800 articles about something they effectively learned in middle school, or who don't want to walk around shibboleths when communicating with EAs.
To get a sense for this:
If you have experience in academic circles, you often learn that it can be really unpromising to parse the thoughts of people who built their careers in alternative worldviews, and this can easily lead to bouncing off of them (e.g., a mainstream economist making dinner-party conversation with someone versed in heterodox, Keynesian, or Marxist thinking).
In tech, you are on the "A team," and someone comes in asking you to fix or refactor code that has weird patterns and many code smells, and that someone doesn't seem to be aware of any of this.
You have built businesses and worked with executives, and then you encounter consultants with pretty simplistic, didactic views of business theory (who start mansplaining "Agile" or "First Principles" to you).
People who are extremely talented aren't going to engage, because the opportunity costs are extremely high.
As a caveat, the above is a story that may be entirely wrong, and I am pretty sure that even when fully fleshed out, it is only partly true. It also seems unnecessarily disagreeable to me, but I don't have the ability to fix that in a short amount of time, maybe because I am dumb. Note that this caveat is not "please don't make fun of me": it's genuinely saying I don't know, and I want these ideas to be ruthlessly stomped if they're wrong.