I think I strongly agree with all the ideas in this post. The post is very well written and thoughtful, and focuses on the value of non-STEM communities and on improving communication and access for these valuable people. It has a lot of thoughtful suggestions, such as reducing jargon, more intentional outreach, and intentional listening to valuable people from other backgrounds.
Random thoughts that aren’t fully related:
There is a recent case where a leader got “knocked on” for some phrasing in an EA Forum post. This person was a mid-career Harvard lawyer. It’s worth pointing out that this person is not only personally capable of writing in “EA code/speak”, but (in the alternate world where she went corporate instead of starting EA nonprofits) could have had teams of underlings writing in EA speak for her too. Instead, she herself got on the horn here to write something that she thought was valuable. (Note that there was a reason her wording got criticized; it’s unclear if this is a “defect”, and it’s unclear how to solve this.)
I could see a similar issue to the above, where extremely valuable policy makers, biologists, lawyers, or economists (who have their own shibboleths) could bounce off EA for related reasons.
I don’t know of a good way to solve this. What comes to mind quickly is deliberately creating discussion spaces for different groups, or providing some sort of “ambassadors” for these disciplines (but “ambassadors” folds into the gatekeeping and field building that senior EAs are involved in, so it’s even harder than it sounds).
My guess is that if we wanted to move to a good compromise on jargon and tone, Holden’s blog “Cold Takes” gives a good model. It takes more effort than it looks, but Holden’s lack of jargon, short articles, and expression of uncertainty seem ideal for communicating to both EA and regular folks.
I think there are deeper comments here that basically get into theories of change of EA and the heart of EA. I think the below is a stronger argument than I would normally write, but it is what came out quickly:
In short, the tools of Bayesian reasoning and STEM are often merely cultural signals or simulacra for thought. They are naively and wrongly implied to provide a lot more utility than they really have in analysis and meta thought. They don’t have that much utility because the people who are breaking trail in EA already appropriately take the underlying issues into account. They are rarely helped or hindered by a population of people fluent in running Guesstimates, control theory, or applying “EV calculations”. Instead, the crux of knowledge/decisions occurs with other information. To get a sense of this, and to tie this back to EA discourse, this seems related to at least the top paragraphs of this post (but I haven’t gone through the pretty deep examples in it).
Instead, these cultural signals provide cohesion for the movement, which is valuable. They might hinder growth too. I don’t know what the net value is. Answering this probably involves some full, deep model of EA.
But an aggressive, strong version of why we would be concerned might be that, in addition to filtering out promising young people, we might be harming acquisition of extremely valuable talent, who don’t want to read 1,800 articles about something they effectively learned in middle school or have to walk around shibboleths when communicating to EAs.
To get a sense for this:
If you have experience in academic circles, you often learn from experience that it can be really unpromising to parse thoughts from others who built their careers in alternative worldviews, and this can easily lead to bouncing off from them (e.g. a mainstream economist having dinner party talk with someone versed in heterodox, Keynesian, or Marxist thinking).
In tech, if you are on the “A team”, and someone comes in to ask you to fix/refactor code that has weird patterns or many code smells, and that someone doesn’t seem to be aware of this.
If you are someone who has built businesses and worked with execs, and you encounter consultants who seem to have pretty simplistic and didactic views of business theories (who start mansplaining “Agile” or “First Principles” to you).
In all of these cases, people who are extremely talented aren’t going to engage, because the opportunity costs are extremely high.
As a caveat, the above is a story which may be entirely wrong, and I am pretty sure that even when fully fleshed out, it is only partly true. It also seems unnecessarily disagreeable to me, but I don’t have the ability to fix it in a short amount of time, maybe because I am dumb. Note that this caveat you are reading is not a “please don’t make fun of me”—it’s genuinely saying I don’t know, and I want these ideas to be ruthlessly stomped on if they’re wrong.