Native languages in the EA community (and issues with assessing promisingness)

Epistemic status: not certain— a hand-wavy ramble.

Disclaimer: all opinions here are my own, not my employer’s.

Summary

  1. Language barriers can cause people to dismiss writing or speech because of its aesthetics or communicative clumsiness rather than its content.

  2. This manifests in the EA community in a number of ways. One is that the language we speak is shaped by our STEM-leaning membership and our corresponding tendency to think quantitatively, which creates a context (a discursive context?) that is foreign to people who are less STEM-y.[1]

  3. Harm 1: Good ideas and useful knowledge from these groups get discounted as a result.

  4. Harm 2: Talented non-STEM people who are into EA get misunderstood when they try to communicate, and don’t get noticed as “promising” (and we do in fact really need these folks).

  5. Harm 3: non-STEM folks have a worse experience in the EA community.

  6. Some suggestions

Preamble

My university had a lot of international students. In class discussions and group interactions, I would often notice that the contributions of international students (those whose first language wasn’t English) tended to be dismissed faster and interrupted more frequently than others’, even when the others’ thoughts seemed less insightful. My guess was that this was due to the smoother and more familiar presentation of the native speakers’ thoughts.

This situation was not new to me; I had spent a total of around three years in school in France, experiencing something similar first-hand (as a nerdy American kid). I would frequently have to suppress a mix of anger, annoyance, and shame at being patronized or ignored because my French was halting and awkward.

The point of these stories is that not being fluent (or being less-than-native) in the language of the community with which you are conversing makes everything harder. And it can be unpleasant and discouraging.

I think a version of this happens in the EA community with people who are less STEM-y than the average. (It also happens with other underrepresented groups in EA, but I’m focusing on this one for this post.)

Harm 1: We lose out on ideas and knowledge

I have most frequently seen this phenomenon in live conversations. Less STEM-y folks’ natural speech or writing follows different norms, and they contort their thoughts to make the EA community hear them. They misuse jargon like “updating” and “outside view” in an attempt to get their point across, and their interlocutors decide that talking with them is not worth their time.[2]

More generally, I think experienced (and assimilated) members of the community tend to interpret a lack of fluency in the “language of EA” as a general lack of knowledge or skill. This, together with our tendency to miss writing that comes from outside the community,[3] leads to the community systematically deafening itself to communication by non-STEM folks.

Harm 2: EA is less actively welcoming towards non-STEM people, so we lose out on some of those people

This also affects how the EA community identifies promising people (e.g. students) who should be mentored and helped on their way to impact. My impression is that this currently often happens through connections or chance conversations in which someone’s potential gets spotted by someone well-placed in the community. I’m worried that we’re ending up with self-sustaining mechanisms by which interested and talented people who don’t speak numerically (or who misuse “outside view”) are considered less promising and are not supported in the earlier stages of their careers.

(Notice that this can happen both ways: if EA always speaks the language of the STEM-y, less STEM-y people will potentially discount the EA community and think the theories it presents are stupid. This is somewhat related to the idea of inferential distances and this blog post about “Alike minds.”)

Of course, it’s true that a nuanced quantitative model (or even a simple Fermi estimate) of some phenomenon is often helpful, and can be a reasonable signal of quality.[4] But focusing on such quantitative elements means we miss other signals: consider, for instance, the effect of illustrations or other visualizations, clarity of exposition, historical anecdotes, moving speech, etc.[5] Moreover, some aspects of the way the EA community talks are due to the community’s history rather than to the inherent usefulness of those aspects. (The archetypal EA Forum post has an epistemic status, some technical terms, and maybe a calculation, all of which are arguably useful. But it’s also got some rationalist jargon or a reference to HPMOR.) (More on jargon in this post.)

I also suspect that talented less STEM-y people tend to get fewer chances to find out about EA and get involved than talented STEM-y people do, which would exacerbate the problem if true. So I think we should try to notice if we’re unusually likely to be the only point of contact someone has with EA. In particular, if you’re talking to a math major, you’re probably not the only person who can notice that this student should probably join a group and apply to an alignment bootcamp or something. But if you’re talking to a creative writing major who seems interested, you may be the only EA for a while who will get the chance to notice that this person is brilliant and altruistic and should join a group, become a comms person, write EA-inspired fiction, or take on a research project.[6]

I’m not claiming that numerical literacy is not important for EA. I absolutely think it is. But so are other skills.[7]

I think people who are more comfortable writing than modeling, or better at managing interpersonal relations than at establishing a base rate or unraveling a technical issue (in short, people who are not very STEM-y), can significantly help the world in the way EA aims to, but are overlooked by the processes we use to notice and support promising people. In fact, all else equal, someone joining from an underrepresented field (like comparative religion) may be able to pick up more low-hanging fruit than someone coming in from a field that is typical for the community (like math or economics).

(Possible objections to this section: 1. We need to grow the community urgently, and STEM-y people are easier to reach. 2. Non-STEM people lack some skills that are fundamentally necessary for doing EA work. (As discussed, I don’t think this is the case.) 3. It’s currently too costly to fight Harm 2 for some reason.)

Harm 3: non-STEM folks have a worse experience in the EA community

I would guess that non-STEM people tend to have a worse experience in the community for reasons like the ones sketched out in the preamble above. I don’t think that their experience tends to be actively hostile, but I do think that it’s harder than it should be, and that we can improve it.

My suggestions

  1. Actively try to notice when you’re subconsciously dismissing someone because they speak a different language rather than because the content of their thinking is unhelpful.

  2. Try a bit harder to understand people whose background is different from yours and be aware of the curse of knowledge when communicating.

  3. Create opportunities for talented-but-less-quantitative junior people who are into EA.

    1. Hire copywriters! Notice potentially awesome community organizers! Fund creative outreach projects! Etc.

    2. Some existing/past projects: Humanities Ideas for Longtermists, Social science projects on AI/x-risks, the Creative Writing Contest.

  4. Promote translation of EA concepts into less STEM-heavy writing or communication. (Conversely, import good foreign-to-EA ideas, e.g. by posting summaries.)

  5. Use less jargon when possible, and help with overuse of jargon in other ways.

  6. If you’re involved with community building at a university, consider trying to reach out to non-STEM majors (and check that you aren’t accidentally excluding these people with your outreach process).

I would be very excited to hear more ideas on this front.

(Thanks to my brother and to Jonathan Michel for giving feedback on drafts of this post.)

Notes


  1. ↩︎

    Note: I use “STEM” here as a shorthand for the dominant field and culture in EA, which is related to STEM-iness, but isn’t exactly it.

  2. ↩︎

    I use the pronoun “they” for this less STEM-y group, but to a certain extent I identify with them and have made the mistakes I listed (although I was also a math major).

  3. ↩︎

    Relatedly, it would be great if people posted more summaries and collections.

  4. ↩︎

    And it’s also truly easier to talk to people who speak like us.

  5. ↩︎

    Additionally, some good ideas and concepts are simply hard to put into quantitative language, so if that’s our primary mode of signaling quality, we’ll miss out on those.

  6. ↩︎

    And even if they do get involved, it may be harder for them to identify their next steps.

  7. ↩︎

    Semi-relevant post that talks about different aptitudes people may have: Modelers and Indexers.