Thanks for your thoughts. I wasn’t thinking about the submerged part of the EA iceberg (e.g. GWWC membership), and I do feel somewhat less confident in my initial thoughts.
Still, I wonder if you’d countenance a broader version of my initial point: that there is a way of thinking that is not itself explicitly quantitative, but that is nonetheless very common among quantitative types. I’m tempted to call this ‘rationality,’ but it’s not obvious to me that this thinking style is as all-encompassing as what LW-ers, for example, mean when they talk about rationality.
The examples you give of commonsensical versions of expected value and probability are what I’m thinking about here: perhaps the intuitive, informal versions of these concepts are soft prerequisites. This thinking style is not restricted to the formally trained, but it is more common among them, because it’s trained into them. So in my (revised) telling, the thinking style is a prerequisite, and explicitly quantitative types are overrepresented in EA simply because they’re more likely to have been exposed to these concepts in either a formal or informal setting.
The reason I think this might be important is that I occasionally have conversations in which these concepts, even in the informal sense, seem unfamiliar. “Do what has the best chance of working out” is, in my experience, a surprisingly rare way of conducting everyday business in the world, and some people seem to find it strange and new to think in that fashion. The possible takeaway is that some basic informal groundwork might need to be done to maximize the efficacy of different EA messages.
I basically agree that having intuitions similar to those I outlined seems very important and perhaps necessary for getting involved with EA. (I think you can be “interested” without those things, because EA seems shiny and impressive if you read certain things about it, but not having a sense for how you should act based on EA ideas will limit how involved you actually get.) Your explanation about exposure to related concepts almost definitely explains some of the variance you’ve spotted.
I spend a lot of my EA-centric conversations trying to frame things to people in a non-quantitative way (at least if they aren’t especially quantitative themselves).
I’m a huge fan of people doing “basic groundwork” to maximize the efficacy of EA messages. I’d be likely to fund such work if it existed and I thought the quality was reasonably high. However, I’m not aware of many active projects in this domain; ClearerThinking.org and normal marketing by GiveWell et al. are all that come to mind, plus things like big charitable matches that raise awareness of EA charities as a side effect.
Oh, and then there’s this contest, which I’m very excited about and would gladly sponsor more test subjects for if possible. Thanks for reminding me that I should write to Eric Schwitzgebel about this.