Epistemic status: Pure opinion, but based on a lot of real-world experience
Given the number of non-analytic people involved in EA, I don’t think having a quantitative mindset is a prerequisite.
I’ve known or known of many people for whom the essential concept of “expected value” read as “if you want to buy something good, buy it for a low price when you can”, which doesn’t require any major intuitive leaps from everyday life. Same for “probability”, which reads to many people as “do what has the best chance of working out” (a lot of people seem to understand this when it applies to EA issues like supporting GiveWell-type charities vs. charities with murkier missions).
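(For reference, the formal notion these intuitions approximate is just a probability-weighted average; here’s a toy comparison, with numbers invented purely for illustration:

$$\mathbb{E}[\text{value}] = \sum_i p_i \, v_i$$

A grant with a 10% chance of averting 100 cases of illness has an expected value of $0.1 \times 100 = 10$ cases, so it beats a sure-thing grant that averts 8 cases, even though it usually accomplishes nothing. The everyday intuitions above are rougher versions of this same calculation.)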
I think having intellectual preparation of the kind you mentioned can be helpful, but I also think that there are more important reasons EA seems to have such a quantitative concentration:
1. The types of EA orgs that exist in the public eye tend to have roles that lean very analytical. It’s not surprising that the average GiveWell researcher is very comfortable with quantitative thinking, but this doesn’t tell us much about the average Giving What We Can member (most of whom quietly donate a lot of money to excellent charities without rising to “public attention” among EAs).
2. People interested in EA tend to promote it more to friends than strangers, which creates a natural bubble effect; people who get involved with EA now are more likely than chance to resemble people who got involved early on (e.g. philosophers, economists, and tech folks). If you look at people who got into EA because they read about it in a mainstream news outlet, or happened to pick up Will MacAskill’s book when it was on sale somewhere, I think you’ll find a weaker quantitative skew than with people who were introduced by friends. (This is a prediction I don’t yet have any way to validate.)
Thanks for your thoughts. I wasn’t thinking about the submerged part of the EA iceberg (e.g. GWWC membership), and I do feel somewhat less confident in my initial thoughts.
Still, I wonder if you’d countenance a broader version of my initial point: that there is a way of thinking that is not itself explicitly quantitative, but that is nonetheless very common among quantitative types. I’m tempted to call this “rationality,” but it’s not obvious to me that this thinking style is as all-encompassing as what LW-ers, for example, mean when they talk about rationality.
The examples you give of commonsensical versions of expected value and probability are what I’m thinking about here: perhaps the intuitive, informal versions of these concepts are soft prerequisites. This thinking style is not restricted to the formally trained, but it is more common among them (because it’s trained into them). So in my (revised) telling, the thinking style is a prerequisite, and explicitly quantitative types are overrepresented in EA simply because they’re more likely to have been exposed to these concepts in either a formal or informal setting.
The reason I think this might be important is that I occasionally have conversations in which these concepts, in the informal sense, seem unfamiliar. “Do what has the best chance of working out” is, in my experience, a surprisingly rare way of conducting everyday business in the world, and some people seem to find it strange and new to think in that fashion. The possible takeaway is that some basic informal groundwork might need to be done to maximize the efficacy of different EA messages.
I basically agree that having intuitions similar to those I outlined seems very important and perhaps necessary for getting involved with EA. (I think you can be “interested” without those things, because EA seems shiny and impressive if you read certain things about it, but not having a sense for how you should act based on EA ideas will limit how involved you actually get.) Your explanation about exposure to related concepts almost definitely explains some of the variance you’ve spotted.
I spend a lot of my EA-centric conversations trying to frame things to people in a non-quantitative way (at least if they aren’t especially quantitative themselves).
I’m a huge fan of people doing “basic groundwork” to maximize the efficacy of EA messages. I’d be likely to fund such work if it existed and I thought the quality was reasonably high. However, I’m not aware of many active projects in this domain; ClearerThinking.org and normal marketing by GiveWell et al. are all that come to mind, plus things like big charitable matches that raise awareness of EA charities as a side effect.
Oh, and then there’s this contest, which I’m very excited about and would gladly sponsor more test subjects for if possible. Thanks for reminding me that I should write to Eric Schwitzgebel about this.