Comment mostly copied from Facebook:
I think most will agree that it’s not advisable to simply try to persuade as many people as possible. That said, given the widespread recognition that poor or inept messaging can put people off EA ideas, the question of persuasion doesn’t seem to be one that we can entirely put aside.
A couple of questions (among others) are relevant to how far we should merely offer rather than try to persuade: how many people we think will be (genuinely) interested in EA initially, and how many we think would be potentially (genuinely) interested in EA were it suitably presented.
A very pessimistic view across these questions is that very few people are inclined to be interested in EA initially and very few would be interested even after persuasion (e.g. because EA is a weird idea, compelling only to a minority who are weird on a number of dimensions, and most people are highly averse to its core demands). On this view, offering without trying to persuade seems appealing: few will be interested, persuasion won’t help, and all you can do is hope that some of the well-inclined minority hear your message.
If you think very few will be initially inclined but (relatively) many more would be inclined with suitable persuasion (e.g. because EA ideas are very counter-intuitive and inclined to sound off-putting, but can be appealing if framed adroitly), then the opposite conclusion follows: persuasion looks high value (indeed a necessity).
Conversely, if you are more optimistic (many people intuitively like EA: it’s just “doing the most good you can do + good evidence!”), then persuasion looks less important (unless you also think persuasion can bring many additional gains even above the already-high baseline of EA acceptance).
-
Another big distinction, which I assume is perhaps motivating the “offer, don’t persuade” prescription, is whether people think persuasion tends to influence the quality of counterfactual recruits negatively, neutrally, or positively. The negative view might be motivated by thinking that persuading people who wouldn’t otherwise have liked EA’s offer (especially via dubious representations of EA) will disproportionately bring in people who don’t really accept EA. The neutral view might be motivated by positing that many people are turned off (or attracted to) EA by considerations orthogonal to actual EA content (e.g. nuances of framing, or whether they instinctively, non-rationally like/dislike things EA happens to be associated with, such as sci-fi). The positive view might be motivated by thinking that certain groups are disproportionately turned off by unpersuasive messages (e.g. women and minorities do not find EA attractive as presented, but would with more carefully crafted, symbolically non-off-putting outreach), and that attracting more of these groups would be epistemically salutary for some reason.
-
Another major consideration is simply how many EAs we presently have relative to desired numbers. If we think we have plenty (or even more than we can train/onboard), then working to persuade/attract more people seems unappealing; if we highly value having more people, the converse holds. I think it’s very reasonable to switch priorities between attracting more people and not doing so, depending on present needs. I’m somewhat concerned, though, that perceived present needs get reflected in a kind of ‘folk EA wisdom’: when we lack(ed) people, the general idea that movement building is many times more effective than most direct work was popularised, whereas now that we have more people (for certain needs), the general idea that ‘quality trumps quantity’ gets popularised. I worry that these very general memes aren’t especially sensitive to actual supply/demand/needs and would be hard/slow to update if needs changed. This also becomes very tricky when different groups have different needs/shortages.