Hey Michael, sorry I am slightly late with my comment.
To start, I broadly agree that we should not be misleading about EA in conversation; however, my impression is that this is not a large problem (although we might have very different samples).
I am unsure where I stand on moral inclusivity/exclusivity, although as I discuss later I think this is not actually a particularly major problem, as most people do not have a set moral theory.
I am wondering what your ideal inclusive effective altruism outreach looks like?
I am finding it hard to build up a cohesive picture from your post and comments, and I think some of your different points don’t quite gel together in my head (or at least not in an inconvenient possible world).
You give an example of this, beginning a conversation with global poverty before transitioning to an explanation of the diversity of EA views:
Point out people understand this in different ways because of their philosophical beliefs about what matters: some focus on helping humans alive today, others on animals, others on trying to make sure humanity doesn’t accidentally wipe itself out, etc.
For those worried about how to ‘sell’ AI in particular, I recently heard Peter Singer give a talk in which he said something like (can’t remember exactly): “some people are very worried about the risks from artificial intelligence. As Nick Bostrom, a philosopher at the University of Oxford pointed out to me, it’s probably not a very good idea, from an evolutionary point of view, to build something smarter than ourselves.” At which point the audience chuckled. I thought it was a nice, very disarming way to make the point.
However, trying to make this match the style of an event a student group could actually run, it seems like the closest match (other than a straightforward intro to EA event) would be a talk on effective global poverty charity, followed by an addendum at the end on EA being broader. (I think this is due to a variety of practical concerns, such as there being far more good speakers and big names in global poverty, and it providing many concrete examples of how to apply EA concepts, etc.)
I am, however, skeptical that an addendum at the end of a talk would create nearly as strong an impression as the subject matter of the talk itself, and people would still leave with a much stronger impression of EA as being about global poverty than e.g. x-risk.
You might say a more diverse approach would be to have talks etc. roughly in proportion to what EAs actually believe is important. So if, to make things simple, a third of EAs thought global poverty was most important, a third x-risk, and a third animal suffering, then a third of the talks should be on global poverty, a third on x-risk, etc. Each of these could then end with the explanation of EA being broader.
However, if people’s current perception that global poverty events are the best way to get new people into EA is in fact right (at least in the short term), whether through better attendance or better conversion ratios, this approach could still lead to the majority of new EAs’ first introduction to EA being through a global poverty talk.
Due to the previous problem of the addendum not really changing people’s impressions enough, we could still end up with the situation you say we should want to avoid, where:
People should not feel surprised about what EAs value when they get more involved in the movement.
I am approaching this all more from the student group perspective, and so don’t have strong views on the website stuff, although I will note that my impression is that 80k does a good job of being inclusive, and GWWC’s issue is more a lack of updates than anything.
One thing you don’t particularly seem to be considering is that almost all people don’t actually have strongly formed moral views that conform to one of the common families (utilitarianism, virtue ethics, etc.). So I doubt (but could be wrong, as there would probably be a lot of survivorship bias in this) that a high percentage of newcomers to EA feel excluded by the implicit assumptions that might often be made, e.g. that future people matter.
Hello Alex,

Thanks for the comments. FWIW, when I was thinking inclusive I had in mind 1) the websites of EA orgs and 2) introductory pitches at (student) events, rather than the talks involved in running a student group. I have no views on student groups being inclusive in their full roster of talks, not least because I doubt the groups would cohere enough to push a particular moral theory.
I agree that lots of people don’t have strong moral views, and I think EA should be a place where they figure out what they think, rather than a place where various orgs push them substantially in one direction or another. As I stress, I think even the perception of a ‘right’ answer is bad for truth seeking. Ben Todd doesn’t seem to have responded to my comments on this, so I’m not really sure what he thinks.
And, again FWIW, survivorship bias is a concern. Anecdotally, I know a bunch of people who decided EA weirdness, particularly with reference to the far future, was what made them decide not to come back.
(Distinct comment on survivorship bias as it seems like a pretty separate topic)
I currently think good knowledge about what drives people away from EA would be valuable, although obviously fairly hard to collect; I can’t remember ever seeing a particularly large collection of reasons given.
I am unsure how much we should try to respond to some kinds of complaints, though. For people driven away by weirdness, for instance, it is not clear to me that there is much we can do to make EA more inclusive to them without losing a lot of the value of EA (pursuing arguments even if they lead to strange conclusions, etc.).
In particular, do you know of anyone who left because they only cared about e.g. global poverty and did not want to engage with the far future stuff, who you think would have stayed if EA had been presented to them as including far future stuff from the start? It seems like it might just bring forward the point at which they are put off.
Ah ok, I think I generally agree with your points then (that intro events and websites should be morally inclusive and explain, to some degree, the diversity of EA). My current impression is that this is not much of a problem at the moment. From talking to people working at EA orgs and reading the advice given to students running intro events, I think people do advocate for honesty and moral inclusiveness, and when/if it is lacking this is more due to a lack of time or honest mistakes than conscious planning.
(Although possibly we should try to dedicate much more time to it to try and ensure it is never neglected?)
In particular, I associate the whole ‘moral uncertainty’ thing pretty strongly with EA, and in particular CEA and GWWC (though this might just be due to Toby and Will’s work on it), which cuts fairly strongly against part 3 of your main post.
How much of a problem do you think this currently is? The title and tone (use of ‘plea’, etc.) of your post make me think you feel we are currently in pretty dire straits.
I also think that generally student-run talks (and not specific intro to EA events) are the way most people initially hear about EA (although I could be very wrong about this), and so the majority of the confusion about what EA is really about would not be addressed even if people fully embraced the recommendations in your post. (Although I may just be heavily biased by how the EA societies I have been involved with have worked.)