De-emphasizing cause neutrality could (my guess is it probably would) reduce the long-term impact of the movement substantially. Trying to answer the question "How do we do the most good?" without attempting to be neutral between causes we are passionate about and causes we don't (intuitively) care much about would bias us towards causes and paths that are interesting to us, rather than towards particularly impactful ones. Personal fit and being passionate about what you do are absolutely important, but when we're comparing causes and actions/careers in terms of impact (or ITN), our answer shouldn't depend on our personal interests and passions. When we're taking action based on those answers, though, we should think about personal fit and passion, since these prevent us from being miserable while we're pursuing impact. Cause neutrality should also nudge people against associating EA with a singular cause like AI Safety, global development, or even 80k careers; I think extreme cause neutrality is a solution to the problem you describe, rather than the root of it.

De-emphasizing cause neutrality would increase the likelihood of EA becoming mainstream and popular, but it would also undermine our focus and emphasis on impartiality and good epistemics, which were, and still are, vital to EA's ability to identify so many high-impact problems and tackle them effectively, imho.
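To make the separation I'm arguing for concrete, here's a toy sketch in Python (every number is invented, and the multiplicative I×T×N score is just one rough heuristic, not a claim about how any EA org actually scores causes): the cause comparison contains no personal-fit term, and fit only enters afterwards, when deciding where you personally should work.

```python
# Toy model of "cause-neutral comparison first, personal fit second".
# All scores and fit values below are made up for illustration.

causes = {
    "AI safety":          {"importance": 9, "tractability": 4, "neglectedness": 7},
    "biosecurity":        {"importance": 8, "tractability": 5, "neglectedness": 6},
    "global development": {"importance": 7, "tractability": 8, "neglectedness": 3},
}

def itn_score(c):
    # Step 1: cause-neutral comparison. Deliberately no personal-fit term here.
    return c["importance"] * c["tractability"] * c["neglectedness"]

ranked = sorted(causes, key=lambda name: itn_score(causes[name]), reverse=True)
print("Neutral cause ranking:", ranked)

# Step 2: personal fit (0-1, also invented) modulates which role *you* take,
# without ever feeding back into the cause ranking above.
fit = {"AI safety": 0.3, "biosecurity": 0.5, "global development": 0.9}
best_role = max(ranked, key=lambda name: itn_score(causes[name]) * fit[name])
print("Where to work, after factoring in fit:", best_role)
```

The point is only the separation of steps: changing the `fit` values can change `best_role`, but it can never change `ranked`.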
I think this is actually a good example of the dynamic the author is pointing at.
While some people simply care about doing the most good, others care about doing the most good in a particular area, and barring the second kind of person from EA is, in my opinion, both suboptimal and not conducive to learning. More importantly, assuming cause neutrality in this fashion is precisely one of the differences between EA as a question and EA as an ideology.
- Cause selection being strictly guided by neutral calculation will likely cause a lot of lost potential, for reasons you've pointed to (I have some difficulty parsing this paragraph and am not sure where you think it's appropriate or inappropriate to factor in personal fit and passion):
> Personal fit and being passionate about what you do are absolutely important, but when we're comparing causes and actions/careers in terms of impact (or ITN), our answer shouldn't depend on our personal interests and passions. When we're taking action based on those answers, though, we should think about personal fit and passion, since these prevent us from being miserable while we're pursuing impact.
- More importantly, the impact of many causes is a lot more difficult to measure quantitatively and definitively, let alone in a meaningful way. These causes are de facto left out of EA discussion, or will "lose" to causes that allow for cleaner and easier quantitative analysis, which does not seem ideal, as it leads to a lot of lost potential.
If you'll forgive the marketing terminology, I think cause neutrality is EA's Unique Selling Point. It's the main thing EA brings to the table, its value add, the thing that's so hard to find anywhere else. It's great that people committed to particular causes want to be as effective as possible within them (better than not caring much for effectiveness at all), but there are other places they can find company and support. EA can't be for literally everyone, otherwise it doesn't mean anything, so it has to draw a line somewhere, and I think the most natural place is around the idea/behaviour/value that makes EA most distinctive (and, I would argue, most impactful).
To your second bullet point: I can't think of an area where it's more difficult to measure impact quantitatively and definitively than longtermism.
I agree that EA can’t be for everyone and I don’t think it should try to be, but I personally don’t think that cause neutrality is EA’s unique selling point or the main thing it brings to the table, although I do understand that there are different approaches to EA.
> To your second bullet point: I can't think of an area where it's more difficult to measure impact quantitatively and definitively than longtermism.
I agree that longtermist impact isn't really measurable, but that makes it hard for me to reconcile cause neutrality with longtermism, rather than convincing me that rigid cause neutrality wouldn't have the effect I described.