Hi Joey, thank you for writing this.
I think calling this a problem of representation is actually understating the problem here.
EA has (at least to me) always been a community that inspires, encourages, and supports people to use all the information and tools available to them (including their individual priors, intuitions, and sense of morality) to reach a conclusion about which causes and actions are most important for them to take to make a better world (and, of course, to then take those actions).
Even if 90% of experienced EAs / EA community leaders currently converge on the same conclusion as to where value lies, I would worry that a strong focus on that issue would be detrimental. We’d be at risk of losing the emphasis on cause prioritisation, arguably the most useful insight that EA has provided to the world.
We’d risk:

- losing the ability to support people through cause prioritisation (coaching, EA or otherwise, should not pre-empt the answers or have ulterior motives);
- creating a community that is less able to switch its focus to the most important thing;
- stifling useful debate;
- creating a community that does not benefit from collaboration between people working in different areas;
- and so on.
(Note: it’s probably worth adding that if 90% of experienced EAs / EA community leaders converged on the same conclusion on causes, my intuition would be that this is as likely to be evidence of founder effects / group-think as it is evidence for that cause. I expect this is because I see a huge diversity in people’s values and thinking, and a real difficulty in reaching strong conclusions in ethics and cause prioritisation.)