I have thought for a long time that the EA power centers lack curiosity about, and appropriate respect for, the potential of the EA community, and that they use a pretty specific set of markers to decide who should be given the time of day. I have the impression that there is a pretty small “nerve center” that sets priorities and considers how the EA community might help address those priorities along the paths that it sees.
This seems to me to limit the power of EA significantly: if more perspectives and ideas were taken seriously and resources were dedicated in those directions, rather than just to the relatively narrow agenda of the “nerve center”, we might be able to accomplish a lot more. Right now it seems pretty sad that EA is often identified with the people who hold power in it, rather than with its more basic and important idea of maximizing good with the resources that we have.
I suppose there have probably been posts along these lines already, but a post on “democratizing EA funding and power as a way of maximizing epistemic hygiene and reach” would be appreciated.
I get the impression posts along these lines pop up from time to time; see e.g. the one Ben linked below. Personally, I'd like to see far more (cross-)cause prioritisation than we have today, but this seems constrained by funding rather than by interest or talent.