So, pulling out this sentence, because it feels like it’s by far the most important and not that well highlighted by the format of the post:
> what is desired is a superficial critique that stays within and affirms the EA paradigm while it also checks off the boxes of what ‘good criticism’ looks like and it also tells a story of a concrete win that justifies the prize award. Then everyone can feel good about the whole thing, and affirm that EA is seeking out criticism.
This reminds me a lot of a point mentioned in Bad Omens, about a certain aspect of EA which “has the appearance of inviting you to make your own choice but is not-so-subtly trying to push you in a specific direction”.
I’ve had this same worry myself as a community-builder. I want to be able to clear up misunderstandings, make arguments that folks might not be aware of, and make EA welcoming to folks who might be turned off by superficial pattern-matches that I don’t think are actually informative. But I worry a lot that I can’t avoid doing these things asymmetrically, and that maybe this is how the descent into deceit and dogmatism starts.
The problem at hand seems to be basically that EA has a common set of strong takes, which are leaning towards becoming dogma and screwing up epistemics. But the identity of EA encourages a self-image as rational/impartial/unbiased, which makes it hard for us to discuss this out loud—it requires first acknowledging that while we strive to be rational/impartial/unbiased, we are nowhere near that yet.
Alternatively, for some of the goals and assumptions of EA which are subjective in nature or haven’t been (or cannot be) assessed very well, e.g.:
“One ought to do the maximal possible amount of good”
“Donating to charity is a good way to improve the world”
“Representation of the recipients of aid is good, but optional”
...there’s value in getting critique that takes those as given and tries to improve impact, but there may be even greater value in critiques of why the goals and assumptions themselves are bad or lacking.