> People co-opting the term ‘effective altruism’ to justify activities that they were already doing that clearly wouldn’t be supported by EA reasoning
I don’t know the particulars of the situation(s) that Will is referring to here, but as a general principle I think this is a very dangerous criterion to use for community censure and/or expulsion. What is or isn’t “clearly supported by EA reasoning” is itself in the eye of the beholder, if the endless debates on this forum and elsewhere are any indication.
I think the principle that Will is getting at is open-mindedness, or a lack thereof. Given that reason is so central to EA’s identity as a movement, we certainly don’t want to welcome or encourage ideologues who are unwilling to change their minds about things.
To me, however, there is a huge and very important difference between the following types of people:
1. Someone who brings strong opinions and perspectives based on prior knowledge and experience to the community, is willing to engage in good-faith discussion with others about those opinions and why they might be wrong, and ultimately holds to their original views;
2. Someone who brings strong opinions and perspectives based on prior knowledge and experience to the community, is unwilling or unable to engage in good-faith discussion with others about those opinions and why they might be wrong, and ultimately holds to their original views.
I feel that people who fit the former description can add tremendous value to the community in ways that people who fit the latter do not, especially when their views and reasoning are out of sync with the mainstream of EA thinking. But I would be very concerned about the former type of person being mistaken for the latter when they decline to change their mind; after all, if one’s priors are sufficiently strong, it is perfectly rational to require a high bar for changing one’s mind! I worry that attempts to police use of the term “effective altruism” based on a refusal to visibly update on non-mainstream ideas would ultimately harm intellectual diversity and prove shortsighted relative to EA’s goals.
(Edit: to be clear, I am not against the idea of a panel overall.)