One thing to worry about here is deception. All things being equal, it's generally a reason against doing something that it deceives people, and trying to ease people in gently can be a special case of that, because it's deceiving them about the beliefs you hold. It also might stop you yourself from getting useful information, since if you only introduce your more unusual and radical commitments to people who've already been convinced by your more mainstream ones, you are missing out on criticism of the radical commitments from the people most opposed to them.
This sort of thing has been an issue with EA historically: people have accused EA leaders (fairly or not) of leading with their beliefs about global poverty to give the impression that that is what they (the leader) and EA are really all about, when actually what the leader really cares about is a bunch of much more controversial things: AI safety, longtermism, or niche animal welfare stuff like shrimp welfare.
I'm not saying that this means no one should ever introduce people to radical ideas gently; I think it can be reasonable. It's just worth keeping in mind.
Thank you for the great insight. I agree with you about the deception concern: introducing ideas gradually and then becoming more radical might breed fearful skepticism, without the genuine, constructive criticism that comes from real curiosity and a real interest in improving the movement.
However, could you please elaborate on this point: "if you only introduce your more unusual and radical commitments to people who've already been convinced by your more mainstream ones, you are missing out on criticism of the radical commitments from the people most opposed to them"? It might be what I touched upon in the first part of this comment, but I'd appreciate your clarification.