I think it’s definitely bad to “Use framings, arguments and examples that you don’t think hold water but work at getting people to join your group”. If I understand correctly, it can cause point 5. Also, “getting people to join your group” is rarely an instrumental goal, and “getting people to join your group for the wrong reasons” is probably not that useful in the long term.
Agree about the “not holding water”; I was trying to say that “addresses cruxes you don’t have” might look similar to this bad thing, but I’m not totally sure that’s true.
I disagree about getting people to join your group: that definitely seems like an instrumental goal, though “get the relevant people to join your group” is more the thing. Different people might have different views on how relevant those people need to be, or what their goal with the group is.
Something that seems missing from this, and that I think is very important, is that there’s a significant probability that we’re wrong about important things (i.e. EA as a question).
I kind of agree here; I think there are things in EA I’m not particularly uncertain of, and while I’m open to being shown I’m wrong, I don’t want to pretend to more uncertainty than I have.
The way I think about this, to a first approximation, is that I want people to work on maximising their values (and not their wellbeing). If they think altruism is not important and are solipsistic egoists who only value their own wellbeing, I don’t think EA can help them. If they value the wellbeing of others, then EA can help them achieve their values better.
I’ve definitely heard that frame, but it honestly doesn’t resonate for me. I think some people are wrong about what values are right and arguing with me sometimes convinces them of that. I’ve definitely had my values changed by argumentation! Or at least values on some level of abstraction—not on the level of solipsism vs altruism, but there are many layers between that and “just an empirical question”.
I don’t want to push other people to work on my values because, from an outside view, I don’t think my values are more important than their values, or more likely to be “correct”.
I incorporate an inside view on my values—if I didn’t think they were right, I’d do something else with my time!
Thanks, Lorenzo!