I think steps 1 and 2 in your chain are also questionable, not just 3-5.
Want to maximize number of EAs
Why do we want to maximize the number of EAs? This seems very non-obvious to me. Some people would add much more to the community than others via their epistemics, culture, direct talent, etc. If we added enough of certain types of people to the community, especially too quickly, it could easily be net negative.
2. Use framings, arguments and examples that you don’t think hold water but work at getting people to join your group [I don’t think EAs do this, I’m gesturing at the extreme other end]
[...]
2 is complicated: when people have different cruxes than you, is it dishonest to talk about what should convince them based on their cruxes?
I think sometimes/often talking about people’s cruxes rather than your own is good and fine. The issue is Goodharting: optimizing your message to convert as many people to EA as quickly as possible, rather than sending messages that will lead to a healthy community over the long run.