One reason to avoid discussing “1-to-N” moral progress on a public EA forum is that it is inherently political. I agree with you on essentially all the issues you mentioned in the post, but I also realise that most people in the world, and even in developed nations, will find at least one of your positions grossly offensive—if not necessarily when stated as above, then certainly once they are taken to their logical conclusions.
Discussing how to achieve concrete goals in “1-to-N” moral progress would almost certainly lead “moral reactionaries” to start attacking the EA community, calling us “fascists” / “communists” / “deniers” / “blasphemers” depending on which kind of immorality they support. This would make life very difficult for other EAs.
Maybe the potential benefits are large enough to exceed the costs, but I don’t even know how we could go about estimating either of these.