It’s possible I’ve flipped the sign on what you’re saying, but if I haven’t, I’m pretty sure most EAs are not moral realists, so I don’t know where you got the impression that it’s an underlying assumption of any serious EA efforts.
If I did flip the sign, then I don’t think it’s true that moral realism is “too unquestioned”. At this point it might be more fair to say that too much time & ink has been spilled on what’s frankly a pretty trivial question that only sees as much engagement as it does because people get caught up in arguing about definitions of words (and, of course, because some other people are deeply confused).
I think this might be a crux here: I do think this question matters a lot, and I'll point to some implications that follow if moral realism is false. Thankfully, more EAs are moral anti-realists than I thought.
EA would need to recognize that its values aren't superior or inferior to other people's values, just different. In other words, it would need to stop believing it's objectively right at all to hold (X values).
The moral progress thesis would not be correct; that is, the changes in the 19th and 20th centuries were mostly not instances of moral progress. At the very least, moral progress would be very much subjective.
Values are points of view, not fundamental truths that humans have to abide by.
Now, this isn't the most important question, but it is one that I think matters, especially for moral movements like EA.