I think this might be a crux here: I do think this question matters a lot, and I'll point to some implications if moral realism is false. Thankfully, more EAs are moral anti-realists than I thought.
EA would need to recognize that its values aren't superior or inferior to other people's values, just different. In other words, it would need to stop believing it's objectively right to hold any particular set of values.
The moral progress thesis would not be correct: that is, the changes of the 19th and 20th centuries were mostly not instances of moral progress. At the very least, moral progress would be subjective.
Values are points of view, not fundamental truths that humans have to abide by.
Now this isn’t the most important question, but it is a question that I think matters, especially for moral movements like EA.