As we discuss in our post, imagine the worst possible world. Most humans are comfortable saying that this would be very bad, that any steps towards it would be bad, and that if you disagree and think that steps towards the WPW are good, then you’re wrong. In the same vein, if you hold a ‘version of ethics’ that claims that moving towards the WPW is good, you’re wrong.
To address your second point, humans are not AGIs; our values are fluid.
I completely fail to understand how your WPW example addresses my point. What most humans are comfortable saying is entirely irrelevant. Truth is not a democracy, and in this case the claim is not even wrong: it is ill-defined, since there is no such thing as “bad” without specifying the agent from whose point of view it is bad. It is true that some preferences are nearly universal among humans, but other preferences are less so.
How is the fluidity of human values a point in your favor? If anything it only makes them more subjective.