I’ve heard this point made elsewhere too, but I am not sure I fully understand it. What exactly are the "values on reflection" you are referring to here? Are they the values typically shared by those with a utilitarian bent, or by other philosophical schools that focus roughly on the well-being of all beings capable of experiencing pleasure and pain? That is, a value system that is not narrowly focused on maximization for a minority at the exclusion of others?
Even in the real world as it stands, our systems are set up in clear violation of such principles, which is part of the reason for inequality, exploitation, marginalization, and so on. And while one might argue that over the centuries we would become enlightened enough to collectively recognize these evils, it is not at all obvious that we would actually eliminate them.
In any event, why do we assume that a different advanced civilization (especially one arising post-extinction from some of our common ancestors) would not converge on a similar value system? After all, we recognize that empathy and cooperation, which form the basis for more sophisticated altruistic goals, have played a role in our survival as a species.
Maybe I am missing something, but even probabilistically speaking, why assume one outcome is more likely than the other?