I think humans are probably more likely in expectation than aliens to generate high value as EAs understand it, simply because our idea of what “high value” is, is, well, a human idea of it.
This seems like a strange viewpoint. If value is something about which one can make “truthy” and “falsey” claims (or, if you prefer, something we converge to given enough time and intelligent thought), then it’s akin to maths, and aliens would be a priori as likely to “discover” it as we are. If it’s arbitrary, then longtermism has no philosophical justification beyond the contingent caprices of people who like to imagine a universe filled with life.
Also, if it’s arbitrary, then over billions of years even human-only descendants would be very unlikely to stick with anything resembling our current values.
Yes. See also Yudkowsky’s novella Three Worlds Collide for an illustration of why finding aliens could be very bad from our perspective.