If we avoid extinction, plenty of people will have the time to take care of humanity’s future. I’ll leave it to them. Both topics have a lot of common ground anyway, like “not messing up the biosphere” or “keeping control of ASI”.
Couldn’t intelligent life avoid extinction and still end up with a misaligned AI locking in bad moral values?
See the comment by MacAskill: https://forum.effectivealtruism.org/posts/TeBBvwQH7KFwLT7w5/william_macaskill-s-shortform?commentId=jbyvG8sHfeZzMqusJ