Yeah, I don’t think filling the finite universe we know about is where the highest expected value lies. It’s more likely in pursuing some form of possible infinite value, since it’s not implausible that such a thing could exist. But ultimately, I agree that the implications of this are minor and our response should be basically the same as if we lived in a finite universe (keep humanity alive, move values toward total hedonic utilitarianism, and build safe AI).
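As a rough sketch of the reasoning (the probabilities here are hypothetical, and whether expectations can meaningfully be taken over infinite payoffs is exactly what infinite ethics disputes):

$$\mathbb{E}[V_{\text{finite}}] = p_f \cdot V_f < \infty, \qquad \mathbb{E}[V_{\text{infinite}}] = p_\infty \cdot \infty = \infty \quad \text{for any } p_\infty > 0.$$

So on this naive accounting, any nonzero credence that infinite value is attainable dominates filling a finite universe, however large that finite payoff is.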