Hi Lysandre,
I really enjoyed the post!

Nick Bostrom’s argument is that, in order to maximize the expected value of the cosmic endowment, we need to minimize existential risks rather than speed up expansion.
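(A back-of-the-envelope way to see this, using my own stand-in symbols rather than the post's notation: suppose a colonized future yields value at rate $v$ for a duration $T$. Advancing colonization by a delay $d$ gains roughly $v \cdot d$, while reducing the probability of never getting there by $\delta$ gains roughly $\delta \cdot v \cdot T$. Since $T$ is astronomically larger than any feasible $d$,

$$\delta \, v \, T \gg v \, d$$

holds even for tiny $\delta$, which is why risk reduction dominates speed-up.)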
Another way of arguing against this is to claim that we do not know whether the future will be positive or negative, so making it larger has unclear effects.
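(To put that a bit more formally, with $V$ as a stand-in for the net value of the future rather than the post's $G(t)$: an intervention that scales the future by a factor $k > 1$ changes expected value by

$$\mathbb{E}[kV] - \mathbb{E}[V] = (k - 1)\,\mathbb{E}[V],$$

which is positive only if $\mathbb{E}[V] > 0$; so if the sign of $\mathbb{E}[V]$ is genuinely unknown, the sign of the intervention's value is unknown too.)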
> If G(t) doesn’t depend on the number of computations possible
Personally, I think the probability of this being true is sufficiently low to be negligible.