Or maybe we could invest in server capacity in readiness for an EM future.
This one seemed out of place to me. Conditioned on the time we start expanding and the rate at which we expand, we're going to have access to some fixed set of resources at any given point in the future, so I don't see how investing in server capacity now affects our server capacity in the far future. (Though I do agree that affecting the start time and rate of expansion could be permanent improvements.)
Establishing norms that will protect biological humans and EMs from Hansonian competition—like a right to retire.
If uploads are not conscious, it might be important to agree on this before EMs massively outnumber biological humans; after that point it would become much harder.
These seem to be about simply picking the right policies now and locking them in. It might also be important to lock in the right policies vis-à-vis privacy, the death penalty, property rights, etc., but why should we think that we can lock such policies in now? This reduces to either "minimize value drift" or "create a singleton", both of which I agree with but you already listed them.