Hi Evn, thanks for your points,
1. Yeah, this was initially my overwhelming opinion too. Coming from a QC background, normalising according to amplitude is just instinct, but just because the operators we have nice models for behave this way doesn't mean we should expect this very different type of effect to act the same (for one, gravity doesn't seem to!). There are some justifications** you could give against this approach, but ultimately the point of the post is "we are uncertain": the value could be normalised, it could be exponential, or it could be some polynomial function. Given the scale, it seems worth someone capable attempting a paper.
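To make the three hypotheses concrete, here is a toy sketch. The functional forms and parameter choices are purely my own illustration of the shape of each view, not anything argued in the post:

```python
# Toy sketch of the three candidate value functions, as a function of
# the number of binary branching events e a superposition has undergone.
# The exact forms here are illustrative assumptions, nothing more.

def value_normalised(e: int) -> float:
    """Born-rule intuition: branch weights sum to 1, branching adds nothing."""
    return 1.0

def value_polynomial(e: int, k: int = 2) -> float:
    """Value grows like some polynomial in the amount of branching."""
    return float(e ** k)

def value_exponential(e: int) -> float:
    """Naive branch counting: every binary split doubles total value."""
    return 2.0 ** e

for e in (1, 10, 50):
    print(e, value_normalised(e), value_polynomial(e), value_exponential(e))
```

Even at e = 50 the three views already disagree by fifteen orders of magnitude, which is why settling this seems to matter.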
2. Fully agree with this point; this is exactly the question we wanted to address in the next blog post, which, in the interest of time, we haven't written yet. You would essentially have a universe whose "value" grows exponentially with the number of superposition-generating interactions happening in any instant (which is a lot). If you believed each superposition had some non-normalised value, this would mean you care about the long-run future far more (since it has been multiplied by such a large value), which might mean your only goal is to make sure there is some vaguely happy being as far into the future as possible (see the toy numbers below).

It gets even worse when you include your point about infinite-dimensional Hilbert spaces: suddenly the future becomes an infinite set of futures growing by infinity every second, and I know better than to pretend I understand infinity ethics at that level!

As you say, this is not a settled debate. I also land on the side of many worlds, but I am far from certain in this belief.
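As a rough illustration of why the non-normalised view swamps everything else: if value multiplies by some factor per unit time due to branching, the final instant dominates the entire preceding history. The multiplier and horizon below are made-up numbers chosen only to show the shape of the effect:

```python
# Illustration: under branch counting, value compounds like b**t, so
# the last time step carries a huge share of all value ever realised.
# b and T are arbitrary stand-in numbers, not estimates of anything.

b = 2.0   # assumed value multiplier per time step from branching
T = 100   # number of time steps considered

total_to_T = sum(b ** t for t in range(T + 1))
last_step = b ** T
print(f"share of total value in the final step: {last_step / total_to_T:.6f}")
# -> ~0.5 for b = 2; for larger b it is even closer to 1, which is the
#    sense in which only the far future matters on this view.
```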
**Suppose (for the sake of argument) you believe that a brain experiencing one second of happiness is worth one utiliton, and that increasing the size of the brain increases its moral worth (most people think a human is worth more than an ant). This brain can be simulated by some classical computation requiring a certain time, and there exists some quantum computation which is equivalent. Given the large number of processes needed to simulate a brain, at least some of them probably admit a quantum speedup. Now you can run the same brain (say) 10x faster, and this seems like it should be worth 10x more, because there are 10x more experiences. That implies the increased power of the QC is worth more than just normalising it to one. As you scale up the brain, the quantum speedup scales too, which implies some scaling in value. Ultimately the exp vs poly debate comes down to what the most efficient utility-generating quantum computation is.
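A sketch of that scaling argument, with hypothetical cost functions (the quadratic and exponential speedups below are stand-ins for whatever the best brain-simulation algorithms turn out to be, not claims about them):

```python
# Sketch of the footnote's argument. Assume simulating a brain of size n
# classically takes time t_classical(n), and some quantum algorithm does
# it in t_quantum(n). The "utility multiplier" of the QC is then
# t_classical(n) / t_quantum(n): extra brain-seconds per wall-clock
# second. All cost functions here are hypothetical assumptions.

import math

def t_classical(n: float) -> float:
    return n ** 2            # made-up classical simulation cost

def t_quantum_poly(n: float) -> float:
    return n                 # e.g. a Grover-like quadratic speedup

def t_quantum_exp(n: float) -> float:
    return math.log2(n)      # an (optimistic) exponential speedup

for n in (10, 1_000, 1_000_000):
    poly_mult = t_classical(n) / t_quantum_poly(n)
    exp_mult = t_classical(n) / t_quantum_exp(n)
    print(f"n={n}: poly speedup worth {poly_mult:.0f}x, "
          f"exp speedup worth {exp_mult:.0f}x")
```

Whether the multiplier grows polynomially or exponentially with brain size is exactly the exp vs poly question above.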