Though implausible (at least to me), you could imagine an Astronomical Waste-style argument against delaying or preventing the instantiation of artificial consciousness with positively valenced states. On that view, a moratorium would be a seriously ethically negative event.
I think the Astronomical Waste paper argues that the EV loss from a "small" delay is negligible; quoting from it:
> For example, a single percentage point of reduction of existential risks would be worth (from a utilitarian expected utility point-of-view) a delay of over 10 million years.
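For intuition on where that figure comes from, here is a rough sketch of the comparison; the linear-in-delay loss and the $T \gtrsim 10^9$-year horizon are my illustrative assumptions in the spirit of the paper, not numbers taken from the quote:

```latex
% Rough expected-value comparison (illustrative assumptions, not figures from the quote):
%   V = total value of the accessible future,
%   T = horizon over which that value remains accessible (>~ 10^9 years),
%   d = delay in years, assumed to forfeit roughly a fraction d/T of V.
% Gain from reducing x-risk by one percentage point: 0.01 V.
% Loss from a delay of d years: (d/T) V.
\[
  0.01\,V \;\ge\; \frac{d}{T}\,V
  \quad\Longleftrightarrow\quad
  d \;\le\; 0.01\,T,
  \qquad\text{and}\qquad
  0.01\,T \;\gtrsim\; 10^{7}\ \text{years for } T \gtrsim 10^{9}\ \text{years}.
\]
% i.e. the break-even delay is over 10 million years, matching the quoted claim.
```

The underlying point, as I read the paper, is that because the relevant horizon is measured in billions of years, even a multi-million-year delay costs only a small fraction of the total expected value, so risk reduction dominates the opportunity cost of delay.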