[Question] Confusion about implications of “Neutrality against Creating Happy Lives”

As I understand it, the following two positions are widely accepted in the EA community:

  1. A person's temporal position should not affect their moral weight (hence longtermism)

  2. Neutrality against creating happy lives

But if we are time-agnostic, then neutrality against creating happy lives seems to imply a preference for extinction over any future that contains even a tiny amount of suffering: future happy lives add nothing of positive value, while future suffering still counts negatively, so any non-extinction future scores below extinction's zero.
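To make the tension concrete, here is a toy calculation (my own illustration, with made-up numbers and a deliberately crude one-unit-per-life scoring): under the neutral view, a future with a billion happy lives and a single suffering life still scores below extinction.

```python
# Toy comparison of two axiologies (illustrative only, not a standard model).

def total_view(happy_lives, suffering_lives):
    """Totalism: happy lives count positively, suffering lives negatively."""
    return happy_lives * 1.0 + suffering_lives * -1.0

def neutral_view(happy_lives, suffering_lives):
    """Neutrality: creating happy lives adds nothing; suffering still counts."""
    return suffering_lives * -1.0

extinction = 0.0  # no future lives at all
future = {"happy_lives": 10**9, "suffering_lives": 1}

print(total_view(**future) > extinction)    # totalism: this future beats extinction
print(neutral_view(**future) > extinction)  # neutrality: extinction beats this future
```

The point of the toy model: once happy lives are zeroed out but suffering is not, *any* nonzero suffering makes the future lose to extinction, no matter how many happy lives accompany it.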

So am I missing something here? (Perhaps “neutrality against creating happy lives” can’t be expressed in a way that’s temporally agnostic?)