It only definitely follows from humans being net negative in expectation that we should try to make humans go extinct if you are both a full utilitarian and "naive" about it, i.e. prepared to break usually sacrosanct moral rules whenever you personally judge that doing so is likely to have the best consequences. Most utilitarians take that disposition itself to be likely to produce bad consequences, and therefore to be something to discourage. Another way to describe "make humanity more likely to go extinct" is "murder more people than all the worst dictators in history combined". That is the sort of thing that is going to look like a prime candidate for "do not do this, even if it has the best consequences" on non-utilitarian moral views. And it is also obviously a breach of standard moral rules.