Great job identifying some relevant uncertainties to investigate. I will think about that some more.
My goal here is not so much to resolve the question of “should we prepare a biotic hedge?” but rather “Does utilitarian Longtermism imply that we should prepare it now and, if faced with a certain threshold of confidence that existential catastrophe is imminent, deploy it?” So I am comfortable setting aside the moral-uncertainty arguments against the idea for now. If I become confident that utilitarian Longtermism does imply that we should, I would then examine how other normative theories might come down on the question.
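To make the “threshold of confidence” part concrete, here is a toy expected-value sketch under a purely utilitarian reading. Everything in it is an assumption for illustration (the probability, the utilities, and the cost term are not from the discussion above); it only shows that “deploy at a threshold” cashes out as “deploy once the expected gain from the hedge outweighs its cost.”

```python
# Toy sketch (all quantities are hypothetical): under a purely utilitarian
# reading, "deploy at a certain threshold of confidence" just means deploying
# once the expected value of deploying exceeds that of not deploying.

def should_deploy(p_catastrophe: float,
                  v_hedge_given_catastrophe: float,
                  v_no_hedge_given_catastrophe: float,
                  deployment_cost: float) -> bool:
    """Deploy the biotic hedge iff the expected gain from hedging,
    conditional on catastrophe, outweighs the (certain) deployment cost."""
    expected_gain = p_catastrophe * (
        v_hedge_given_catastrophe - v_no_hedge_given_catastrophe
    )
    return expected_gain > deployment_cost

# Example with hypothetical numbers: even a modest credence can justify
# deployment when the conditional value gap is enormous.
print(should_deploy(p_catastrophe=0.05,
                    v_hedge_given_catastrophe=-10.0,
                    v_no_hedge_given_catastrophe=-100.0,
                    deployment_cost=1.0))  # True
```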
Me: “A neglected case above is where weapon X destroys life on earth, earth engages in directed panspermia, but there was already life in the universe unbeknownst to earth. Call this case D. I think we agree that B is superior to D, and therefore the difference between B and A is greater. The question is whether the difference between D and C surpasses that between A and B. Is D so much worse than C that the loss from B to A would be the preferred one? I don’t think so.”
You: “Hmm, I don’t quite follow… Does the above change the relative order of preference for you, and if so, to which order?”
No, it would not change the relative order of A, B, and C. The total order (including D) for me would be C > B > D > A, where |v(B) - v(A)| > |v(C) - v(D)|.
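To make the inequality concrete, here is a minimal sketch with hypothetical utilities. The specific numbers are assumptions chosen only to satisfy the stated relations, not claims about actual axiological magnitudes:

```python
# Hypothetical utilities consistent with the ordering C > B > D > A
# and with the gap condition |v(B) - v(A)| > |v(C) - v(D)|.
v = {"C": 0.0, "B": -10.0, "D": -15.0, "A": -100.0}

# Total order: C > B > D > A
assert v["C"] > v["B"] > v["D"] > v["A"]

# The gap between B and A dominates the gap between C and D.
assert abs(v["B"] - v["A"]) > abs(v["C"] - v["D"])
```

Any assignment of values satisfying those two checks is consistent with my ordering.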
I was trying to make a Parfit-style argument that A is so very bad that spending significant resources now to hedge against it is justified: given that we fail to reach the Long Reflection, it is vastly preferable that we engage in a biotic hedge. I did a bad job of laying it out, and it seems that reasonable people think the outcome of B might actually be worse than A, based on your response.
Oh yeah, I was also talking about it only from utilitarian perspectives. (Except for one aside: “Others again refuse it on deontological or lexical grounds that I also empathize with.”) It’s just that utilitarianism doesn’t prescribe an exchange rate (of intensity/energy expenditure/…) between individual positive experiences and individual negative experiences.
“It seems that reasonable people think the outcome of B might actually be worse than A, based on your response.”
Yes, I hope they do. :-)
Sorry for responding so briefly! I’m falling behind on some reading.