What would convince you that preventing s-risks is a higher priority than preventing x-risks?
Suppose that humanity unified to pursue a common goal, and you faced a gamble where that goal would be the most morally valuable goal with probability p, and the most morally disvaluable goal with probability 1-p. Given your current beliefs about those goals, at what value of p would you prefer this gamble over extinction?
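One way to make the threshold explicit, on a simple risk-neutral expected-value reading (my assumption, not part of the question as posed): write V > 0 for the utility of the most morally valuable goal, D < 0 for the utility of the most morally disvaluable goal, and normalize extinction to 0. Then the gamble is preferred exactly when

$$ pV + (1 - p)D > 0 \iff p > \frac{-D}{V - D} = \frac{|D|}{V + |D|}. $$

For example, if the worst goal is judged ten times as bad as the best goal is good, the threshold is p = 10/11.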
I like how you operationalized the second question.