The motivation for focusing on global catastrophic risks is that these could dramatically limit humanity’s potential. If, per your population ethics, such a limitation wouldn’t be concerning, then it’s not surprising that you wouldn’t find work aiming to avert or mitigate such risks compelling.
I think the post would be clearer if it were explicit about this up front: the disagreement here isn’t about the relative scale of biorisk vs. factory farming, but instead about how much value there is in averting civilizational collapse and/or extinction.
I’m sorry if the title was misleading; that was not my intention. I think you and I have different views on the average forum user’s population ethics. If I believed that more people reading this held a totalist (or similar) view, I would have been much more up front about my take not applying to them. Believing the opposite, I presented the conclusion you’d reach under non-person-affecting views as a caveat instead.
That aside, I’d be happy to see the general discourse spell out more explicitly that population ethics is a crux for x-risk work. I’ve only gotten (and probably at some points given) the impression that x-risks are similarly important to other cause areas under all population ethics. This risks baiting people into working on things that, by their own lights, they shouldn’t believe to be the most pressing problem.
On a personal note, I concede that extinction is much worse than 10 billion humans dying. This is, however, for non-quantitative reasons. Tegmark has said something along the lines of a universe without sapience being terribly boring, and that weighs quite heavily in my judgement of the disutility of extinction.