I mildly agree that depopulation is bad, but not by much. The problem is that I suspect our starting views and premises are so different on this that I can't see how they could converge. Very briefly, mine would be something like this:
-Ethics is about agreements between existing agents.
-Future people matter only to the degree that current people care about them.
-No moral duty exists to create people.
-Existing people should not be made worse off for the sake of hypothetical future ones.
I don't think there's a solid argument for the dangers of overpopulation right now or in the near future, and I mostly trust the economic arguments that more people bring increased productivity and progress. Admittedly, there are some issues I can think of that would make this less clear:
-If AGI takes off and doesn't kill us all, it is very likely we can offload most productivity and creativity to it, negating the advantage of bigger populations.
-A lot of the increase in carbon emissions comes from developing countries trying to raise the consumption capacity and lifestyle of their citizens. More people aspiring to Western-like lifestyles will make it incredibly difficult to lower fossil fuel consumption, so if technology doesn't deliver the necessary breakthroughs, it makes sense to want fewer people so that more of them can enjoy our type of lifestyle.
-Again on technology: we've been extremely lucky in finding low-hanging fruit that allowed us to expand food production (e.g., synthetic fertilizers, the Green Revolution). One can be skeptical of indefinite future breakthroughs, and their absence could push us back toward some Malthusian state.
Do people, on average, have positive or negative externalities (instrumental value)?
I imagine the answer is both. Most current calculations would say the positive outweigh the negative, but I can imagine how that could cease to be so.
Do people’s lives, on average, have positive intrinsic value (of a sort that warrants promotion, all else equal)?
I can't really debate this, as I don't think I believe in any sort of intrinsic value to begin with.