My main concern is that the arrival of AGI completely changes the situation in some unexpected way.
e.g. in the recent 80k podcast on fertility, Rob Wiblin opines that the fertility crash would be a global priority if not for AI likely replacing human labor soon and obviating the need for countries to have large human populations. There could be other effects.
My guess is that due to advanced AI, both artificial wombs and immortality will be technically feasible within the next 40 years, along with other crazy healthcare tech. This is not an uncommon view.
Before anything like a Delphi forecast, it seems better to informally interview a couple of experts and then write your own quick report on the technical barriers to artificial wombs. That way you can build the findings into the structure of any forecasting exercise, e.g. by asking experts to forecast when each of hurdles X, Y, and Z will be solved. You can then identify where agreement is highest and lowest, and run consistency checks against the overall forecast.
Most infant mortality still happens in the developing world, driven by much more basic factors like tropical diseases. So if the goal is reducing infant mortality globally, this tech won't address most of the problem; and for maternal mortality, it would need to be so mature that it is affordable for the average person in low-income countries, as well as culturally accepted.
“Rob Wiblin opines that the fertility crash would be a global priority if not for AI likely replacing human labor soon and obviating the need for countries to have large human populations”
This is a case where it really matters whether you are assigning an extremely high chance that AGI arrives within 20-30 years, or merely a decently high chance. If you put the chance at 75%, and you accept the claim that low fertility would be a big problem conditional on no AGI, then the expected size of the problem is only cut by 4x, which is compatible with it still being large and worth working on. You really need to get above 97-98% before it starts looking clear that low fertility is not worth worrying about, assuming that conditional on no AGI it would be a big problem.
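The arithmetic behind this can be made explicit. If we assume (as the comment does, and it is a simplifying assumption) that the problem only matters in no-AGI worlds, then its expected size scales with P(no AGI), so the discount factor is 1 / (1 − P(AGI)). A minimal sketch:

```python
# Sketch of the discounting argument above. Assumes the problem's
# importance is zero conditional on AGI arriving soon, which the
# comment itself flags as a contestable simplification.

def discount_factor(p_agi: float) -> float:
    """Factor by which the problem's expected size shrinks,
    given probability p_agi that AGI arrives in time."""
    return 1 / (1 - p_agi)

for p in [0.75, 0.90, 0.975]:
    print(f"P(AGI) = {p:.1%}: problem cut by {discount_factor(p):.0f}x")
# P(AGI) = 75.0%: problem cut by 4x
# P(AGI) = 90.0%: problem cut by 10x
# P(AGI) = 97.5%: problem cut by 40x
```

This shows why the 75% vs. 97-98% distinction matters so much: the discount factor is highly nonlinear in P(AGI), exploding only as the probability approaches 1.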