Surely most neartermist funders think that the probability that we get transformative AGI this century is low enough that it doesn’t have a big impact on calculations like the ones you describe?
This is not obvious to me; I'd guess that for a lot of people, AGI and global health live in separate magisteria, or they're not working on alignment because of its perceived lack of tractability rather than because they think transformative AGI is unlikely.
This could be tested with a survey.