It seems like you might be underweighting the cumulative amount of resources. Even if you apply a fairly heavy decay rate (and it's unclear you should; we usually think of philanthropic investments compounding over time), avoiding nuclear war was a top global priority for decades, and we have a lot of intellectual and policy "legacy infrastructure" from that.
I agree people often overlook that (and also future resources).
I think bio and climate change also have large cumulative resources.
But I see this as a significant reason in favour of AI safety, which has become less neglected in annual terms recently but remains a very new field compared to the others.
Also a reason in favour of the post-TAI causes like digital sentience.