I think that existential risks from various issues with AGI (especially if one includes trajectory changes) are high enough that one needn't accept fanatical views to prioritise them.
I think the argument you linked to is reasonable; I disagree, but not strongly. Still, I think it's plausible enough that prioritising AGI (from an impartial cause prioritisation perspective) requires fanaticism that this should remain a significant worry. My take is that this worry means an initially general EA org should not overwhelmingly prioritise AGI.