I really like your idea of a GCR response fund; I was thinking about something similar (though did you mean it was in category b), not a)?). It seems there could be quite a few EAs who think that contributing to AI is the highest priority, but who might recognize that a global catastrophe could jeopardize all the work on AI, and that there are things we could do to make such a scenario go better.