I think a lot of GCRs could be more tractable than AI risk (possibly by a large margin) if someone did the work of identifying more opportunities to fund risk reduction for those GCRs, then made them available to small donors.
This is definitely an important point. I think that if someone did identify opportunities like this, that’s one of the most likely reasons why I might change where I donate. Right now it doesn’t look like any GCR is substantially more important/tractable/neglected than AI risk (biosecurity is probably a bigger risk but not by a huge margin, geoengineering might be more tractable but not for small donors), but this could change in the future.