Or we could add a long-term future fund that focuses on areas other than AI Safety.
+1 differentiation. A Fund specifically for AI Safety would probably have demand; I'd donate to it. Other Funds for other specific GCRs could be created too if there's enough demand.
A mild consideration against: some funding opportunities in the Long Term Future area benefit both AI Safety and other GCRs, such as the cross-disciplinary Global Catastrophic Risks Institute, and splitting into separate funds might make those harder to fund.