[Question] After recent FTX events, what are alternative sources of funding for longtermist projects?

Now that, according to the former FTX Future Fund team, "it looks likely that there are many committed grants that the Future Fund will be unable to honor," it would be useful for many of us to have alternatives.

Large funders such as OpenPhil are currently reconsidering their grant strategies, and these donors will be extremely busy over the next few weeks. If you're not too funding-constrained, a good and pro-social strategy seems to be to wait until after the storm and let those with urgent and important grants figure things out first.

However, this may not apply if you are severely funding-constrained right now, or if there are grants or fellowships with upcoming deadlines that would be useful to have on your radar.

So, what are good places to apply for funding now (and in the future)?

To start, the FLI AI Existential Safety PhD Fellowship has an application deadline of November 15.

The Vitalik Buterin PhD Fellowship in AI Existential Safety is for PhD students who plan to work on AI existential safety research, or for existing PhD students who would not otherwise have funding to work on AI existential safety research. It will fund students for 5 years of their PhD, with extension funding possible. At universities in the US, UK, or Canada, annual funding will cover tuition, fees, and the stipend of the student's PhD program up to $40,000, as well as a fund of $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK, or Canada, the stipend amount will be adjusted to match local conditions.

Fellows will also be invited to workshops where they will be able to interact with other researchers in the field. Applicants who are short-listed for the Fellowship will be reimbursed for application fees for up to 5 PhD programs, and will be invited to an information session about research groups that can serve as good homes for AI existential safety research.

More about the fellowship here.

What are other options?
