Longtermism Fund: August 2023 Grants Report
Introduction
In this grants report, the Longtermism Fund team is pleased to announce that the following grants have been recommended by Longview and are in the process of being disbursed:
Two grants promoting beneficial AI:
Supporting AI interpretability work conducted by Martin Wattenberg and Fernanda Viégas at Harvard University ($110,000 USD)
Funding AI governance work conducted by the evaluations project at the Alignment Research Center ($220,000 USD)
Two biosecurity and pandemic prevention grants:
Supporting a specific project at NTI | Bio working to disincentivize biological weapons programmes ($100,000 USD)
Partially funding the salary of a Director of Research and Administration for the Center for Communicable Disease Dynamics ($80,000 USD)
One grant improving nuclear security:
Funding a project by the Carnegie Endowment for International Peace to better understand and advocate for policies that avoid escalation pathways to nuclear war ($52,000 USD).
This report will provide information on what the grants will fund, and why they were made. It was written by Giving What We Can, which is responsible for the Fund's communications. Longview Philanthropy is responsible for the Fund's research and grantmaking.[1]
We would also like to acknowledge and apologise for the report being released two months later than we would have liked, in part due to delays in the process of disbursing these grants. In future, we will aim to take potential delays into account so that we can better keep to our target of releasing a report once every six months.
Scope of the Fund
These grants were decided by the general grantmaking process outlined in our previous grants report and the Fund's launch announcement.
As a quick summary, the Fund supports work that:
Reduces existential and catastrophic risks, such as those coming from misaligned artificial intelligence, pandemics, and nuclear war.
Promotes, improves, and implements key longtermist ideas.
In addition, the Fund focuses on organisations with a compelling and transparent case in favour of their cost-effectiveness, and/or that will benefit from being funded by a large number of donors. Longview Philanthropy decides the grants and allocations based on its past and ongoing work to evaluate organisations in this space.
Grantees
AI interpretability work at Harvard University – $110,000
This grant supports Martin Wattenberg and Fernanda Viégas in developing their AI interpretability work at Harvard University. It aims to fund research that enhances our understanding of how modern AI systems function; better understanding how these systems work is among the more straightforward ways we can ensure they are safe. Profs. Wattenberg and Viégas have a strong track record (with both receiving excellent references from other experts), and their future plans are likely to advance the interpretability field.
Longview: “We recommended a grant of $110,000 to support Martin Wattenberg and Fernanda Viégas’ interpretability work on the basis of excellent reviews of their prior work. These funds will go primarily towards setting up a compute cluster and hiring graduate students or possibly postdoctoral fellows.”
ARC Evals – $220,000
The evaluations project at the Alignment Research Center (“ARC Evals”) works on “assessing whether cutting-edge AI systems could pose catastrophic risks to civilization.” ARC Evals is contributing to the following AI governance approach:
Before a new large-scale system is released, assess whether it is capable of potentially catastrophic activities.
If so, require strong guarantees that the system will not carry out such activities.
ARC Evals works primarily on the first step of this approach.
The organisation is relatively new and is now scaling up after early success. For example, ARC Evals built partnerships with the frontier labs OpenAI and Anthropic to evaluate GPT-4 and Claude for certain dangerous capabilities prior to their release. As of publication, the organisation has substantial room for more funding: on the order of millions of dollars to support its plans over the coming 18 months.
Longview: “We recommended a grant of $220,000 to ARC Evals on the basis of ARC Evals’ strong plan for contributing to AI governance and promising early progress. These funds will go primarily towards staff costs, and possibly computation, depending on ARC Evals’ overall fundraising.”
Nuclear Threat Initiative’s Biosecurity Programme (NTI | Bio) project to develop a research agenda for disincentivizing state biological weapons programmes – $100,000
This grant will support a specific NTI | Bio project aiming to strengthen international capabilities to uphold the norm against bioweapons development and use. Concretely, this involves organising a workshop with leading experts on the topic to develop a list of key recommendations. To learn about the kind of work involved in this project, we recommend the NTI | Bio paper “Guarding Against Catastrophic Biological Risks: Preventing State Biological Weapon Development and Use by Shaping Intentions”. The grant is restricted to this project.[2]
Longview: “We recommended a grant of $100,000 to support this work on the basis that it was likely the most promising work for which NTI | Bio would not otherwise have funding available, and on NTI’s track record of running similar projects. These funds will go primarily towards the workshop, with a smaller portion towards staff costs.”
Center for Communicable Disease Dynamics (CCDD) – $80,000
This grant provides funding for CCDD to employ a Director of Research and Administration to support its work. The role acts as a force multiplier for all of CCDD's work, which Longview, having reviewed it several times over the last few years, believes is impactful.
CCDD's research contributes to planning for and reducing the chance of global catastrophic biological risks. This includes influencing policy (such as by estimating disease spread), researching vaccine trials (for example, publishing the original research on the potential value of human challenge trials to address COVID-19), and training future epidemiologists (including several Epidemic Intelligence Service officers). Its director, Professor Marc Lipsitch, spends around a quarter of his time as Senior Advisor to the CDC's Center for Forecasting and Outbreak Analytics, where he was founding co-director; former CCDD Postdoctoral Research Fellow Rebecca Kahn was also on the founding team. Prof. Lipsitch is also a nuanced contributor to important debates, such as those around research with the potential to be used for both good and harm. Donors can learn more about these topics via his appearances on various podcasts and media.
This grant helps fill a particular funding gap that CCDD reported could otherwise be difficult to close: CCDD is mostly funded by the US government and large foundations, but this funding is generally restricted to direct research rather than to the operational or administrative roles that support it.
Longview: “We recommended a grant of $80,000 to support Laurie Coe’s position as CCDD’s Director of Research and Administration on the basis that this will be a force multiplier on work increasing the world’s readiness for and reducing the chance of catastrophic pandemics, and because CCDD has a pressing need for funding to support this role.”
Learn more about the Center for Communicable Disease Dynamics.
Carnegie Endowment for International Peace (CEIP) – $52,000
This grant supports CEIP in running a project aiming to develop a common understanding of escalation pathways to nuclear war, and of which policy interventions are most likely to mitigate the risk. More specifically, the grant will help fund research workshops in which a diverse range of experts from fields relevant to nuclear security and risk analysis convene to analyse potential escalation pathways, estimate their likelihood, identify levers to reduce or mitigate the risk, and compare these pathways and levers more holistically.
The project will be run by James Acton and Jamie Kwong and will result in a report with policy recommendations and outreach to decision makers to promote these policy changes.
Longview: “We recommended a grant of $52,000 to support the project on escalation pathways on the basis of its direct relevance to reducing the most extreme risks from nuclear weapons, and the CEIP team’s strong track record of high-quality analysis that is taken seriously by policymakers. These funds will go primarily towards workshops and project staff time.”
Conclusion
The Fund is approaching the end of its first year, and the team is extremely grateful to the 598 donors who have cumulatively given over $750,000 USD so far. We can all help solve funding constraints: your donations, and your advocacy, can make an enormous difference in protecting the lives of future generations.
- ^
Prior to disbursing funds, we need to conduct due diligence on the grantee (and occasionally the grantee needs to conduct due diligence on us) and form a grant agreement. For this report, we decided to delay publishing until the grants were further through this process than they had been in the last round. Each of the August 2023 grants has passed the due-diligence stage, and the final grant agreements are being signed as of this report's publication. We share this because, in retrospect, we shouldn't have stated in the last grants report that the funds would be paid out in January, as we didn't have full control over this (the due diligence and grant agreement processes were still ongoing).
- ^
This grant is restricted to supporting work which is unlikely to go ahead without it. Therefore, SoGive's recent post about the Nuclear Threat Initiative's funding reserves was not directly relevant to the merit of the grant, and the Fund did not come to a view on the content of SoGive's post.