New Nuclear Security Grantmaking Programme at Longview Philanthropy
Key points
- Longview Philanthropy, where I work, launched a nuclear security grantmaking programme in December 2021.
- We are hiring a grantmaker to co-lead this programme alongside Carl Robichaud.
- The co-leads will make grants potentially totalling up to $10 million initially, a figure which could grow substantially if they find or create sufficiently strong opportunities. [Update December 2022: This is now to be determined as we seek new funders for this work.]
- So far, we have committed a single $1.6 million grant to the Council on Strategic Risks. Future grants will be directed by the programme co-leads.
- We are also hiring a grantmaker who will work on other existential risks.
Background
A nuclear war would be a horror beyond comprehension. In the worst case, nuclear war could even lead to the permanent loss of humanity’s potential, putting nuclear war on a very short list of issues with potentially astronomical consequences. Nuclear weapons also interact with, and could exacerbate, other existential risks.
Longview was prompted to consider entering this space by the winding down of the MacArthur Foundation’s Nuclear Challenges work, which will close in 2023. In recent years, MacArthur has represented roughly half of all philanthropic funding in this sector. (In 2018, the Peace and Security Funding Index put MacArthur at 54% of the $51.9 million in nuclear security grants it tracked; ORS Impact put MacArthur at 45% of an $81 million field.)
This contraction in funding likely means that many exceptional people could be empowered to work on nuclear security issues with additional philanthropic support. For the same reason, now is likely a good time to refocus the field on especially important goals and especially promising approaches.
Based on this reasoning, we began considering concrete grants in the area in December 2021. The conflict in Ukraine, and the corresponding worsening of relations between NATO and Russia, will of course change what is important and what is feasible.
Initial progress
Longview has conducted basic investigations both into the nuclear security space as a whole and into the levers by which we might reduce the most extreme nuclear risks. In these initial investigations, we have benefited greatly from consultations with experts in the field.
Our first grant is $1.6 million to the Council on Strategic Risks, primarily to support the development and promotion of stabilising policies: procurement and posture choices that reduce the risk of a conflict escalating into nuclear war. Smaller elements of the grant fund a fellowship to train new nuclear security staff and a pilot project exploring how to develop more widely accepted calculations of the impacts of nuclear weapon use.
In short, we value that the Council on Strategic Risks:
- Focuses on the most extreme nuclear risks (conflict involving the United States and allies, Russia, and/or China);
- Proposes concrete policy actions which would reduce the risk of escalation into nuclear war; and
- Has strong networks within U.S. national and international security communities.
As we build out our programme, we expect Longview will support a wider range of approaches than that exemplified by the Council on Strategic Risks.
Hiring a two-person team
Carl Robichaud is contracting with Longview and will join full-time in September 2022.
Until August 2021, Carl ran the Carnegie Corporation’s nuclear security grantmaking programme, the second-largest in the field.
He has worked in global security since 2001, holding positions at The Century Foundation and the Global Security Institute and earning a Master’s in Public and International Affairs from Princeton.
We seek a programme co-lead with a complementary skillset.
For this role, we are especially interested in applicants from the effective altruism community because a strong understanding of the implications of longtermism will be crucial.
We are simultaneously advertising for an additional grantmaker in other longtermist areas, who would work primarily with me. You can use the same application form to apply to either or both of these roles.
Looking forward
The future of the nuclear security programme will be primarily in the hands of this two-person team, with support from the rest of the organisation and our networks.
We do not have a fixed budget. Rather, the programme’s spending will grow to match the grants which fit our priorities. As rough guidance: if the co-leads find or create $10 million of opportunities fitting the programme’s priorities, we anticipate being able to fund them in full. Further, if early work on the programme demonstrates that growing funding significantly beyond that level would deliver substantial impact, we believe we would be able to do so.
Really excellent that you spotted this gap in the philanthropic market and moved in to fill it. Well done! Hope you hire someone excellent.
Very excited to see this!
For anyone interested in applying for these roles, Training for Good is running a beta test of our grantmaking training programme starting w/c 28th March. We’re looking for 5-10 people with relevant experience to participate in an 8-week, scaled-down programme. The time commitment would be ~5 hours per week.
Here’s a very rough overview of the programme for anyone who’s interested. It will culminate in a “capstone project” which could hopefully be used in your application for these (or other) roles as an example of your thinking & work.
If you’re interested, send me a message here or email me at cillian@trainingforgood.com.
That’s fantastic news. Strong organization filling an important gap in funding.
If anybody’s interested, I recently wrote up some thoughts about nuclear security engineering as a career path. It focuses on nuclear weapons as the last step in many stories about existential risk (including from misaligned AI), and brainstorms avenues to reduce the threat via engineering.
Would love any feedback, you can find the writeup here: https://forum.effectivealtruism.org/posts/7ZZpWPq5iqkLMmt25/aidan-o-gara-s-shortform?commentId=rnM3FAHtBpymBsdT7
I’m very excited about this development!