Executive summary: Despite the increasing risks of nuclear war, philanthropic funding for nuclear security has significantly decreased, presenting a critical funding gap that smaller donors could potentially fill.
Key points:
Annual philanthropic funding for nuclear security has dropped from $50m to $30m due to the MacArthur Foundation’s withdrawal from the field in 2020.
Nuclear security receives less funding than other neglected EA causes such as factory farming, catastrophic biorisks, and AI safety.
Nuclear risk appears to be increasing, given reports of Russia considering nuclear weapons use against Ukraine, China's expanding arsenal, and North Korea's possession of at least 30 nuclear weapons.
The collapse of FTX prevented the Future Fund from filling the funding gap, and Open Philanthropy has decided to focus on AI safety and biosecurity instead.
Providing $3 million or more per year to experienced grantmakers like Carl Robichaud and Matthew Gentzel at Longview Philanthropy could help address the funding gap and support important nuclear policy efforts.
This funding opportunity may be particularly attractive to donors who are skeptical about AI safety but agree that the world underrates catastrophic risks.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.