Thanks for this post! I broadly agree with your introduction but I’d say the “research vs. advocacy” distinction is more important than “501(c)(3) vs. 501(c)(4)”. There are 501(c)(3) advocacy orgs including MIRI, Palisade, PauseAI US/PauseAI Global, Lightcone Infrastructure (sort of), etc. Lobbying on policy measures is one component of advocacy, but not the only component. I’m uncertain about how valuable it is compared to other kinds of advocacy that 501(c)(3)s can do.
Hi Michael! Thanks for the comment. Agreed that research vs. advocacy matters more than the c3/c4 distinction; I have a footnote clarifying this, but in retrospect it probably belongs in the body of the post.
That said, I think c3/c4 is a useful shorthand for donors specifically because it maps onto who else will fund this work. c3 advocacy orgs like the ones you mention (MIRI, Palisade, PauseAI, maybe even Lightcone, but uncertain / I don’t know enough about their work beyond Lighthaven lol) can receive funding from private foundations, DAFs, and corporate matching programs. Whether they do in practice is a different question, and I don’t know enough about that to comment, but my sense is that c3s are broadly much less funding-constrained than c4s for structural reasons: c4 lobbying orgs largely can’t receive funding from these kinds of sources at all.
So while the intellectual distinction is research vs. advocacy, I’d say the funding gap is sharpest at the c3/c4 boundary, and that’s where an individual donor’s marginal dollar is most counterfactually valuable—other than perhaps donations to PACs/specific politicians.
I’m also uncertain about the relative value of lobbying vs. other forms of advocacy. My intuition is that direct lobbying is underfunded relative to its impact, partly because the structural barriers to funding it are so high, but I’d love to see more analysis on this.
All the c3 orgs I mentioned are funding-constrained (except perhaps MIRI). AI x-risk advocacy is unpopular among big funders (with a couple exceptions like SFF), which means both the c3s and c4s are funding-constrained. At least that’s true currently; I’m not sure if that will still be true a year from now.
I agree with you about the structural funding disadvantage to c4s, but empirically it doesn’t look obviously important for driving funding constraints.
> I’m also uncertain about the relative value of lobbying vs. other forms of advocacy. My intuition is that direct lobbying is underfunded relative to its impact, partly because the structural barriers to funding it are so high, but I’d love to see more analysis on this.
Yeah, this is a difficult question; I don’t know. Another problem is that lobbying is opaque, so it’s harder to tell who’s doing a good job. (ControlAI is my favorite lobbying org because they write a lot about what they do and I like what they write. But also, I reached out to them and they said they were not accepting donations. Edit: they’re only accepting donations of $100K or more.)
Thanks Michael! Will be doing more research into the c3s you listed, as I’m working on compiling a complete set of funding recommendations for a future post—if you have recs on things to read, would be useful :)
Re: ControlAI, seems that they’re currently fundraising, but only accepting gifts of $100k+. Have updated my post to reflect this.
I missed that, thanks for calling it out!
Since I’m biased on the matter, I will start by linking the posts I’ve written:
Where I Am Donating in 2024 – see the comments too, there were some disagreements with my reasoning
AI Safety Landscape & Strategic Gaps
Where I Am Donating in 2025
Eric Neyman and Zach Stein-Perlman write recommendations on AI risk advocacy. Most of their work is non-public, but Eric wrote Consider donating to Alex Bores, author of the RAISE Act.
Zvi wrote The Big Nonprofits Post [2024] and 2025. He’s a grantmaker for SFF, which is my favorite of the big grantmakers.
Coefficient Giving and Longview Philanthropy often write about their grantmaking decisions (although not in as much detail as I’d like). I tend to have some pretty big disagreements with them, but they’re still worth reading.
Most reasoning on grantmaking/donations happens in private, so there’s not a whole lot to read. If you broaden the question to writings on general strategy (not just donations), there’s a ton of stuff worth reading. I will just link two that align best with my personal views:
No Winners – Q&A on why an international halt on AI development should be the goal
A Narrow Path – ControlAI’s long-term plan for avoiding extinction