Thoughts on my relationship to EA (and please donate to PauseAI US)

Hey, EAs. It feels a little strange to address you through the EA Forum since I stopped considering myself one of you. For those who don’t know, I was an EA leader for a decade, organizing at Harvard and working at Rethink Priorities, before pursuing PauseAI forced me to quit my job and eventually give up all the comforts of a community I loved. I stopped commenting on the forum (unless I have to clear up org stuff) after the mods reprimanded me for being upset about many of you being the problem. (That same week I had a curated post. lol, lmao.) I didn’t want to lose the high-karma account, and I felt like I had basically been told I can’t make my case here if it makes people feel bad, so I’ve mostly gone dormant.

Still, it seems like PauseAI and EA have a lot of common cause and should be working together, right? Alas, no. The main reason I didn’t participate in the donation election this year was that my Dad was dying and I didn’t have time. It’s not that I abstained on principle or anything. I would have done it with more time, and I was invited to do so, but it wasn’t a high enough fundraising or org priority to do under time constraints, and I’m going to say a little about why. It reveals something about the issues between my work and EA in general.

My experience in last year’s donation election was spending tons of time answering very demanding questions, doing really well in the rankings, and still not getting the election money. We did get some separate donations because those donors read the discussion, but it wasn’t high-yield. It was worth my time financially, but maybe not counterfactually. I had hoped it would help EAs understand PauseAI and be worth it for that reason. But I no longer think it is worth my time or energy to try to convince you.

I think, if it survives at all, EA will eventually split into a pro-AI-industry camp, which basically becomes openly bad under the fig leaf of Abundance or Singularitarianism, and an anti-AI-industry camp, which will be majority advocacy of the type we’re pioneering at PauseAI. I think the only meaningful technical safety work is going to come after capabilities are paused, with actual external regulatory power. The current narrative (that, for example, Anthropic wishes it didn’t have to build what it’s building) is riddled with holes, and it will snap. I wish I could make you see this, because it seems like you should care, but you’re actually the hardest people to convince because you’re the most invested in the broken narrative.

I don’t think talking with you on this forum, with your abstruse culture and rules, is the way to bring EA’s heart back to the right place, where the reward is not status or friends or working on your favorite technology, but helping people and animals. What will change your minds is Pause sentiment becoming more popular, not me beating you in argument. I’m already beating you, and you just define the game so that the conclusion of moving toward advocacy can’t win. Or you interpret justified, straightforward disapproval of your complicity in the greatest risk of our time as simply too impolite.

You’ve lost the plot, you’re tedious to deal with, and the ROI on talking to you just isn’t there. I’ve taken a lot of emotional damage from having a group that was kind of my extended family force me to choose between them and doing the right thing. I can’t disentangle my work from EA, but I’m giving myself a lot of space.

I think you’re using specific demands for rigor (rigor feels virtuous!) to avoid thinking about whether Pause is the right option for yourselves. Like, if you can make me seem incompetent, that means you can forget about AI Safety advocacy or international cooperation or Pause as avenues. Case in point: EAs wouldn’t come to protests, and then they pointed to my protests being small as a reason to dismiss Pause as a policy or messaging strategy!

I believe many of you will eventually side with me, and you will be welcomed into PauseAI whether you identify with EA by then or not. But PauseAI is not an EA organization. (That ship sailed in 2023, when Open Phil gave me nonsensical reasons for why they wouldn’t explore Pause as a policy or advocacy avenue, which I now know they did because they serve Anthropic’s interests. They just lied to me and tried to make me feel stupid so that I would drop it. Many of you took your cues from them.) Something EAs tend to misunderstand is that PauseAI is not an ideology: it’s a grassroots coalition of all types of people who think a Pause would be good. PauseAI is for everyone. This is one of my favorite things about it.

EA used to be about doing the most good wherever it could be found, and that used to take people a lot of places: spreadsheets, yes, but also RCT field trials and pledge drives and giving games. Now it’s about working at an AI lab or wishing you could work at an AI lab. Most of you can’t do that. (Which is great, because it’s evil! It is literally being the problem.) Aren’t you bored of hanging around here watching AI developments like a spectator sport and fanboying for the cool kids? Some of you are too young to remember what it was like to be surrounded by people whose reward was helping people, and who were excited and honored to have the chance to do unglamorous high-impact work others were unwilling to do. It was inspiring (spiritual food and a shining example to me), and that is the energy in PauseAI now. Because no one is here for an in-group or glory or an easy life. We’re here to protect the world.

—————————————————————

Our projected budget this year (with no major upgrades, in line with the previous year, which produced the results on our flyer) is $440k. Of course, with all of the infrastructure we’ve built and the community of volunteers we amassed in 2025, our 2026 results will go much further. My 2026 raise goal is $1M, both because we need runway and so that we have the security to hire beyond our three-person staff.

Donate here instantly: https://www.zeffy.com/en-US/donation-form/donate-to-help-pause-ai

If you’re serious about donating over $1k, you can contact me to talk in more detail.