Why People Use Burner Accounts: A Commentary on Blacklists, the EA “Inner Circle”, and Funding

I wanted to offer a different perspective on this post on why people use burner accounts, from someone who has used one previously. This is not intended to be a rebuttal of the arguments made there. It is meant to add to the public discourse, and it made more sense as a separate post than as a comment.

I hope that any upvotes/downvotes are given based on the effort you think I put into offering a different perspective, rather than on whether you agree/disagree with my comments (NB: I think there should be separate upvote/downvote and agree/disagree buttons for forum posts too).

Disclaimer

Note that...

  • I used a burner account in early 2021 after finding myself unhappy with the EA Community.

  • I’ve been visiting the EA Forum weekly since then and still occasionally go to EAGs.

  • I work at an EA organisation that receives significant funding from Open Philanthropy.

Reasons for Using Burners

The two biggest reasons for using burners are the potential operation of blacklists and funding. I’ll use them together (with anecdotes and some evidence) to make an overarching point at the end, so read all the way through.

Blacklists

  • Unfortunately, some EA groups and organisations use blacklists (this seems more common in the Bay Area).

    • Note that this is difficult to prove as...

    • I’d have to directly ask them, “Are you using a blacklist? I’ve heard X rumour that seems to suggest this”, and they’re very unlikely to say yes (if they are) as it’s not in their interests.

    • I don’t want to be seen as a “troublemaker” by suggesting an organisation is using a blacklist, even when I have strong reason to believe they are. If they do operate a blacklist, I’d likely be blacklisted from events for being a “troublemaker”.

  • [Anecdote suggesting the existence of a blacklist removed on request]

  • People have been told that they were supposedly blacklisted by organisers for being “epistemically weak” and not “truth-seeking” enough.

    • These are common terms used amongst Bay Area rationalists and even some funders.

    • Strong Personal Opinion: I think the problem here is rationality. It provides a camouflage of formalism, dignity, and an intellectual high ground for when you want to be an absolute asshole, justify contrarian views, and quickly dismiss other people’s opinions. As an example of a contrarian view: thinking diversity is not important (as many Bay Area rationalists do) when most of Western society and the media think it is.

      • Justifying this strong personal opinion with 20+ anecdotes would take a post on its own. I may write this in the future if there’s enough interest, and I can do so without the risk of doxing myself by accidentally de-anonymizing details.

      • Previously, I’ve pushed some rationalists on why they thought someone wasn’t “truth-seeking” enough or was “epistemically weak”. Around half the time, they couldn’t give a clear answer, which makes me believe they use these as buzzwords to camouflage the fact that they don’t like someone for whatever reason.

Funding

  • Another claim that is hard to prove is that ‘there is/​has been an intermingling of funding and romantic relationships’. This becomes more complicated with the prevalence of polyamory.

    • I realised this to be disturbingly common about two years ago, but I chose not to speak up for fear of being blacklisted.

    • I’ve been inspired by courageous people making posts on this (with burner accounts). I didn’t have the courage to write a post when I first noticed this two years ago, nor in the many times I’ve noticed it since.

    • A rumour about a senior program officer at Open Philanthropy being in a metamour relationship with a grantee has previously been ‘verified’ on the Forum.

  • There seems to be an inner circle of funders and grantees (predominantly in the Bay Area) where the grantees often don’t need to write grant applications and can just ask for money (often retroactively).

    • Note that this is also hard to prove. I could email Open Philanthropy asking this, but my organisation receives significant funding from them, and I don’t want to be seen as a “troublemaker”. I like to think they don’t operate a blacklist, but even if there’s a 1-20% chance they do, questioning a grant only to be later put on a blacklist with my organisation being defunded is not in my interests, given I have to make a living.

      • This touches on a wider point about having your entire life wrapped up in EA: your relationships (professional, personal, and romantic), your professional identity, and your personal identity, all while also needing to make a living. When this is true for hundreds of people in a tight-knit community with regular conferences and meetups, it leads to strange dynamics and decision-making.

    • Therefore, I feel comfortable questioning these grants using burner accounts.

      • I’ll do some of this now.

      • Despite Holden pausing “most new longtermist funding commitments” in November 2022, only to later unpause them in January 2023, the Atlas Fellowship (which falls under longtermism) received $1.8 million in December 2022. (Note the use of “most”; this suggests to me that the same rules don’t apply to everyone, as you’ll see below.)

        • I find this problematic as the Atlas Fellowship shares the same offices as Open Philanthropy in the Bay Area. The offices are called Constellation.

          • In the Forum post linked on the word “Constellation”, it says, “Constellation is run by Redwood Research. Aside from the Redwood team, which is about 30 people, they have a coworking space that is more targeted at organisations rather than independent researchers or smaller projects. Thus, Constellation hosts staff from Open Phil, ARC, the FTX Future Fund, CEA, AI Impacts, Atlas Fellowship, MIRI, Lightcone, Alvea, and GCP. Access to Constellation is typically more limited than Lightcone. Currently (as of July 2022) there is no application form, and they are mostly focused on supporting members from the organisations in the space.”

            • Note that all these organisations (besides the FTX Future Fund) receive significant funding from Open Philanthropy. This includes Redwood Research.

            • The fact that there is no application form seems to add to the cliqueyness, and to the possibility of an “inner circle” of funders and grantees who hold each other less accountable.

          • I have visited Constellation before, and all three individuals in question in Rumour 4 of this post currently work there regularly or have regularly worked there in the past.

            • I didn’t realise until this post that a Senior Program Officer at Open Philanthropy is married to the CTO of Redwood Research. I find this disturbing as Redwood Research received $10.7m from Open Philanthropy without mention of the grant investigator.

              • (Encrypted in rot13) Ol Fravbe Cebtenz Bssvpre ng Bcra Cuvynaguebcl, V’z ersreevat gb Pynver Mnory. Ol PGB bs Erqjbbq Erfrnepu, V’z ersreevat gb Ohpx Fpuyrtrevf.

              • More widely, I’d be interested in how power dynamics work at Open Philanthropy. Even if the grant investigator for Redwood Research is one of the other Program Officers at Open Philanthropy, does the Senior Program Officer (who, remember, is married to the CTO of Redwood Research) have to sign off on the grant amount/decision that their subordinate makes?

              • I imagine there are power dynamics where a program officer wants a promotion, doesn’t want to risk being fired, or wouldn’t want to disappoint their superior (who is married to the CTO of Redwood Research) by approving a smaller amount for her husband’s organisation than was requested.

              • Things become significantly more complicated once you throw polyamory into the mix.
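For readers unfamiliar with it, rot13 (used in the encoded bullet above) is a simple letter-substitution cipher that shifts each letter 13 places, so applying it twice returns the original text. A minimal sketch of decoding it with Python's built-in codec (using a neutral example string rather than the encoded bullet above):

```python
import codecs

# rot13 is self-inverse: decoding and encoding are the same operation.
ciphertext = "Uryyb, Jbeyq"
plaintext = codecs.decode(ciphertext, "rot13")
print(plaintext)  # Hello, World

# Applying rot13 again round-trips back to the ciphertext.
assert codecs.encode(plaintext, "rot13") == ciphertext
```

The same codec works on any rot13-encoded string, including the one in the bullet above.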

    • By “ask for money (often retroactively)”, I am referring to the grant made to the Future Forum (a conference held Aug 4 − 7, 2022).

      • What is true is that the Future Forum was kicked out of its advertised venue (the Neogenesis Group House) due to noise complaints from neighbours (an attorney showed up on the driveway and told everyone to leave). The problem was that this happened on Day 1 of the conference; Day 2 was still held in the group house, but afterwards the volunteers had to work through the night (reportedly with no breaks) to set up a new venue for Days 3 and 4.

        • From the volunteers, I’ve heard that the Future Forum’s organisers were so disorganised that anyone from the CEA Events Team still in the Bay Area after EAG SF (July 29–31, 2022) had to step in to clean up their mess.

        • Cleaning up their mess included securing a new venue at the last minute (which was very expensive), which took them into large debt, after which they were reportedly bailed out by Open Philanthropy (retroactively).

          • They could have been bailed out because the organisers were on good terms with the funders.

          • Whilst I can’t verify this, I believe it is true, as I’ve seen instances of smaller grants (often in the $1,000s) where a well-connected grantee will spend the money, go into debt, and ask a funder to bail them out instead of applying for a grant to begin with. This is a bad culture.

      • NB: this grant also doesn’t have a grant investigator listed. I think all OP grants should list their grant investigators, with an indication of who made the final call and what percentage of time each grantmaker spent on the application (which shouldn’t be hard to do with most time-tracking software).

My Overall Point

Ultimately, my overall point is that one reason for using a burner account (as in my case) is that if you don’t belong to the “inner circle” of funders and grantees, then I believe different rules apply to you. If you want to join that inner circle, you had better not question grants by directly emailing OP. And once you’re inside the inner circle, if you want to criticise grants, you must use a burner account or risk being de-funded or blacklisted. If you ask why you were blacklisted, the real reason of “we don’t like you” or “you’re a troublemaker” will be camouflaged as you being “epistemically weak” or “not truth-seeking enough.”

Edit 1: I edited the beginning of this post as per this comment.

Edit 2: Retracted Future Forum statement because of this comment.

Edit 3: Anecdote removed on request.