Big EA and Its Infinite Money Printer


I started the EA student group at the University of Illinois (UIUC) this semester, and after talking to about a dozen student group organizers at the Boston and London conferences, I was encouraged by a few people to post about what our group is like. This is probably the first of three posts; I’ll follow it with a post about how I’ve used and adapted the intro materials and a post about ideology/evangelism/AI safety/longtermism. [1]

TLDR:

  • Our new student group’s relationship to EA is closely related to the funding situation of the CEA/OpenPhil/FTX Fund/etc. constellation, which we’ve dubbed “Big EA.” One of my primary activities is assuring people that it’s not suspect that Big EA provides us with so much money.

  • There’s a dynamic where the typical club member is sympathetic to EA but would not do direct work except for the money, and might even see this behavior as a kind of “grifting” even though neither I nor Big EA would.

  • But I think this is fine and that concerns about EA spending described in recent posts are overrated.

The Prometheus Science Bowl

I’m involved in the national high school/college quiz bowl circuit. In January, a friend who had worked with me on a business running online quiz bowl tournaments after the COVID lockdowns heard about an online science bowl tournament run by Devansh, with no entry fee and $28,000 in prizes. Quiz bowl tournaments don’t typically offer cash prizes, and science bowl (which I’m not well acquainted with) only offers prizes like this at the national championships.

Having run many online quiz bowl tournaments, I scheduled a call to offer some advice on preventing cheating and some leads on finding volunteer readers or teams, only to discover that the readers wouldn’t be volunteering but would instead make $25/hr (quiz bowl tournaments usually give a stipend for lunch at most). I told Devansh that he could double the field-size goal with this kind of money, and I remember replying to one of his questions with “normally yes, but you have an infinite money printer.”

The staff payment was probably perceived by the UIUC guys as decadent spending, since the tournament could have reached its staffing goals at half the wage, and the OpenPhil investment as liberal, since the tournament director was a high school student.

The way it was explained to me is that OpenPhil values a switch to an AI safety research career at +$20M in expected value, so a $100k science bowl tournament only has to have a 0.5% chance of convincing at least one of the ~800 participants to make a career change in order to be net positive. By this logic, Devansh clearly exceeded reasonable metrics, and the tournament ran very well. It was just a novelty for our first interaction with EA to be this hilarious leviathan with more money than God.
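As a sanity check on that arithmetic (my own back-of-the-envelope restatement, not OpenPhil’s actual model), the break-even probability is just the tournament’s cost divided by the value assigned to a single career switch:

$$p_{\text{break-even}} = \frac{\$100\text{k}}{\$20\text{M}} = 0.005 = 0.5\%$$

Anything above a 0.5% chance of producing at least one career change across the ~800 participants makes the grant net positive under that valuation.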

EAGx Boston

The contract work for the tournament brought EA back to the front of my mind. A friend and I had considered starting an EA group here in fall 2020, and I even contacted CEA and some nearby student groups, but we were ultimately occupied with other things.

I learned that CEA funds student groups for things like food, books, posters, and T-shirts. I had trouble seeing the value proposition of books and posters, so I just asked for money to buy food at our weekly meetings. Enthusiasm for the club was low enough that no one would attend the meetings without free food, or if I moved them to a building on the other side of campus. A typical view was “This EA stuff is good (except AI safety) and someone should be doing it, but not me, I have my career at Amazon/Nvidia/academia to attend to.”

Then I brought 7 members of the group to the Boston conference, and their view of EA might actually have gone down slightly overall from exposure to the AI safety stuff.[2] At the conference they also heard about the retreat stuff, which was universally considered decadent.

After the conference, a bunch of them effectively said, “Arjun, if EA is so talent-constrained and not funding-constrained, you should go and find some technical projects and get grants from Big EA to pay us $100/hr or whatever to complete them.” We had been thinking about an idea like this for some time, and I think its EV overwhelms what CEA spent funding our pizza and Boston trips (which kept club activity at the level needed to produce the idea).

So amid stories of undergrads mostly hanging out, or of professionals doing networking unrelated to EA, I can cite this as an example of the money delivering +EV.

“Grifting”

Anecdotally, I’ve spoken to several organisers who aren’t convinced of longtermism but default to following the money nevertheless. I’ve even heard (joking?) conversations about whether it’s worth ‘pretending’ to be EA for the free trip.

The message is out. There’s easy money to be had. And the vultures are coming. On many internet circles, there’s been a worrying tone. “You should apply for [insert EA grant], all I had to do was pretend to care about x, and I got $$!” Or, “I’m not even an EA, but I can pretend, as getting a 10k grant is a good instrumental goal towards [insert-poor-life-goals-here]” . . . Or, “All you have to do is mouth the words community building and you get thrown bags of money.”

The weird part of this dynamic is that members of the club might think of themselves as grifting even though neither I nor Big EA do:

Toy example: Big EA pays EA UIUC $100/hr to build some project that we think is altruistically worth less than $50/hr but that Big EA thinks is worth $200/hr.
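Spelled out with the toy numbers (my restatement of the example, nothing new): we’d be overpaid by at least $50/hr relative to our own valuation of the work, which is what feels like grifting, while Big EA sees a $100/hr surplus, which is why it doesn’t look that way from the funder’s side.

$$\underbrace{\$100/\text{hr} - \$50/\text{hr}}_{\text{apparent overpayment, from our view}} \geq \$50/\text{hr}, \qquad \underbrace{\$200/\text{hr} - \$100/\text{hr}}_{\text{perceived surplus, from Big EA's view}} = \$100/\text{hr}$$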

Real example: I’m planning to spend a lot of time competing for one of the $100,000 blog prizes. I’m not fully convinced that the altruistic value of the contestants’ output justifies funding the prize (maybe it does), but when you add the personal value of expected earnings from a chance at winning, plus improvements to my writing skill or network or whatever else, I’m happy to compete.

There’s nothing wrong with this (it’s how trade normally works), but it’s weird because the Big EA position on expected value is usually the less commonsensical one, so new people wandering into the club, or spectating from the vicinity, encounter a bunch of people apparently profiting wherever their skills and interests align with a wealthy benefactor who is spending money in silly ways. This is bad optics.

You could divide the grifter spectrum into groups:

  • True believers think it’s worthwhile to effect the most utility possible and are personally interested in pursuing this by convincing Big EA to help them work on the most important problems.

    • This can be divided into people who are confident in their own evaluations of what’s most effective and try to convince Big EA of their conclusions, and people who trust Big EA to support the most important causes/ideas without always agreeing with its reasoning.

  • Coincidental allies are people with a personal interest that coincidentally overlaps with areas that Big EA is interested in funding. A member of our club often talks about his pet interest in securing the electrical grid or something and was pleased to report that FTX thinks this is important.[3]

  • EA sympathizers think it’s worthwhile to effect the most utility possible, but the money, status, and intrinsic interest of direct work aren’t enough to make them pursue it personally; they will, however, do the work for money. This describes many members of EA UIUC, who mostly have technical backgrounds.

  • Literal grifters try to get money by misrepresenting their beliefs or making fraudulent claims about the work they’re doing.

And I think all the non-grifters are fine. I only see two downsides:

1. Since coincidental allies and sympathizers have motivations other than EA, new discoveries about the nature of impact could leave their plans incompatible with EA goals.

For example, if Big EA decides that electrical grid stuff is no longer worth funding but that building a university is, my electrical grid friend isn’t going to work on the university even if he has a comparative advantage there from an altruistic perspective. This might imply that funding for general career skills is best spent on true believers, but not much else.

2. Literal grifters could masquerade as sympathizers.

This would create barriers to legitimate funding, since grantmakers would have to adjust for the risk of grifting and honest grant-seekers might have to engage in wasteful signaling games. Things like this certainly take place:

I can speak of one EA institution, which I will not name, that suffers from this. Math and cognitive science majors can get a little too far in EA circles just by mumbling something about AI Safety, and not delivering any actual interfacing with the literature or the community.

  • quinn, commenting on “The Vultures Are Circling”

Beyond deliberate deception, motivated reasoning is always easy. As someone without any strong position on longtermism yet, I get a very strong impression that it would be in my material and social interest to support it.

That said, grantmakers’ reputation for discerning genuine applicants from frauds seems good, and my prior is that people tend to be over-concerned with fraud in cases like these (e.g., in American politics it’s widely believed that people fixate on welfare fraud or voter fraud beyond their actual importance). Tightening the purse strings too far is also a failure mode.

My impression of the Boston and London conferences is that most of the people there are “coincidental allies” by this framing. They’d be in their chosen careers regardless of whether those careers fell under an EA cause area.

An even better model is probably something like Eric Neyman’s in “My bargain with the EA Machine.”

I think that, realistically, for the UIUC club members (and myself) it’s better to think in terms of thresholds of money, status, and career capital, especially career capital that hedges against a risk like EA collapsing.

A moment at Boston that struck me as exciting: I had found some crypto people to talk to a few UIUC guys who took an interest, and when one of them mentioned a mutual acquaintance undertaking a project to “mint new EA crypto millionaires,” another immediately replied, “what’s the point?” I was reminded of the Sam Altman quote:

A common criticism of people in Silicon Valley … are people who say some version of the following sentence. My life’s work is to build rockets, so … I’m going to make a hundred million dollars in the next … four years, trading crypto currency with my crypto hedge fund, because I don’t want to think about the money problem anymore, and then I’m going to build rockets. They never do either. I believe if these people would just pick one thing or the other, you know, I’m obsessed with money, so I’m going to go make a hundred million dollars, however long that takes, but with a long enough time horizon that’s reasonable. I’m just going to build rockets right now. I believe they would succeed at either.

In fact, I pitch EA to my friends mostly by listing cause areas in which Big EA takes an interest and asking whether they’ve had any career aspirations or moonshot ideas in those areas. Ultimately I assume that most people have, somewhere in their hearts, a romantic notion of a life spent in service of mankind that they give up in exchange for material security or status, or out of aversion to risk, and a huge part of EA’s value is presenting the opportunity to overcome those obstacles and just build philanthropic rockets.

  1. ^

    A bunch of the organizers I spoke to were interested just in my secondhand accounts of what other organizers had described in earlier conversations. There’s probably an audience for narratives of what you’ve been up to, even if you don’t think of what you’re doing as very interesting.

  2. ^

    Will be treated in the next post.

  3. ^

    He’ll probably post about this in the near future, but I can give you his contact information now if you take an interest in the topic.