Open EA Global
I think EA Global should be open access. No admissions process. Whoever wants to go can.
I’m very grateful for the work that everyone does to put together EA Global. I know this would add much more work for them. I know it is easy for me, a person who doesn’t do the work now and won’t have to do the extra work, to say extra work should be done to make it bigger.
But 1,500 people attended last EAG. Compare this to the 10,000 people at the last American Psychiatric Association conference, or the 13,000 at NeurIPS. EAG isn’t small because we haven’t discovered large-conference-holding technology. It’s small as a design choice. When I talk to people involved, they say they want to project an exclusive atmosphere, or make sure that promising people can find and network with each other.
I think this is a bad tradeoff.
...because it makes people upset
This comment (seen on Kerry Vaughan’s Twitter) hit me hard:
A friend describes volunteering at EA Global for several years. Then one year they were told that not only was their help not needed, but they weren’t impressive enough to be allowed admission at all. Then later something went wrong and the organizers begged them to come and help after all. I am not sure that they became less committed to EA because of the experience, but based on the look of delight in their eyes when they described rejecting the organizers’ plea, it wouldn’t surprise me if they did.
Not everyone rejected from EAG feels vengeful. Some people feel miserable. This year I came across the Very Serious Guide To Surviving EAG FOMO:
Part of me worries that, despite its name, it may not really be Very Serious…
...but you can learn a lot about what people are thinking by what they joke about, and I think a lot of EAs are sad because they can’t go to EAG.
...because you can’t identify promising people
In early 2020 Kelsey Piper and I gave a talk to an EA student group. Most of the people there were young overachievers who had their entire lives planned out, people working on optimizing which research labs they would intern at in which order throughout their early 20s. They expected us to have useful tips on how to do this.
Meanwhile, in my early 20s, I was making $20,000/year as an intro-level English teacher at a Japanese conglomerate that went bankrupt six months after I joined. In her early 20s, Kelsey was taking leave from college for mental health reasons and babysitting her friends’ kid for room and board. If either of us had been in the student group, we would have been the least promising of the lot. And here we were, being asked to advise! I mumbled something about optionality or something, but the real lesson I took away from this is that I don’t trust anyone to identify promising people reliably.
...because people will refuse to apply out of scrupulosity
I do this.
I’m not a very good conference attendee. Faced with the challenge of getting up early on a Saturday to go to San Francisco, I drag my feet and show up an hour late. After a few talks and meetings, I’m exhausted and go home early. I’m unlikely to change my career based on anything anyone says at EA Global, and I don’t have any special wisdom that would convince other people to change theirs.
So when I consider applying to EAG, I ask myself whether it’s worth taking up a slot that would otherwise go to some bright-eyed college student who has been dreaming of going to EAG for years and is going to consider it the highlight of their life. Then I realize I can’t justify bumping that college student, and don’t apply.
I used to think I was the only person who felt this way. But a few weeks ago, I brought it up in a group of five people, and two of them said they had also stopped applying to EAG, for similar reasons. I would judge both of them to be very bright and dedicated people, exactly the sort who I think the conference leadership are trying to catch.
In retrospect, “EAs are very scrupulous and sensitive to replaceability arguments” is a predictable failure mode. I think there could be hundreds of people in this category, including some of the people who would benefit most from attending.
...because of Goodhart’s Law
If you only accept the most promising people, then you’ll only get the people who most legibly conform to your current model of what’s promising. But EA is forever searching for “Cause X” and for paradigm-shifting ideas. If you only let people whose work fits the current paradigm sit at the table, you’re guaranteed not to get these.
At the 2017 EAG, I attended some kind of reception dinner with wine or cocktails or something. Seated across the table from me was a man who wasn’t drinking and who seemed angry about the whole thing. He turned out to be a recovering alcoholic turned anti-alcohol activist. He knew nobody was going to pass Prohibition II or anything; he just wanted to lessen social pressure to drink and prevent alcohol from being the default option—ie no cocktail hours. He was able to rattle off some pretty impressive studies about the number of QALYs alcohol was costing and why he thought that reducing social expectations of drinking would be an effective intervention. I didn’t end up convinced that this beat out bednets or long-termism, but his argument has stuck with me years later and influenced the way I approach social events.
This guy was a working-class recovering alcoholic who didn’t fit the “promising mathematically gifted youngster” model—but that is the single conversation I think about most from that weekend, and ever since then I’ve taken ideas about “class diversity” and “diversity of ideas” much more seriously.
(even though the last thing we need is for one more category of food/drink to get banned from EA conferences)
...because man does not live by networking alone
In the Facebook threads discussing this topic, supporters of the current process have pushed back: EA Global is a networking event. It should be optimized for making the networking go smoothly, which means keeping out the people who don’t want to network or don’t have anything to network about. People who feel bad about not being invited are making some sort of category error. Just because you don’t have much to network about doesn’t make you a bad person!
On the other hand, the conference is called “EA Global” and is universally billed as the place where EAs meet one another, learn more about the movement, and have a good time together. Everyone getting urged not to worry because it’s just about networking has to spend the weekend watching all their friends say stuff like this:
Some people want to go to EA Global to network. Some people want to learn more about EA and see whether it’s right for them. Some people want to update themselves on the state of the movement and learn about the latest ideas and causes. Some people want to throw themselves into the whirlwind and see if serendipity makes anything interesting happen. Some people want to see their friends and party.
All of these people are valid. Even the last group, the people who just want to see friends and party, are valid. EA spends I-don’t-even-know-how-many millions of dollars on community-building each year. And here are people who really want to participate in a great EA event, one that could change their lives and re-energize them, basically the community trying to build itself. And we’re telling them no?
...because you can have your cake and eat it too
There ought to be places for elites to hang out with other elites. There ought to be ways for the most promising people to network with each other. I just don’t think these have to be all of EA Global.
For example, what if the conference itself was easy to attend, but the networking app was exclusive? People who wanted to network could apply, the 1,500 most promising could get access to the app, and they could network with each other, same as they do now. Everyone else could just go to the talks or network among themselves.
Or what if EA Global was easy to attend, but there were other conferences—the Special AI Conference, the Special Global Health Conference—that were more selective? Maybe this would even be more useful, since the Global Health people probably don’t gain much from interacting with the AI people, and vice versa.
Some people on Facebook worried that the organizers want to offer travel reimbursement to attendees, but couldn’t afford to scale that up to 10,000 travel reimbursement packages. So why not have 10,000 attendees, who can apply for 1,500 travel reimbursement packages that the organizers award based on a combination of need and talent? Why not make ordinary attendees pay a little extra, and subsidize even more travel reimbursements?
I don’t know; there are probably other factors I’m not aware of. Still, it would surprise me if, all things considered, the EA movement were worse off for giving thousands of extra really dedicated people the chance to attend its main conference each year.
At the closing ceremony of EA Global 2017, Will MacAskill urged attendees to “keep EA weird.”
I don’t know if we are living up to that. Some of the people who get accepted are plenty weird. Still, I can’t help thinking we are failing to fully execute that vision.
Hi Scott — I work for CEA as the lead on EA Global and wanted to jump in here.
Really appreciate the post — having a larger, more open EA event is something we’ve thought about for a while and are still considering.
I think there are real trade-offs here. An event that’s more appealing to some people is more off-putting to others, and we’re trying to get the best balance we can. We’ve tried different things over the years, which can lead to some confusion (since people remember messaging from years ago) but also gives us some data about what worked well and badly when we’ve tried more open or more exclusive events.
We’ve asked people’s opinion on this. When we’ve polled our advisors including leaders from various EA organizations, they’ve favored more selective events. In our most recent feedback surveys, we’ve asked attendees whether they think we should have more attendees. For SF 2022, 34% said we should increase the number, 53% said it should stay the same, and 14% said it should be lower. Obviously there’s selection bias here since these are the people who got in, though.
To your “...because people will refuse to apply out of scrupulosity” point — I want to clarify that this isn’t how our admissions process works, and neither you nor anyone else we accept would be bumping anyone out of a spot. We simply have a specific bar for admissions and everyone above that bar gets admitted (though previous comms have unfortunately mentioned or implied capacity limits). This is why the events have been getting larger as the community grows.
I wanted to outline the case for having an admissions process and limiting the size of the event, which is roughly:
We host different events for different purposes. EAG is intended as a more selective event for people who mostly already have a lot of context on EA and are taking significant action based on EA principles. The EAGx conference series (which will serve nearly 5000 unique attendees across the different events this year) is intended to reach a broader, newer-to-EA audience.
EAG is primarily a networking event, as one-to-one conversations are consistently reported to be the most valuable experiences for attendees. I think there’s less value in very new folks having such conversations — a lot of the time they’re better off learning more about EA and EA cause areas first (similar to how I should probably learn how ML works before I go to an ML conference).
Very involved and engaged EAs might be less eager to come to EAG if the event is not particularly selective. (This is a thing we sometimes get complaints about but it’s hard for people to voice this opinion publicly, because it can sound elitist). These are precisely the kinds of people we most need to come — they are the most in-demand people that attendees want to talk to (because they can offer mentorship, job opportunities, etc.).
We think that some of our most promising newer attendees would also have a worse experience if the event were fully open.
Using an admissions process lets us try to screen out applicants who have caused problems at past events or who seem likely to cause problems.
I don’t think this is really what your post is about, but I wanted to clarify: EAG exists to make the world a better place, rather than serve the EA community or make EAs happy. This unfortunately sometimes means EAs will be sad due to decisions we’ve made — though if this results in the world being a worse place overall, then we’ve clearly made a mistake.
I agree it’s hard to identify promising people reliably, but I don’t think it’s impossible to get some signal here. I do think our admissions process could improve though, and we adjust the process every year. We’re currently in the process of revisiting the application/admissions process with the aim of identifying promising people more reliably — though of course it’s hard to make this perfect.
“The conference is called “EA Global” and is universally billed as the place where EAs meet one another, learn more about the movement, and have a good time together.” It’s possible we should rename the event, and I agree this confusion and reputation is problematic, but I would like to clarify that we don’t define the event like this anywhere (though perhaps we used to in previous years). It’s now explicitly described as an event with a high bar for highly engaged EAs (see here). We also have the EAGx conference series, which is more introductory and has a lower bar for admissions. If someone is excited to learn more about EA, they’d likely be better suited to an EAGx event (and they’d be more likely to get accepted, too).
Having different levels of access to the conference app seems like it might worsen rather than improve the problem of some people feeling like second-class citizens.
Regarding the specific volunteer case you mentioned, I’m not exactly sure what the details were here and it’s not something anyone on the team recalls. It does sound like something that easily could have happened — just perhaps a few years ago. FWIW, as of 2019, all volunteers had to meet the general bar for admission.
I think I would also be in favor of other more specialized conferences, such as those on AI safety or global health, but these are unlikely to be things we’ll have capacity to run at the moment (though I encourage people to apply for CEA event support and run events like these).
Thanks again for the post, hope these points are helpful!
FWIW I generally agree with Eli’s reply here. I think maybe EAG should 2x or 3x in size, but I’d lobby for it to not be fully open.
I suspect that 2x or 3x will happen naturally within a year, given that there is a bar on fit for the event rather than a bar on quantity. People who aren’t getting in this year will surely, if they are dedicated EAs, have more to list on their EAG applications next year.
Thanks for commenting, Eli.
I’m a bit confused by one of your points here. You say: “I want to clarify that this isn’t how our admissions process works, and neither you nor anyone else we accept would be bumping anyone out of a spot”. OK, cool.
However, when I received my acceptance email to EAG it included the words “If you find that you can’t make it to the event after all, please let us know so that we can give your spot to another applicant.”
That sure sounds like a request that you make when you have a limited number of spots and accepting one person means bumping another.
To be clear, I think it’s completely reasonable to have a set number of places—logistics are a thing, and planning an event for an unknown number of people is extremely challenging. I’m just surprised by your statement that it doesn’t work that way.
I also want to make a side note that I strongly believe that making EA fun is important. The movement asks people to give away huge amounts of money, reorient their whole careers, and dedicate themselves to changing the world. Those are big asks! It’s very easy for people to just not do them!
It’s hard to get people to voluntarily do even small, easy things when they feel unappreciated or excluded. I agree that making EAs happy is not and should not be a terminal value but it absolutely should be an instrumental value.
Hi Nathan, thanks for flagging this. What’s going on here is just that our comms/email templates were old, confusing, and out of date — I’ve now amended our acceptance email to remove the implication of capacity limits. It is helpful for people to let us know if they aren’t coming (for example, so that we can get accurate numbers for catering), but it’s not the case that people would be bumping each other in this way (for now at least — it’s possible that we get a weirdly large number of strong applications for a future EAG and have to turn away people due to capacity limits, I just don’t expect this to be the case any time soon).
I’ve also provided more context about capacity in my response to Jeff’s comment on this thread.
This is an EAG DC email from 5 days ago. The term “release” suggests to me that someone else can now use it.
I think this is probably new wording, but I think it still implies the thing you were trying to avoid.
Thanks for the flag — I’ve edited the wording now!
Maybe the conference could be renamed or its description amended to say “for EA leaders”. Then people who get rejected would take it less personally that they weren’t accepted.
Thanks for your response. I agree that the goal should be trying to hold the conference in a way that’s best for the world and for EA’s goals. If I were to frame my argument more formally, it would be something like—suppose that you reject 1000 people per year (I have no idea if this is close to the right number). 5% get either angry or discouraged and drop out of EA. Another 5% leave EA on their own for unrelated reasons, but would have stayed if they had gone to the conference because of some good experience they had there. So my totally made up Fermi estimate is that we lose 100 people from EA each time we run a closed conference. Are the benefits of the closed conference great enough to compensate for that?
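(For concreteness, here is that made-up Fermi estimate as a few lines of Python; every number below is an illustrative placeholder, not real data.)

```python
# A minimal sketch of the back-of-the-envelope estimate above.
# All figures are made-up placeholders, not real data.
rejected_per_year = 1000          # hypothetical number of rejections
p_leave_after_rejection = 0.05    # get angry or discouraged and drop out of EA
p_kept_by_attending = 0.05        # would have left anyway, but a good conference experience retains them

people_lost = rejected_per_year * (p_leave_after_rejection + p_kept_by_attending)
print(f"Rough estimate of people lost per closed conference: {people_lost:.0f}")  # -> 100
```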
I’m not sure, because I still don’t understand what those benefits are. I mentioned in the post that I’d be in favor of continuing to have a high admissions bar for the networking app (or maybe just sorting networkers by promise level). You write that:
I think maybe our crux is that I don’t understand this impulse, beyond the networking thing I mentioned above. Is the concern that the unpromising people will force promising people into boring conversations and take up too much of their time? That they’ll disrupt talks?
My understanding is that people also sometimes get rejected from EAGx, and that there is no open-admission conference. Is this correct?
Hi Scott — it’s hard to talk about these things publicly, but yes a big concern of opening up the conference is that attendees’ time won’t end up spent on the most valuable conversations they could be having. I also worry that a two-tiered app system would cause more tension and hurt feelings than it would prevent. A lot of conversations aren’t scheduled through the app but happen serendipitously throughout the event. (Of the things you mentioned, I’m not particularly worried about attendees disrupting talks.)
We’ve thought a fair bit about the “how costly is rejection” question, and think there’s a real but relatively small discouragement effect where rejected applicants are less likely to re-apply to our events (or engage with EA in general). In an internal report we wrote recently about this, we felt more concerned about whether rejection makes it less likely for people to apply in the first place (but we think we can reduce this with clearer comms about the admissions bar).
It is true that people can get rejected from EAGx’s, but the bar is lower — and often people get rejected from EAGx’s because some of these events are for specific regions (such as for EAs based in India). It’s correct that there is currently no open admission conference.
For what it’s worth, I still don’t feel like I understand CEA’s model of how having extra people present hurts the most prestigious attendees.
If you are (say) a plant-based meat expert, you are already surrounded by many AI researchers, epidemiologists, developmental economists, biosecurity analysts, community-builders, PR people, journalists, anti-factory-farm activists, et cetera. You are probably going to have to plan your conversations pretty deliberately to stick to people within your same field, or who you are likely to have interesting things to say to. If the conference were twice as big, or three times, and filled with eg people who weren’t quite sure yet what they wanted to do with their careers, would that be the difference between that expert having a high chance of productive serendipitous conversations vs. not?
I also don’t get this. I can’t help thinking about the Inner Ring essay by C.S. Lewis. I hope that’s not what’s happening.
I’m not sure I agree with Scott that EAG should be open access, but since you mention this as a concern, I thought I’d mention that, yep, I haven’t bothered applying to EAG for several years. The discussion around EAG the last few years made it seem incredibly obvious that I wouldn’t be wanted anyway, so I didn’t even bother weighing the pros and cons of trying to attend. Now that I actually think about it, I’m not at all sure that I should have been so convinced I couldn’t get in. I attended EAG in 2018 as a volunteer because I was told that the organizers couldn’t find anyone more qualified to run a discussion group about EA and religion, and I still have my 2018 EAG name tag that labels me a “Speaker”. In terms of more recent involvement, I won a second prize in the recent EA forum writing contest, I’m theoretically a mod for the EA Corner discord server, and I’ve been working on putting together an essay about the most effective ways of preventing miscarriages for people who place high credence on the possibility of unborn children having moral worth (though I’m still working on contacting various people involved in that work and getting cost estimates of their operations, so it’s not ready yet).
...but I figured that Everyone Knows that you don’t get a spot unless you’re professionally involved in EA direct work, have been involved in one of the various formal EA fellowships, or have a bunch of personal brand recognition, so I never got to the point of weighing the pros and cons of attendance; I assumed that I couldn’t attend, and immediately turned my attention to being okay with not being an important EA member like some of my friends are.
Not sure this is anyone’s fault, or whether I would have wanted to go to EAG even if I could—I assume there’s an attendance fee, and I might not have wanted to shell out—but I saw your comment and wanted to mention it as an experience that some people do have right now.
For what it’s worth, my experience is that this is entirely not the case. Most people from my country who have gone to EAG were not any of those.
There’s a ticket fee, but you can choose to pay a discounted amount, and if you receive financial aid to come to the conference, the ticket fee is waived.
I don’t know if you still want to attend EAGs, but I hope this tells you that you can definitely apply.
This doesn’t seem right to me? For example:
In setting the bar I expect you consider, among other things, the desired conference size. For example, if you got a lot of “this conference felt too big” feedback, you’d probably respond by raising the bar for the next one.
If someone applies late, I would expect whether you’re able to make room for them would depend on whether you have capacity.
In setting the bar, desired conference size is not really a factor in our decision making, though perhaps it should be (and it possibly will be if the events get much larger) — we mostly just think about what types of applicants would be a good fit for the event. We seem to receive more feedback about the types of attendees that come (or don’t come) rather than feedback about the raw size of the conference, and so we mostly act on the former. If we started receiving lots of “this conference felt too big” feedback, then yes we would possibly act on that, but that hasn’t really happened yet and I don’t expect it to in the near future.
For EAG SF 2022, it looked like we might hit capacity limits for the venue, but we actually never needed to turn people away because of capacity. For the next few EAGs we’ve selected venues that can expand to be much larger than our expected needs (e.g., for our next bay area conference, a venue that could fit at least 2500 people if we really needed), so I’m not expecting us to need to think about capacity limits in this way in the near future.
To clarify, I’m referring to the EA Global conferences only. EAGx admissions and processes are handled differently between events, and different organizers may have different requirements or setups (such as perhaps actually needing to reject people for capacity reasons).
This directly contradicts this December 2019 EA Forum post about EAG admissions, which has the following as a reply to “Why not make [EAG] larger?” (emphasis mine):
I’m not too familiar with EA Global 2016 but I’ll note that we did ask attendees whether they felt the conference was too big at EA Global: SF 2022 and they generally thought the size of the event was fine.
Since 2016, we introduced Swapcard (our networking app), which changes the dynamic somewhat and allows people to more easily find relevant people to meet (and hence make people feel less lost in the shuffle). We’ve also introduced more production staff and overall support since 2016, meaning we’ve gotten larger venues with spaces that are more suitable for bigger audiences.
I was definitely apprehensive going to a larger EAG more recently, so was pleasantly surprised at how Swapcard enabled the conference to be scaled up while maintaining the high rate of useful connections of a smaller more curated conference.
Hi Eli, Thanks for your detailed replies here!
This year I was waitlisted for EAG London, and then promptly rejected. I didn’t assign it much importance at the time because I assumed it was caused by capacity limits and I had applied relatively late.
How does having a waitlist make sense if every applicant is considered by a uniform bar independent of capacity?
The rejection email did say I could update my application, but I didn’t understand it at the time and only noticed it after reading this thread. I think maybe the communication around this could be improved, although I’m only one data point. The main point is that it only appears after a bunch of other text and can be easily ignored when you already know that the main message is “you got rejected”.
Hi Guy — thanks for the feedback. I’m not entirely sure what happened re your London application, as I wasn’t on the team then. However we didn’t really use the waitlist for SF and I don’t expect us to use it for the foreseeable future. We’ve since updated a lot of our email templates, so I’m hoping the issue you mentioned is at least partially resolved.
Thanks, glad to hear that.
If nothing else presumably at some point venues have fire code capacity limits, though maybe past conferences have been small enough these haven’t been binding.
Thanks a lot for taking the time to elaborate!
Two points of feedback on ways EA Global is currently presented more like an event for the EA community than a selective networking event:
This is the headline description from https://www.eaglobal.org:
From this description, I personally take away more of Scott’s caricature than the sense that EAG is intended as a high-bar networking event:
“for the effective altruism community” → makes it sound like it’s a community event, which I’d expect to be inclusive
“community members who already have a solid understanding” → does not sound particularly exclusive to me
“make connections, discuss ideas, develop skills” → sounds somewhat vague and general to me, and “make connections” sounds to me like “connect to your fellow EAs”
Secondly, the first picture series on the website also makes it look to me more like a community event and less like a professional networking event. Half of them are photos of large groups and speakers. Only one of the pictures seems to be a 1-to-1 conversation.
Thanks for the thoughts here — a lot of what’s going on is just that our website is pretty out of date, and we’re in the process of refreshing/updating it currently. We’re also going to make some slight edits to our front page ~now to make things a bit clearer.
Thanks, Max! I agree that’s confusing.
As Eli said, we are planning to revamp our website.
In the meantime, I’ve edited the homepage to be more accurate / to match the information on our FAQ page and admissions page to say:
“EA Global is designed for people who have a solid understanding of the main concepts of effective altruism, and who are making decisions and taking significant actions based on them.
EA Global conferences are not the only events for people interested in effective altruism! EAGx conferences are locally-organized conferences designed primarily for people:
Familiar with the core ideas of effective altruism
Interested in learning more about what to do
From the region or country where the conference is taking place (or living there)
See our FAQ page for more information.”
The edits should show up shortly if they haven’t already.
From my understanding, this new description seems fairly misleading, given the following EA Forum comments:
From Zach Stein-Perlman:
From Kevin Kuruc:
From Lauren Maria:
Cool, thanks for the extremely quick responses! :)
In my experience on hiring committees, this is actually quite difficult to do. I think in practice it is much more common to operate with two clear bars: one above which everyone gets hired and one below which nobody gets hired. The ones in the middle get a bunch of situational criteria applied to them and it’s pretty impossible to keep “we’re feeling tight on space” out of the equation there.
Whether or not the admissions process actually is capacity-constrained and has replacement effects, most people will assume that it is, because:
The vast majority of other admissions processes are capacity-constrained.
Past and current messaging says so.
Here is a quote from the admissions FAQ https://www.eaglobal.org/admissions/: “The most common reason for rejecting someone is that we have limited space and think other applicants would get more out of the conference. We don’t have any particular concerns about applicants we reject for this reason, but simply need to save the space for attendees who might be a better fit for the event.”
Thanks for the flag here — I’ve amended the language on the admissions FAQ accordingly (note that we’re planning on revamping our website soon anyway). I’m not aware of any other messaging that implies replacement effects, but LMK if you’re aware of anything.
A) Does this represent a change from previous years? Previous comms have gestured at a desire to get a certain mixture of credentials, including beginners. This is also consistent with private comms and my personal experience.
B) It’s pretty surprising that Austin, a current founder of a startup that received $1M in EA-related funding from FTX regrants, would be below that bar!
Maybe you are saying that there is a bar above which you will get in, but below which you may or may not get in.
I think lack of clarity and mixed signals around this stuff might contribute unnecessarily to hurt feelings.
A) Yes we had different admissions standards a few years ago. I agree that’s confusing and I think we could have done better communication around the admissions standards. I think our FAQ page and admissions page are the most up-to-date resources.
B) I can’t comment in too much depth on other people’s admissions, but I’ll note that Austin was accepted into SF and DC 22 after updating his application.
It’s currently the case that there’s a particular bar for which we’ll admit people, though it’s not an exact science and we make each judgement call on its own — but regardless, capacity limits will not be a reason people get rejected (at least for the next few EAGs). I’m not entirely sure what you mean here, but it’s not the case that there’s a separate bar for which we’ll sometimes let people in depending on capacity. Apologies for any confusion caused here!
Thanks for clarifying
So some other EAs and I just talked to the person in the tweet who got rejected.
As far as I could tell, they have a stellar “EA resume” and were even encouraged to apply by leaders in their EA community.
Why were they rejected? What is this “specific bar for admissions and everyone above that bar gets admitted” and why are so many applying and then surprised when they don’t meet this bar? Or is my perception off here?
This isn’t an accusation. I’m in the camp that thinks the conference should not be a free-for-all. But I can’t figure out why the person in the tweet would be rejected from EAG. And as a community organiser it would be great if I can know how best to help the bright-eyed enthusiastic young promising students in my community get into EAG.
See also my other comment asking if the rejection process is possibly too opaque. Maybe that’s the real issue here. Imagine if every person who got rejected knew exactly why and what they could do to not get rejected next time. I almost feel like we wouldn’t be having this discussion because far fewer people would be upset.
I can see why people are confused by this situation. I don’t think it would be appropriate for me to give more detail publicly — it’s our policy to not discuss the specifics of people’s applications with other people besides them.
We do want people who aren’t sure if they’ll get in, including students, to apply! But we suggest they should also consider applying to their nearest EAGx and not only to EAG.
We don’t plan to tell people a recipe for getting accepted beyond the overall info we share with everyone about the event and the application process, and info about getting more involved in events like EAGx and local groups for people who have been away from the community for a while or who aren’t yet that involved.
In some cases, the things that would need to change aren’t realistic to change. In other cases, telling people essentially what we want to hear would largely defeat the purpose of those aspects of the application.
We know people are concerned and confused sometimes about EAG rejections. Sometimes there are genuine uncertainties. In our experience, in many of the cases where people have been upset, there were clear reasons to reject them that we cannot share based on background or behavior, and we would recommend keeping that hypothesis in mind.
Somewhat of a tangential question but what is the point of making EAGx region-specific? If these are the only events with a relatively low bar of entry, why are we not letting people attend them until one happens to come along near where they live? Without this restriction I could easily see EAGx solving most of the problems Scott is bringing up with EAG.
[I run the EAGx conference series]
- I think there are significant benefits to local coordination
- It’s very expensive to fly everyone from around the world to one location
- I think attending 1-2 conferences a year is probably the right amount (I’m aiming to eventually have 1 conference per region per year, with lots of overlap with other regions)
EAGxVirtual does kind of solve the problem Scott is bringing up, and I’m very excited about it.
Just flagging that, in my view, the goal of having just one EAGx per region and making each EAGx regionally focused, taking very few people from outside the region, is really bad. Reasons for this are the effects on network topology, and subsequently on the core/periphery dynamic.
I find the argument about the cost of “flying everyone from around the world to one location” particularly puzzling, because this is not what happens by default: even if you don’t try to push events to being regional at all, they naturally are, just because people will choose the event which is more conveniently located closer to them. So it’s not like everyone flying everywhere all the time (which may be the experience of the events team, but not of typical participants).
EAGx events are primarily for people in that region, but not exclusively. We do invite some speakers and contributors from outside the region and some others.
Sorry, could you spell these out? I don’t know what you mean.
Yes, that’s what I expect too, though we do see lots of people apply for conferences very far away from them.
Sorry, I think “flying everyone from around the world to one location” was a bit of a strawman. I expect they’ll mostly look like EAGxPrague did this year, which had a strong continental European swing but wasn’t exclusively for continental Europeans.
There is an EAGxVirtual, which happens online.
I hope they are friendly for the less western timezones as well!
I’m not answering on their behalf, only guessing.
On the one hand, I was accepted to two EAGx’s this year, both in Europe (and could eventually attend one of them). I’m not sure if Israel counts as “in the region” of Oxford or Prague, but I suspect it doesn’t. So some EAGx are probably not really region-specific.
On the other hand, some of these conferences were (or will be) in places which are far away from EA hubs, like India, Latin America, or Kenya. And in that case there’s some trade-off between connecting the local community to other EAs, and strengthening the local community itself. If you let too many people from EA hubs come (and they will), you’ll get another EAG that just happens to be further away, leaving behind the local community that’s being built.
Just to clarify: EAGx conferences earlier this year have been accepting some applicants from outside their region when the applicant’s region doesn’t have a conference yet.
Oxford and Prague were primarily for the UK and continental Europe respectively, but accepted some people from e.g. Israel because there isn’t a conference in Israel (yet...) :)
There were people from Israel at EAGx Prague. There were also people from further away.
Eli, has CEA looked at their admission rates according to age and gender? Are you willing to share this data? It would be interesting to see if there is any systemic bias in the admissions process.
I think EA currently is much more likely to fail to achieve most of its goals by ending up with a culture that is ill-suited for its aims, being unable to change direction when new information comes in, and generally succumbing to the problems of large communities and other forms of organization (like, as you mentioned, the community behind NeurIPS, which is currently on track to be an unstoppable behemoth racing towards human extinction that I so desperately wish was trying to be smaller and better coordinated).
I think EA Global admissions is one of the few places where we can apply steering on how EA is growing and what kind of culture we are developing, and giving this up seems like a cost, without particularly strong commensurate benefits.
On a more personal level, I do want to be clear that I am glad about having a bigger EA Global this year, but I would probably also just stop attending an open-invite EA Global since I don’t expect it would really share my culture or be selected for people I would really want to be around. I think this year’s EA Global came pretty close to exhausting my ability to be thrown into a large group of people with a quite different culture and differing priorities, and I expect less selection would cause me to hit that limit quite reliably.
I do think there are potentially ways to address many of the problems you list in this post by changing the admissions process, which I do think is currently pretty far from perfect (in particular, I would like to increase the number of people who don’t need to apply to attend because they are part of some group or have some obvious signal that means they should pass the bar).
I wonder if this is overstated.
I feel like we have a number of other strong channels, including:
funding
frank discussions on EAF and other public fora
private discussions/coordination by leaders
Yeah, I agree these are all candidates, though I think these are all actually somewhat downstream of general movement growth:
Funding has been getting a lot less centralized and, in general, funding is a lot more flush, at least in the longtermist space, so I think this has been serving much less as a thing that meaningfully steers the culture
Frank discussion on the EA Forum I do think is quite important, though I also think that outside of a few people like Nuno we see very little actual critique of projects, and I think a lot of the people who tend to have written critical things have stopped in the last few years (Larks is stopping his AI Alignment review, I am no longer writing long LTFF writeups, and broadly I have a feeling that there is a lot more mincing of words on the forum than a few years ago), so while I do think this is quite important, I also think it’s becoming a weaker force
Leadership has also been growing and I think leadership is now actually distributed enough and large enough that I feel like this isn’t really doing a ton in terms of shaping culture and changing community growth. I feel far from getting to consensus with people at 80k on how they are thinking about community growth, and my sense is everyone is just really busy and very few people among the de-facto leadership think of themselves as actually responsible for shaping community culture and growth (at least I got a relatively strong feeling of powerlessness from people trying to shape community growth at the latest Coordination Forum, though I might also be projecting my own feelings too much here, so take this with a grain of salt)
I actually think the strongest channel we currently have for shaping culture and growth are things like Lightcone and Constellation, which are more consistent spaces with boundaries that allow some people to maintain more of a walled garden, though I also have complicated feelings about the dynamics here.
[epistemic status: idle uninformed speculation]
I basically agree with this comment, which makes me like the idea of an open EAG. Closed EAG is theoretically good for shaping culture by selecting good participants, but CEA faces a knowledge problem for which people are “good culture fits”, and it moves towards promoting some weird homogenization thing. I have some inchoate instinct that a more decentralized network of smaller walled gardens can preserve and signal good parts of culture, while avoiding frustrating “CEA as kingmaker” dynamics, and allowing an open EAG to introduce novelty into the system.
I’m kind of surprised by this—if EAG was open, what sort of people do you think would come, and in what way would they not share your culture?
Like as an example: this year I didn’t get into EAG when I first applied, and then I reapplied when I got an internship at an EA org, and got in. This is understandable—I didn’t have very strong ‘legible’ signals of EA engagement apart from the internship, arguably. But also, like, my culture/who I am as a person clearly didn’t change much between the two applications! So I guess my expectation for who would come if it were open are:
-people who are engaged in their local EA communities and ‘into’ EA but haven’t shown legible enough signs of ‘promise’ to get accepted under the status quo
-people like the person screenshotted in the post, who have changed their career on EA principles, who maybe wouldn’t get much out of networking, but who are excited to talk to others who share their values
I’m mostly sympathetic to this point.
Your last paragraph doesn’t make sense to me; I don’t see how changing admissions in that way would help with the issues Scott discusses.
A few more thoughts:
(EDIT: It seems, from a comment above, that current culture-steering is trying to encourage perspective and stuff, so what I’m irked about here isn’t really an issue.) Limiting EA Globals to only people that fit into certain cultures could severely limit perspective/diversity, and in bad cases could make EA as “cult”-y as some people want to believe it is. I think this is probably only true for some misguided attempts to steer culture, and in fact you can steer culture through admissions to encourage things like diversity/perspective. Selecting attendees based on “people Oliver Habryka would hang out with outside the conference” does seem to be sort of narrow, but I’m guessing this wasn’t what you actually want admissions to look like?
In response to “I would like to increase the number of people who don’t need to apply to attend because they are part of some group or have some obvious signal that means they should pass the bar”: I guess it depends what you mean here, but I agree with Zach in that it doesn’t seem obvious why this would actually help (although I’d be interested in the specific details of what you mean). I mean, some ways of doing this could make people upset, since the outside perspective could be something like certain “inner circles” of EA having easy backdoor access to EAGs. This could also make even very impressive/high-impact EAs feel excluded if they’re not a part of said groups. Seems to me like there are tons of ways this could go wrong.
I think this means a good chunk of people don’t have to apply, and then those people don’t have to deal with the costs of applying. I do think it doesn’t address most of the things mentioned as problems in the OP.
Which chunks of people?
I strong upvoted since this makes sense to me, but is EA global admissions actually making an effort to steer culture in any way? I was under the impression that EAG admissions was mostly just based on engagement with EA and how impressive someone is, which doesn’t really seem to qualify as steering culture. This certainly seems to line up with some things said in this post about people being rejected for not being impressive.
I think EA Global admissions is currently checking for something like “does this person have a very different impression of what EA is about than most of the current leadership?”, which I think has a pretty substantial effect on culture.
It is hard to talk about admissions in too much detail publicly. I agree that we want to make sure attendees have an understanding of EA, but we also want to avoid the “guessing the teacher’s password” problem. We also check for reasoning skills/epistemics. In other words, some people don’t know much about EA principles, but manage to exhibit good reasoning skills by making the case for a clear plan, or by explaining that they are uncertain and laying out which options they are thinking about.
Is there any attempt to increase diversity (of experience, perspectives, gender, race) through admissions?
I’m asking because this kind of idea was made in another comment here, and sounds good to me, but contrasts with your description.
I also have a bit of a hard time understanding this. If there are some objective criteria that you use to assess those other things you mentioned, then yeah, I wouldn’t want people to just start optimizing for them and ruin the process. But so far from CEA staff comments here, it sounds much more like a judgement call that you can’t really game.
Like, from my perspective as a musician, if I wanted to get into music school I know what the basic criteria in an audition would be, but they’re subjective and optimizing for them is almost identical to “training to be a good musician”, so there’s no problem in making them publicly known.
Thanks! Makes sense.
Not opinionating on the general point, but:
IIRC, Kelsey was in fact the president of the Stanford EA student group, and I do not think she would’ve been voted “least likely to succeed” by the other members.
Quite. I was in that Stanford EA group, I thought Kelsey was obviously very promising and I think the rest of us did too, including when she was taking a leave of absence.
A bit of a tangent, but these comments strike me as indicative that EA is a very small community in many ways.
Yeah it’s very small, especially for people working professionally in subfields.
Also early Stanford EA had a very good hit rate, like I think there were <10 regular members, and that group included Claire, Kelsey, Caroline and Michael.
And Buck Shlegeris and Nate Thomas and Eitan Fischer and Adam Scherlis (though Buck didn’t attend Stanford and just hung out with us because he liked us). I wish I knew how to replicate whatever we were smoking back then. I’ve tried a couple times but it’s a hard act to follow.
Fwiw, I gave Scott permission to mention the above; I think by some metrics of promisingness as an EA I was obviously a promising EA even when I was also failing out of college, and in particular my skillset is public communications, which means people could directly evaluate my EA-promisingness via my blog posts even when by legible societal metrics of success I was a bit of a mess.
This comment did not age well.
Agree that was a weird example.
Other people around the group (e.g. many of the non-Stanford people who sometimes came by & worked at tech companies) are better examples. Several weren’t obviously promising at the time, but are doing good work now.
I had a pretty painful experience where I was in a pretty promising position in my career, already pretty involved in EA, and seeking direct work opportunities as a software developer and entrepreneur. I was rejected from EAG twice in a row while my partner, a newbie who just wanted to attend for fun (which I support!!!), was admitted both times. I definitely felt resentful and jealous in ways that I would say I coped with successfully, but wow did it feel like the whole thing was lame and unnecessary.
I felt rejected from EA at large, and yeah, I do think my life plans have adjusted in response. I know there were many such cases! At the height of my involvement I was a very devoted EA, and really believed in giving as much as I could bear (time etc. included).
This level of devotion juxtaposed with being turned away from even hanging out with people is quite a shock. I think the high-devotion version of my life would be quite fulfilling and beautiful, and I got into EA seeking a community for that, but never found it. EAG admissions is a pretty central example of this mismatch to me.
I’m really sorry to hear this. It is concerning to hear that being rejected from EAG made you feel like you were “turned away from even hanging out with people.” This is not our intention, and I’d be happy to chat with you about other resources and opportunities for in-person meetings with other EAs.
We also get things wrong sometimes so I’m sad to hear you feel like our decision impacted your trajectory away from a highly devoted version of your life. The EAG admissions process is not intended to evaluate you as a person, it is for determining whether you would be a fit for a particular event. It seems possible that you applied at a time when we were experimenting with a policy that prioritized people who were not yet highly engaged but were in a position to become highly engaged (I’m guessing this because you say your “newbie” partner got in). Our admissions process has changed over time and currently we consider things like engagement with EA, epistemics, and ability to gain things from the event or provide mentorship to others (for example, if people are currently making a decision and have a plan to use conversations at the conference to influence them).
As an example of the imperfection of the process, EA Global once rejected an application from someone who then went on to work at Open Philanthropy less than 2 years later. One change we have made since 2020 is to not outright reject sparse applications, but rather send a message saying that we did not have adequate information to approve an application, and suggest the applicant update their application if there is anything more they think we should know.
Thanks for your comment and I’m sorry to hear how our admissions process impacted you.
Damn, that really sucks. :| Thanks for sharing.
Adding my three related cents:
I personally would very likely have felt really sad about being rejected from EAG as well, and knowing this played a role in me not being particularly excited about applying in the past.
A good friend of mine who’s like a role model highly-engaged EA was told a year or so ago by a very senior EA (they knew each other well-ish) that he shouldn’t take for granted being admitted to EAG, which IIRC felt pretty bad for him, as if he’s still not doing “enough”.
Another good friend of mine from my local chapter got rejected from one of the main local community events in Germany due to capacity limitations a few years ago, and that felt very bad to me and IIRC he said he was at least a little sad.
(IIRC the admission process afterwards switched to being fairly inclusive and adding a lottery in case of capacity limitations.)
I generally directionally agree with Eli Nathan and Habryka’s responses. I also weak-downvoted this post (though felt borderline about that), for two reasons.
(1) I would have preferred a post that tried harder to even-handedly discuss and weigh up upsides and downsides, whereas this mostly highlighted upsides of expansion, and (2) I think it’s generally easier to publicly call for increased inclusivity than to publicly defend greater selectivity (the former will generally structurally have more advocates and defenders). In that context I feel worse about (1) and wish Scott had handled that asymmetry better.
But I wouldn’t have downvoted if this had been written by someone new to the community; I hold Scott to a higher standard, and I’m pretty uncertain about the right policy with respect to voting differently in response to the same content on that basis.
That comment hit me hard too.
In general, it hurts to make people feel bad, and if I were optimizing the event for making myself/EAs feel good it would look different.
I had an hour long call with the person who made that post and was able to connect them with resources and explain the admissions process and considerations that go into it in a way that seemed to help. I think we could do a better job of explaining these things publicly and I think we should do that.
We should at least try this once and see what happens
We haven't tried a fully open event, but our 2016 event was closer to open than our more recent events and came with various drawbacks.
Hm, from an organiser perspective (I’m organising EAGxBerlin), even just trying this once seems costly.
- Organising a conference with 10,000 people takes a huge amount of work and funding. Would you trade in five 1,000-person conferences to have one 10,000-person conference?
- If this event fails, up to 10,000 people will have had a bad experience, and even if just 10% of them get upset, that's 1,000 people upset with EA.
(There are obviously a lot more costs and benefits which I currently lack time to write up; I just wanted to point out that it may be more costly than you'd think to try out such a large conference.)
I think EAGxVirtual 2020 came close to this: about 1,400 attendees, a decent chunk of whom had only recently heard about EA.
+100 on this. I think the screening processes for these conferences overweight legible, in-groupy accomplishments like organizing an EA group in your local town/college, and underweight regular impressive people like startup founders who are EA-curious, and this is really, really bad for movement diversity.
Yes, I might be salty because I was rejected from both EAG London and Future Forum this year…
But I also think the bar for introducing EA-curious friends to EA is higher, because there isn't a cool thing I can invite them to. Anime conventions such as Anime Expo or Crunchyroll Expo are the opposite of this (everyone is welcome, bring your friends, have a good time), and it works out quite well for keeping people interested in the subject.
I like the idea of an EA expo as a different thing!
Though I wouldn’t be surprised if a lot of more established EAs didn’t want to go and people felt sad about how EA expo isn’t legit enough.
I think anime/gaming expos/conventions might be a good example, actually: at those events, the density of high-quality people matters less than simply being open for anyone who's interested to come. Organizers will try to have speakers and guests lined up who are established/legit, but 98% of the people visiting are just fans of anime who want to talk to other fans.
Notably, it’s not where industry experts converge to do productive work on creating things, or do 1:1s; but they sure do take advantage of cons and expos to market their new work to audiences. By analogy, a much larger EA Expo would have the advantage of promoting the newest ideas to a wider subset of the movement.
Plus, you get really cool emergent dynamics when the audience size is 10x'd. For example, if there are 1-2 people in 1,000 who enjoy creating EA art, then at 10,000 people you can have 10-20 of them get together, meet up, and talk to each other.
Super seconded! I have had a couple of EA-curious friends (who would be a great fit for EA, very passionate and smart and dedicated to positively impacting the world) ask if I would recommend attending a conference, and had to awkwardly explain that although I loved my experience at EAG, they would probably not get in. I was able to recommend EAGx as a more accessible alternative, but the American EAGx conferences are pretty student-oriented, still have illegible admissions criteria, and wouldn’t necessarily present the benefits of EAG as efficiently to post-grads. There NEEDS to be an accessible event I can invite people to. ETA: I think it’s fine to have events with different levels of accessibility, it’s just frustrating that the current combo doesn’t really provide a good entry point/thing to invite people to. Making admissions criteria more legible, especially for EAGx, could help address this.
I quite like the idea of an EAG: Open, but presumably as a complement, rather than replacement, to the current networking-focused EAGlobal.
One thing that seems missing from the EA ecosystem is a single place where there are talks which convey new information to lots of interested, relevant people in one go, and those ideas can be discussed.
This used to happen at EAGlobal, but it doesn’t anymore because (for understandable reasons) the event is very networking focused, so talks basically got canned. I find it odd there’s now so little public discussion at the EA community’s flagship event. (The only major communication that happens is at the opening and closing ceremonies, and is (always?) done by Will. Will is great, but it would be great to have a diversity of messages and messengers.)
There is more content at EAGxs, but then only a fraction of people see those. I’ve realised I’m basically touring the world giving more-or-less the same talk so most people hear it once. In some ways, this is quite fun, but it’s also pretty inefficient. I’d prefer to give that talk once and then be able to move onto other topics.
The EA forum currently serves as the central place for discussion, but it’s not that widely used and stuff tends to disappear from view pretty fast. It certainly doesn’t do the same thing as TED-style big talks do for communicating important ideas.
I think the people responsible for EA Global admissions (including Amy Labenz, Eli Nathan, and others) have added a bunch of value for me over the years by making it more likely that a conversation or meeting with somebody at EA Global who I don't already know will end up being productive. Making admissions decisions at EAG (and being the public face of an exclusive admissions policy) sounds like a really thankless job, and I know a bunch of the people involved end up having to make decisions that make them pretty sad because they think it's best for the world. I mostly just wanted to express some appreciation for them and to mention that I've benefitted from their work, since that sort of thing feels uncomfortable to say out loud and so is probably under-expressed.
One positive effect of selective admissions that I don’t often see discussed is that it makes me more likely to take meetings with folks I don’t already know. I’d guess that this increases the accessibility of EA leaders to a bunch of community members.
Fwiw, I’ve sometimes gotten overambitious with the number of meetings I take at EAG and ended up socially exhausted enough to be noticeably less productive for several days afterwards. This is a big enough cost that I’ve skipped some years. So, I think in the past I’ve probably been on the margin where if the people at EAG had not been selected for being people I could be helpful to, I’d have been less likely to go.
The Future Forum had a much worse version of this, e.g. shifting language in its web pitch, missing its self-set application review deadlines by 4-6 weeks, and then denying an unknown (but suspected to be large) percentage of otherwise impressive applicants in relevant fields. I mention this here because, due to Future Forum's proximity in date to EAG SF, the uncertainty around FF acceptance led me to delay travel arrangements, cancel pre-EAG meetings, and nearly cancel the trip to SF entirely.
Another impact was that two high-achieving colleagues on the cusp of joining EA came to believe that FF used its nomination form like a multi-level marketing ploy to “tell us who you know” and had no intention of sincerely evaluating most applications. I don’t share this view but wanted to share a case study of how tone shifts / missed comms deadlines during event applications can lead to the worst possible thing being assumed.
Author of "How to Survive EAG: San Francisco FOMO" here. In fairness to EAG organizers, I want to clarify that I did not apply for EAG SF this year. While I was bummed not to go, I was on a digital sabbatical during the application period, so I do not put any sort of blame on EAG organizers, as they had nothing to do with why I didn't attend. I applied to two past EAGs and was accepted to both. While I support the general sentiment that it is a bummer to miss out on EAG, the post truly was written all in good fun, and I did actually have a great time putting it together.
Thanks for writing it, it resonated with me a lot :)
And I didn’t apply to SF either. I did apply (late?) to EAG London and was rejected, after attending the previous one. But I was accepted to EAGx and went to Prague and had good fun and productive meetings. So the bit of anger at the rejection from London was quickly forgotten, and I wouldn’t have remembered it at all if not for this post.
What’s a digital sabbatical?
Thanks for the comment! A digital sabbatical is basically a fancy-shmancy term for “I took a break from social media and from having an online presence and increased my focus on other projects.” I took a break from pretty much everything online, then added things back in gradually or limited certain sites or apps to strategically curate a better media/info diet.
Sounds amazing!
(I wrote a longer comment with thoughts about how this applies to me, but decided this wasn’t the right place for it.)
Thanks for the comment, Hayley! Btw, I loved seeing your dog Maple with the EAG swag in your original post, so cute 🥺 🐶
As I mentioned on one of those Facebook threads: at least don't bill the event as a global conference for EA people and then tell people, "no, you can't come." Call it maybe the EA Professionals Networking Event or something, which (a) makes it clear this is for networking and not the kind of academic conference people might be used to, and (b) implies this might be exclusive. But if you bill it as a global conference, then make it be like a global conference. And at the very least make it very clear that it's exclusive! Personally, I didn't notice any mention of exclusivity at all in any EA Global posts or advertising until I heard about people actually getting rejected and feeling bad about that.
My interpretation of the “Global” part in EAG is ‘from around the world’, not ‘everyone is invited’. E.g. for EAGxAustralia it seems like you’re much more likely to get accepted if you’re based in Australia or the Asia Pacific, because it’s about building the community there. But EA Global is about connecting people across these different communities, and doesn’t prioritise admissions based on geographical closeness.
Honestly I’m super confused why people perceive ‘EA Global’ as an inclusive-sounding name. Especially in contrast to ‘EAGx’, which evokes the TEDx vs TED contrast, where TEDxes have a much lower bar, are scrappier and more community based.
I mean, from the moment I first had to apply I felt it was exclusive, but only in the sense of "they select the people whom the conference would most help to increase their impact". So once I was rejected from one, I didn't feel offended. I did get accepted to the very first one I applied to, so maybe that's the reason. I was already a few years into the movement by then, though.
What are these FB threads that are being referenced out of curiosity? Feels very in-groupey.
It really does!
Big fan of your blog. Some quick counterpoints/counterarguments (that are not meant to be decisive):
Re: …because it makes people upset.
As others noted, the point of EA(G) is to have the maximum impact on moral patients overall, not to be welcoming to individual EAs or make EAs happy. I think it's not impossible that aiming for community happiness is a better proxy goal than aiming for impact directly (e.g.), but that would be quite surprising, at least to me, and should be explicitly argued for.
More to the point, I'm not convinced that having open EAGs will actually make people happier. To some degree, I read the exclusivity as just one salient thing for people to complain about or be upset by, and I expect that as long as we have rejections for reasons other than commitment, people will be similarly upset. I expect that with open EAGs, the goalposts will move and people will instead be upset about:
Getting rejected from directly important things like grants or jobs
Networking “tiers” within EAG
More illegible signals of status
Probably even EAF karma, downvotes, other social media stuff, etc.
I think it genuinely makes sense to be upset about these things. It's unfortunate, and I do think people's emotions matter. But ultimately we're here to reduce existential risk, end global poverty, stop factory farming, or do other important work, not primarily to make each other happy, especially during work hours
(with maybe a few exceptions, like if you’re an EA therapist or something).
Re: …because you can’t identify promising people.
I think you’re just wrong about the object-level point. Both you and Kelsey (and, I suspect, future analogues) were successful and high-potential in ways that are highly legible to EA-types.
However, I think your argument can basically be preserved by arguing either a) that people who are legibly impressive in non-EA ways (a la Austin's point) will be rejected, and that being legibly impressive in non-EA ways is a better predictor of future counterfactual impact than being impressive in EA ways, or b) that people who aren't legibly impressive will often end up having a huge impact later, so the processes are just really bad at discernment. So I think #1 isn't my true rejection.
My more central rejection is like, man, all selection processes are imperfect. That doesn't mean we shouldn't have selection processes at all. Otherwise the same argument applies to jobs and grants, and I think it's probably wrong for us to give grants and jobs without discernment.
More generally, I wish there were an attempt to grapple with both the costs and the benefits, rather than just the costs.
...because people will refuse to apply out of scrupulosity.
Yes, this is unfortunate. But again I would not guess that great people self-selecting out of something is particularly common.
(I think I might be unusually blind to this type of thing however. For example, I have nearly zero imposter syndrome and I’m given to understand that imposter syndrome is a common thing in EA).
I also think this can be ameliorated through better messaging.
...because of Goodhart’s Law
Oh man, I should write a longer post about this at some point, but in general I think EAs and rationalists are too prone to invoke "Goodhart's Law" as a magic curse that suggests something is maximally bad in the limit, without considering more empirically how useful or harmful something is in practice.
And there are a bunch of advantages to having people aim towards something (even if imperfect); in general I think we're better off with more BOTECs and more evaluations, rather than fewer.
(weak-moderate strength argument) "If you only let people whose work fits the current paradigm to sit at the table, you're guaranteed not to get these." To some degree I think there's a bit of an internal contradiction here: to the degree that our legible systems do a poor job of incentivizing independent thinkers, this is a somewhat self-correcting problem. We'd expect the best independent thinkers to be the least affected/damaged by our norms.
...because man does not live by networking alone
I actually don’t have a clear understanding of this point, so I feel not ready to argue against it.
I do think some EAs are looking for EA events/community mostly from a "vibes" angle, where they find being effectively altruistic very hard and want events that help with maintaining altruistic commitment. I have a lot of sympathy for this view, and I agree it is quite important.
...because you can have your cake and eat it too.
I suspect this will not solve most of the emotional problems people have (for reasons I mentioned in that section), or Goodhart’s Law, or the “can’t identify promising people” problem. Though I agree there’s a decent chance it can significantly ameliorate the “man doesn’t live by networking alone” problem and the “people will refuse to apply out of scrupulosity” problem.
In general, I thought this post was interesting and talked about a serious potential issue/mistake in our community, and I'm glad that this conversation has been started/re-ignited. I also appreciate the distinct sections, which make it easier to argue against specific points. However, in some ways I find the post one-sided, overly simplistic, and too rhetorical. I ended up neither upvoting nor downvoting it as a result.
You raise many good points, but I would like to respond to (not necessarily contradict) this sentiment. Of course you are right that those are the goals of the EA community. But by calling this whole thing a community, we cannot help but create certain implicit expectations. Namely, that I will not be treated simply as a means to an end, i.e. assessed and valued only by how promising I am, how much my counterfactual impact could be, or how much I could help an EA org. That's just being treated as an employee, which is fine for most people, as long as the employer does not call the whole enterprise a community.
Rather, it vaguely seems to me that people expect communities to reward and value their engaged members, and to consider the wellbeing of the members important in itself (and not just so the members can be, e.g., more productive).
I am not saying this fostering of community should happen in every EA context, or even at EA Globals (maybe a more local context would be more fitting). I am simply saying that if every actor just bluntly considers impact, and community involvement is rewarded nowhere, then people are likely, and also somewhat justified, to feel bitter about the whole "community" thing.
I'm curious, how was Scott Alexander "successful and high-potential in ways that are highly legible to EA-types" in his early 20s? I wouldn't be surprised at all if he was, but I'm just curious because I have little idea of what he was like back then. As far as I know, he started posting on LessWrong in 2009, at the age of 24 (and started Slate Star Codex four years later). I'm not sure if that is what you are counting as "early 20s," or if you are referring to his earlier work on LiveJournal, or perhaps on another platform that I'm not aware of. I've read very few (perhaps none) of his pre-2009 LJ posts, so I don't know how notable they were.
Oh hmm I might just be wrong here. Some quick points:
I didn't know Scott's exact age, and thought he was younger.
In particular, I thought this was written when he was younger (EDIT: than 25), but I couldn't figure out exactly when.
EA has more infrastructure/ability to discover great bloggers/would-be bloggers who are interested in EA-ish issues than we previously had.
I think it's easier to be recognized as an EA blogger than it was 5-10 years ago, though probably harder to "make it big" (since more of the low-hanging fruit in EA blogging has been plucked).
I think I wrote that piece in 2010 (based on timestamp on version I have saved, though I’m not 100% sure that’s the earliest draft). I would have been 25-26 then. I agree that’s the first EA-relevant thing I wrote.
See https://web.archive.org/web/20131230140344/http://squid314.livejournal.com/243765.html (Also I think the webpages you link to are from no later than 2008, and clustered up to November 2008.)
(The dead-child thing was almost certainly written in 2008.) (Edit: see https://web.archive.org/web/20131230140344/http://squid314.livejournal.com/243765.html.)
Thanks for finding this. Assuming he wrote this around the time that it was posted, he’d have been 24.
Maybe I’m just ignorant here, but where’s Scott in that link?
The quoted excerpt from the post, and the original “Dead Child Currency” post in general, is written by Scott.
The missing info for me was that Scott had yet another alias, as Denise kindly replied. I think the lesson learned is “If you have a good reason not to reveal your identity, at least stick to just one alias”.
Wouldn't be the lesson I'd take from this, but probably not that important! :)
Yvain is Scott’s old LW name.
Question: when do you think you will make a post about Goodhart's Law?
Sorry, the "should" is more a normative "I wish I could do this" than a prediction or promise.
Maybe a 15% chance I'll do this in the next two months?
Add my voice to the others who'd support an EACon with open registration (maybe with particular personae non gratae excluded; I would not support an event that made a big deal about "excluding nobody" or some such). Get some of the people who run successful science-fiction conventions, pick a relatively accessible and cheap location (Las Vegas, maybe?), have panels, invite merchants to bring EA-relevant or just weird merchandise.
This does sound like fun.
I actually agree that it would be good to try running an EAG:Open that is >3x bigger, with marketing, big-name speakers and an open invite list. But organising it would probably be >3x as much work, and <3x as valuable, so I don’t think it’s right to nag CEA into running it, nor should it replace current EAGs.
A few quick thoughts:
I always appreciate well-meaning discussion and thought this brought up some good points. That said, I overall don’t really agree with it.
It's a lot of work to organize an event for many people. In the last few years, total global attendance at EAGs (across all of them) seems to have grown by around an order of magnitude, from maybe ~500 around 3 years ago to maybe ~6k this year? My impression is that it's been correspondingly tricky to scale the CEA team in charge of this growth. I imagine specific proposals to "open EAGs" would look like some combination of charging a fair bit more for them and/or allowing 10k+ people at each event. This doesn't at all seem trivial. Maybe it would be easy to do a very minimalist/mediocre version of a huge event, but I imagine if that were done, people would find a lot of reasons to complain.
My personal proposal is that eventually it would be nice (assuming there are people available to do it) to try out essentially an "EA Open" with 5k-15k people. If this works, then rename "EA Open" to "EA Global", and continue having a smaller event like the current EA Global, but now named something more like "Super Boring Detailed EA Summit." This way the senior people could still have their event, and others would have some "EA" event to go to (one that's called "EA Global", if they care about that so much).
Even if a bigger conference isn’t set up, I think “EA Global” might be a mediocre name for the current conference. The harm caused by the resentment of people not getting invited to it might outweigh the benefits of making it seem more accessible to some who do get in, but wouldn’t have applied otherwise. The branding could likewise change to make the focus seem more professional/dedicated.
Some people treat “EAG” as “A professional venue to do work meetings”, and others treat it as “the place for the cool kids to be cool with each other”. I’d probably prioritize the former for a few reasons.
Most of the benefits of attending EAG get watered down when you increase the size. I imagine the 10k-person version would look very different. The senior people that do show up wouldn’t have much time per person, and would be there for different reasons (recruitment, some very selective mentorship). The experience for most people would be “a chance to talk with many others who are vaguely interested in EA”. This doesn’t sound very exciting to me, but maybe it could be made to work somehow.
I am sceptical about this; I think understanding the formula behind EAG and EAGx helps contextualise and solve most of the issues in the post.
There are currently 8 EAGx events in the next 7 months, and probably many more in the pipeline. EAGx events are designed to have a more inclusive bar to entry. It should be noted that they are not exactly small events: we are aiming for 1,000 people at EAGxNordics in April next year. EAGx events are also, notoriously, "weirder".
With that in mind, I am not sure how fewer, bigger, less targeted events would be an advantage. But I have to admit, the quickest way to resolve this would be to try running an event for 10,000 people and collect feedback.
In what ways are EAGx events weirder?
So this is just my subjective opinion, but because they are less professionalised and more studenty, they feel more relaxed. Even after the event itself (i.e. at after-parties), if the vibe is less professional, you will feel that in the atmosphere and the way people behave.
Can't confirm about the parties, but I did feel EAGx was more relaxed than EAG. Though that was also partly due to a personal decision on my part to make it so.
[EDIT: Eli from CEA has clarified that places are not currently a limitation for accepting people into EAG]
I agree it’s a problem that 1) promising people and 2) established professionals in EA are not attending EAG because there aren’t enough places.
The solution is not to make EAG open access. This is antithetical to keeping EA weird.
The solution is to plan bigger EAGs, and keep the selectiveness criteria the same (or even up the selectiveness). And to be OK with not filling the planned quota: empty seats are much better than disgruntled people left out.
But then of course there is the question of what's keeping EAG organizers from making them twice the size. They will have better insight here!
Is the goal of EA to maximize impact, or some other collection of things? If EA losing some of its distinct culture helps it execute on its stated core mission, isn’t that ok?
My suspicion/understanding is that it's just a slow feedback loop. You have to book a venue before you know how many people will apply, and if that venue has a capacity limit (i.e. there literally wouldn't be enough space, or there would be health and safety issues) you can't accept more than that.
With that said, I would be very surprised if EAG SF (which IIRC had around 1,700 accepted, with 200 no-shows) rejected more than 30% of US-based applicants. And those rejections would likely not be capacity-related, but more due to not hitting the bar for acceptance. So the limiting factor could be demand.
I clicked through to the source. I feel for this person. They made a significant commitment because they wanted to help others, and they followed through on it.
This is the type of person I would love to meet. But not at EAG, because I do not want to go to EAG. I don’t fit there, and there are other events (e.g. EAGx or online events) that fit me better.
Late to the party, but I just came across this post and wanted to add another perspective. For context, I’ve been “living EA principles” for a while, but am not really involved in the community and have never applied to EAG.
The fact that the biggest EA conference is intentionally exclusive is something of a red flag for me and makes me significantly less likely to apply to EAG. In my experience, one of the strongest predictors of whether I will connect with a given community is how much they value inclusivity. (This has been true even in cases where I was part of the “in-group”.)
I understand that there are tradeoffs, and I don’t have enough context to form an opinion on whether the benefits of exclusivity are worth the costs in this case. My goal here isn’t to argue for either position; I just wanted to add another data point for those costs.
Hi Scott, I truly appreciated your post on "Open EA Global" and am inclined to agree on most fronts. This post is a month old and EAG DC has passed, so I'll strive to focus on what's new or unique versus what has already been said:
I'm new to the EA community and have not been to EAG. In fact, I applied to EAGx Berlin this month and was told to apply to a local conference; then I applied to EAG DC and was told to apply to an EAGx. (No EAGx is likely to be held in my region until Boston in the spring, although I can look forward to the upcoming virtual conference.) I was excited to dive into the community and disappointed by the chicken-and-egg here and by the impersonal rejection process. I still believe EA is a great cause, I understand the thankless job of the staff in these decisions, and I acknowledge the ways I can strengthen my personal candidacy. (But I wish the website had not invited me to apply with such cheer! And I wish I had spent less time on this particular process.)
Despite not being an EA, I have professional experience in community engagement and customer service. I led massive growth at two companies from obscurity through IPO. My teams have popularized challenging concepts for mainstream audiences and dealt with toxic customer service issues. I acknowledge I am one year into my EA journey and far from the smartest person in the room.
Being an EA is an identity; it's a personal choice. That sums up why EAG rejections may sting so badly compared to, say, getting passed over for a job offer or finding out Burning Man tickets already sold out. The literature can state that this is merely admission to an event and not intended as a personal judgment, but the verdict might not feel that way to an individual. Even in these comments I see lots of talk of "not meeting the bar," which personally makes me uncomfortable when the bar is so ill-defined. "A good enough EA" seems like a straightforward interpretation of this bar, so I get why some folks take it personally.
The overall objective of EA is to find the best ways to help people and put them into practice. Let's set aside costs, logistics, precedent and hurt feelings… for EA to find the most effective interventions and scale them, it benefits from casting a large net. This exposes the growing movement to new ideas which can be sifted through (and incorporated or discarded). So it seems to me the larger, more open conference serves the overarching objective best. There are plenty of technologies that can and do help with matching in order to make the event "feel" smaller for various reasons. There are still plenty of ways to pamper the 800-pound gorillas.
To an outsider, it's not apparent how networking-heavy EAG is. It's only because I have multiple colleagues who have attended EAG in the past that I learned how central the 1-1 matching is to doling out spaces. I approached the event with the mentality of "let's consume content, hear keynotes and absorb all that EA has to offer!" It's clear from some of the comments that this is relatively unimportant, but I do wonder if EAG misses a golden opportunity to amplify the content for general community-building (and, as one comment mentions, to elevate voices besides Will's).
Is South by Southwest (Tech) a worthwhile template for EAG? The massive conference in Austin, TX has a growing share of detractors, but it did a lot of things right. The "Day Pass" is the front door for folks who are attending, learning, and listening (and paying big bucks to do so). Then there are all manner of networking events, from semi-public lectures to exclusive gatherings on top of the Ritz. I don't feel bad that I didn't get a 1-1 with the CEO of X company, because I didn't even know that was going on. That conference has succeeded in letting anyone in and creating exclusivity at the same time.
Thanks again for the consideration and I hope you find this helpful, etc. -Steve
I disagreed at the start of this post on the grounds that I strongly prefer smaller events, but updated towards agreeing fairly strongly, subject to the logistical issues Scott mentions at the start.
Saulius’ point that we could try it once seems excellent
I really dislike the desire to police EA culture that people are expressing elsewhere in the comments. Hardcore vegans didn't stop being hardcore vegans when vegan circles started admitting reducetarians et al., and I don't see any reason to think hardcore EAs would disappear or have trouble meeting each other just because less hardcore EAs started showing up.
If they did have trouble meeting each other, you could define one or more submovements that people could opt into, possibly with requirements for entry to their congresses.
The event’s culture is still going to be heavily dominated by the talks, marketing, and prearranged norms.
You could raise the price to a profitmaking point, funding the smaller, weirder conferences, while still offering free/subsidised access to promising people who couldn’t afford it.
I can’t say I have a strong opinion one way or the other, but I agree strongly with:
Why not be creative and have a parallel virtual EA Global event? There's lots of opportunity to be creative there.
For example, it could be as simple as specified times for people to hop on the EA Gather.town space. There are also a lot of opportunities to match people creatively with pre-survey work and such.
Thanks for the suggestion. We did do a parallel virtual event before and decided against doing it again because virtual underperformed the in-person event and split our attention. We were considering running our own separate virtual event this year, but instead, we are supporting EAGx Virtual next month.
As one data point, the virtual EA Student Summit was both really fun and informative for me (did many 1-1s), and encouraged me to apply to a physical EAG a year later :)
Thank you! I’m so glad :)
Why not make ordinary attendees pay a little extra, and subsidize even more travel reimbursements?
I think more tweaking might be needed in terms of reimbursements. I'm based in an area that has a large EA student hub. Most EAs here are very connected, and most of the people I interact with go to every single EAGxWherever and apply for reimbursements. They've told me it's not that hard to get money for expenses based on the prestige of the university and the fact that they're students. It seems a partial motivation is also travel and time off. It sounds to me like people who don't live near such hubs and are less connected to other people and resources would benefit more.
Just want to clarify that we typically provide travel grants for those who need them, and don’t select based on which university they attend or whether they’re studying. If someone is accepted to an EAG and asks for it, they’ll get a travel grant no questions asked (as long as they ask for a reasonable amount). Tickets to students are generally provided for free by default though (but again, which university they attend is not taken into account).
Thanks, that’s a helpful clarification. Upvoted.
Tl;dr: Big conferences can be really good: warm, friendly, easy to navigate, etc. Although they might not look like the current EAG. In fact the big conference I used to go to was possibly better than EAGs at some aspects of this!
– –
Without opining on the general for/against, I wanted to raise a point that might be a crux for some people considering this: I expect most people have not been to, and lack a strong sense of, what a well-run, community-focused, 2,000+ person conference would look like, one with a nice culture, designed to facilitate meeting people around shared interests, and not too overwhelming or impersonal.
I used to go to a big religious conference every year with maybe 2,500 attendees at its peak, and it had these features. It worked because anyone could (with a bit of preplanning) add a session on anything, so at any time there were 20 or so sessions to pick from. They were not all talks; many were workshops, discussion groups, spaces to co-create projects, etc. Most of them were put on by whoever proposed them, but a few top sessions were organised by the conference team. This generally felt like a good way to meet interesting people. The conference was also residential and 5 days long. Sessions ran from 8am to 1am and there was always a lot of socialising too. There was also childcare throughout. It needed a lot of volunteers to run, but volunteering was also a good way to meet others and get a discounted price. It was not at all selective.
If anything, I think it is possible that 500+ person EAGs could work better if they shifted to this kind of event: a longer event with meetings happening at workshops / talks / discussion groups rather than on an app, as (at least for me) it is almost impossible to find the people to connect with on the app if you do not already know them, and almost impossible to meet them all in a 2-day conference anyway.
I, like, 55% agree, and it's updating me to think about what spots I'm taking by attending. I am slightly worried about a) the logistics of organising a massive open conference, and b) potentially getting less value-aligned / EA-involved people and that negatively impacting the community.
As one of the EAGxBerkeley organisers, I'm trying to figure out how to do outreach both to early-stage EAs and to people who are very EA-aligned without knowing what EA is, two groups I think would get the most benefit out of an EAGx. Do you (I mean this in a general sense, for anyone to answer) have any suggestions for how to do outreach to those not already on the EA radar, so we can make it more of an 'open' conference?
Just making sure you saw Eli Nathan’s comment saying that this year plus next year they didn’t/won’t hit venue capacity so you’re not taking anybody’s spot
Thanks!! Good to know :)
I’m a big fan of different “tracks” which are appealing to people in different stages of their EA journey. Let people self-select into things that are more valuable.
E.g. "career journeys" are more valuable if you're new to EA but interested in EA career paths, whereas an in-depth discussion of an obscure critique may be less interesting.
how you advertise what the conference is probably matters: you probably want to think about the limitations of the EA brand in attracting such people. Maybe we need a non-EAG-branded conference to get really promising people who are more skeptical of EA.
That being said, how can we build on the messaging to be more informative?
E.g. give a sample agenda, talk about the benefit people can receive (how can we frame the value of 1-1s to be more intuitive and attractive to newcomers)
do a lot of pre-event programming that could be Q&As or just a sample of the conference to get people interested and excited to sign up (maybe 2 months before).
For accepted attendees, have some programming leading up to the conference to prep them on what it's like, how to get value, make plans, and engage in conversations.
Off the cuff; feel free to ping me for more!
I have had concerns about this, and seen similar concerns among others even when applying to our national retreat. An easy solution would be to add a box on the application form saying "I want to come, but don't want to crowd out somebody else" (or perhaps better wording!). These people are accepted last, after all other people hitting the bar for entry are accepted.
Could be worthwhile reaching out to Santeri from EA Finland about this, since he's running a hackathon for hundreds of people. As I understand it, they have a nice website and graphics, but all the work is done through in-person conversations/flyering at the uni, until eventually you build enough of a reputation over the years that it just spreads by word of mouth (do things that don't scale).
On this last point, I think this is likely what we are starting to see with EAGxs: I think EAGxBerlin is gonna be pretty huge.
For some events it might make sense to draw applicants by lottery, if capacity is an issue. True, this is, in one sense, a clear departure from optimization. However, I think people will be a lot less upset if they know they were rejected by a die roll rather than by someone who looked into their soul and decided they were not worthy of being among us.
This could be refined a bit with some transparent rules like "everyone working full time at an EA org or who has spoken at a previous meeting is guaranteed admission to at least one event per year." I think people would also be less insulted by rigid rules than by what feels like personal discretion.
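To make the mechanics concrete, here is a minimal sketch of how such a "guaranteed admits plus lottery" process could work. The rule encoded below (full-time EA org staff or past speakers are guaranteed a spot) is just the hypothetical example from this comment; nothing here reflects how CEA actually runs admissions.

```python
import random

def select_attendees(applicants, capacity, seed=None):
    """Hypothetical two-tier admissions sketch.

    applicants: list of dicts with keys 'name', 'works_at_ea_org', 'spoke_before'.
    capacity:   total number of spots available.
    """
    rng = random.Random(seed)  # a fixed seed makes the lottery auditable/reproducible

    # Tier 1: transparent, rule-based guarantees (the example rule from the comment above).
    guaranteed = [a for a in applicants if a["works_at_ea_org"] or a["spoke_before"]]

    # Tier 2: everyone else goes into a lottery for the remaining spots.
    pool = [a for a in applicants if a not in guaranteed]
    spots_left = max(capacity - len(guaranteed), 0)
    lottery_winners = rng.sample(pool, min(spots_left, len(pool)))

    return guaranteed + lottery_winners
```

The point of the sketch is that a rejection from the lottery tier is attributable to a die roll rather than to anyone's judgment of the applicant.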
It sounds like capacity isn’t an issue. Based on a comment above from an EAG organiser, it seems that they just accept everyone who meets a certain bar
Ok but maybe there is a belief that smaller events work better in many cases?
If they accept everyone who meets a certain bar, I wonder why that can’t be translated into rigid rules. I agree that this would feel better than having to try to “perform” promisingness on the application and feeling like you are not good enough if rejected. It might be frustrating to not get to go because you don’t yet have some concrete accolade. But this seems better than not getting to go because of some subjective-ish, shady process.
In my experience the bar isn't about the impressiveness of the applicant's achievements etc. (the number of accolades I have is definitely below zero).
In the application each time I laid out what career directions I’m looking at and the reasoning behind them, and had clear, impact-focused and time-sensitive decisions or projects that I thought going to the conference could help with. And in fact the time I got put on the wait list was when I didn’t have as much of the latter. So I assume they factor these things in more than a record of achievement at least in some cases.
(I did lead a student EA group at one point, but that’s not a signal for anything competence-related like an internship is, like usually no one else is putting their hand up).
I don't think the application criteria are implicitly or explicitly based on prior achievements. Not sure how to link to comments on the forum, but somewhere in these comments CEA people suggest it's more about "does this person really know what EA is" and "how will attending this conference help increase this person's impact". In another comment I also mention how my experience applying for EAGs aligns with this.
E.g. maybe if it's uncertain from a written application whether the person has a good/nuanced understanding of EA, having done an EA internship etc. is the only way they can infer this, rather than it being about the achievement of getting the internship.
One of the things that bothers me most is that our rejection process seems pretty opaque.
We’re a community that prides itself on transparency, assuming no info hazards, because of how instrumentally helpful said transparency can be.
I explained to an excited, bright-eyed, philanthropically minded young newcomer to EA this week why he likely got rejected from EAG DC. He doesn't really know the EA basics, so I explained that EAG rejects EA newcomers so attendees can avoid having the "what is EA" conversation for the millionth time. I then encouraged him by saying that I'll help make sure he can get in next year, and that the EA Virtual Programs are a great place to get started on his EA journey. Also, EAGxs exist with a lower barrier to entry.
He told me he appreciates my honesty in letting him know why he got rejected.
What bugs me right now is that I shouldn't have to be the one encouraging him or making the rejection feel "honest." The rejection he received should be doing that, and should transparently give advice on how not to get rejected again. Does the rejection right now just consist of an automated "sorry, we have too many great applicants," and that's all?
I find it odd that this post totally ignores the existence of EAGx, but I haven’t been to a non-x EAG yet so there’s not much I can authoritatively comment on further.
There’s some discussion of that in this comment thread.
Although it’s a bit tangential to the main point of the post, I’d be interested to hear what interventions the anti-alcohol activist proposed to help “[reduce] social expectations of drinking”.
Is there anywhere I can read more about such proposals?
Here are some suggestions written by Julia Wise from our Community Health team.
I want to state my strong agreement with these ideas. It isn’t hard to come up with dozens of examples of people who didn’t seem particularly impressive and then went on to be much more impressive than any reasonable observer would have expected.
I would also be surprised if EAs (a community of people who think about scope insensitivity, moral cluelessness, and similar ideas that I roughly categorize as "intellectual humility") were able to identify talent confidently in advance.
I'm a bit worried that the current trends are making EA somewhat insular along the lines of class/socioeconomic status, as the legible things (attending Stanford, doing internships, networking) tend to be strongly correlated with growing up in a wealthy family, and are much harder to do or obtain if you grow up without money. I don't have enough information, nor have I put in enough thought, to expand upon this idea, but it is something I'm interested in exploring more.
Sorry if this is tangential or if I am missing an obvious cultural reference, but this statement keeps bugging me:
What is the support for holding this belief? The only cultural reference that comes to my mind is Hitler being rejected from art school (i.e. not being invited to a prestigious thing). However, the thought that the counterfactual impact of such a rejection is the Holocaust is more an internet meme than a rational thought that should be turned into a belief about the world.
It’s a joking reference to the Apple of Discord story, wherein the goddess of discord Eris crashed a party and started the Trojan War.
Now I feel dumb, but at least I’m smarter. Thanx.
No reason to feel dumb; I didn't immediately get the reference either. I saw that it was a reference to a legend about a golden apple, since it was the caption to a painting of a legendary-looking person holding a golden apple, so to answer your question I googled "golden apple legend", found the Wikipedia disambiguation page, and searched that for the legend that fit.
I agree so much with this post. Thank you for saying this!!!