I’m working as the Interim Head of Operations at the Centre for Effective Altruism (CEA), where I was previously the lead organizer for EA Global. Before working at CEA I was an Operations Assistant at Open Philanthropy, and prior to that was involved in various community building projects at EA Oxford.
By agentive I sort of meant “how effectively an agent is able to execute actions in accordance with their goals and values”—which seems to be independent of their values/how aligned they are with doing the most good.
I think this is a different scenario to the agent causing harm due to negative corrigibility (though I agree with your point about how this could be taken into account with your model).
It seems possible however that you could incorporate their values/alignment into corrigibility depending on one’s meta-ethical stance.
Ah okay—I think I understand you, but this is entering areas where I become more confused and have little knowledge.
I’m also a bit lost as to what I meant by my latter point, so will think about it some more if possible.
Thanks Rossa,
I’m wondering how you see 1FTW’s position changing due to the presence of OpenPhil and a shift towards a more money-rich, talent-poor community (across certain cause areas)?
In my eyes, the comparative advantage for student groups is more about driving engagement and plan changes and less about raising funds. Of course, money still goes a long way, but I’m skeptical that group leaders should be spending their time focusing on (relatively) small donations over building communities of talented, engaged individuals.
Is your view that 1FTW will be a better outreach vehicle (than standard community building techniques) for certain demographics? It seems that 1FTW attracts similar types of people that the GWWC pledge would, but at higher quantities due to the lower barrier. However, I’m skeptical that this lower barrier is necessarily a positive thing, because it would seem that, on average, these individuals are less likely to further engage with the EA community at large.
Is this something you’re concerned about, or do you think these concerns are relatively minor?
Thanks Marek,
I remember some suggestions a while back to store the EA funds cash (not crypto) in an investment vehicle rather than in a low-interest bank account. One benefit to this would be donors feeling comfortable donating whenever they wish, rather than waiting for the last possible minute when funds are to be allocated (especially if the fund manager does not have a particular schedule). Just wondering whether there’s been any thinking on this front?
Thanks Max! I too am not certain that this is the correct approach, and think there is a good case for longer form conversations due to the reasons you give. The rough case I’d make for the “maximizing” approach is:
1. It’s easy to scale: You can easily gather 5-10 members of your group, give them 10-15 minutes of guidance, and put them on the stall. I slightly worry about group members who are newer to EA having long-form onboarding conversations with new and interested people (in EA Oxford, we’ve previously taken some time to verify that people are knowledgeable enough to have formal 1-1 conversations with newcomers).
2. Activities fairs are often noisy and as such don’t represent the best environment to engage in long form conversations.
3. Even if you do have long form conversations at the stall, they likely won’t last longer than 5-10 minutes, which I think is generally not enough time for someone to properly understand what EA is. Often, when engaging in longer conversations at activity fairs, I’ve observed people come across as somewhat skeptical of EA, but in such a way that upon further reflection I could imagine them being reasonably excited about it. As such, it may be better to optimize for driving attendance at longer form events, such as a 1-1 coffee chat or a 1-hour introductory talk.
I agree that this approach could come across as unfriendly, and that it’s important to make sure stall-runners are aware of this. Overall, I see this as a downside, but one that is probably worth it in the long run.
When (if ever) will marijuana be legal for recreational use, or effectively so, across all 50 US states?
Yeah — this seems pretty reasonable to me. I’d not thought about this explicitly before, but the rough numbers/boundaries you provide seem quite plausible!
Thanks Timothy!
I think this is broadly fair, and perhaps a reframing of “think more actively about your interests” would be better than just “think more actively about your career” for many readers.
That said, I think for a lot of people, what they’re immediately excited about doesn’t line up well with what might be good for their career, especially if they’re trying to do good. I worry that “keep noticing what excites you and find ways to do more of that” would lead some people down career paths with little impact, whilst also making it hard to transition to high impact roles in the future. I also suspect that many people’s passions are more flexible than they might expect, and that without careful planning, they may narrow down their options unnecessarily.
Thanks for the post — I only sort of skimmed the post and comments, and crucially I don’t think this is what your post is really about, but it seems like you have the view that we’re kinda clueless about whether factory farmed animals have good or bad lives. In reference to this, you mention in a comment: “It’s hard to be confident of any view on this, when we understand so little about consciousness, animal cognition, or morality.”
As an aside, the term “factory farmed animals” is a kind of weird category that includes both cows and chickens (among other animals). You could plausibly make the case that cows have net positive lives, but it seems pretty difficult to say the same for chickens.
Sure, we don’t understand anything and everything about morality, but given the evidence we do have with regards to animal suffering and a few other basic axioms and intuitions, it seems hard to put this at 50:50 or similar. There are a bunch of arguments in favor of factory farmed chickens having bad lives, and I’m not aware of many arguments saying that they have positive lives. I think the Holocaust case is interesting but a bit confusing because those people had (probably) happy/positive lives before the Holocaust, and could have had happy/positive lives if they had been released. If someone were to intentionally breed humans into existence in order to place them into concentration camps (and later kill them), I think most plausible ethical theories would consider this to be uncontroversially bad.
Eli from the EAG London team here: there will be plenty (hundreds) of COVID tests available at the event for any attendee who wants them. Please ask an on-site volunteer or organizer if you’d like a rapid/lateral flow test!
Eli from the EA Global team here: For anyone who has travelled to London for the conference, we will reimburse you for any extra travel or accommodation costs that arise should you be stuck in town due to contracting COVID-19 at or slightly before the event (e.g. if you have to stay in your hotel for an extra week and book new flights).
You can see more information in our COVID protocol here, though please feel free to reach out to hello@eaglobal.org should you have any questions or concerns — thanks!
Hi Alastair — sorry to hear you had such a rough experience! I work on the EA Global team and posts like these are super helpful to us as well as the wider EA community (helping folks manage expectations, helping people who have burnt out feel like they aren’t alone, etc.). EA Globals have a lot going on (including afterparties), and many attendees feel like they’re under a lot of pressure, which can be overwhelming. Glad to hear you’re doing better now — and definitely keen to hear any feedback you or anyone else might have for the organizing team (feedback forms were sent out to all attendees)!
Hi Luke — sorry to hear about all of this! I work on the EA Global team and I can confirm that we definitely don’t want you sleeping on the bus! Please apply for more travel/accommodation funding next time if it’d be useful — it won’t affect your chances, and we won’t think you’re taking advantage of us!
For folks who need it, funding is also available up-front (rather than having to wait to be reimbursed), with an option to return extra money should you have any leftover.
Thanks for the suggestion here! I’ve edited the post to put in the full place names, and will try to do so on our other communications/sites.
I won’t go into too much detail here, but FWIW I lived in the Bay Area for ~2.5 years and found it somewhat difficult to network or get into various EA/rationalist social scenes (I think I was something of an outlier, but not extremely so). If you don’t have a clear pathway to meeting people (such as being invited to work out of an EA co-working space for the summer, or having friends already living out there) you might have a more difficult experience networking/socializing than the post describes.
That said, I think for many EAs, visiting the Bay Area for at least some period of time is a great idea.
Hi — I’m Eli from the EA Global team. Thanks for your thoughts on this — appreciate your concerns here. I’ll try to chip in with some context that may be helpful. To address your main underlying point, my take is that EA Globals have incredibly high returns on investment — EA orgs and members of the community report getting very large amounts of value from our events. For example:
An attendee from an EA-aligned org said they would probably trade $5 million in donations for the contacts they made at EAGxBoston.
Another EA-aligned org reported that they’ve gotten a minimum of $1.25 million worth of value from connections they’ve made at EAG(x)’s.
This pushes me in the direction of spending more money if it will help make the event better and facilitate more connections between members of the community — though of course we don’t want to spend money unnecessarily or in any way that would be particularly flashy. Another point is that, whilst niceness/spending can be a turn off to some, the reverse can be a turn off to others (e.g., a poorly furnished/not as nice venue could turn off potential donors) — and it can be hard to trade off the preferences between these two groups. To address some other more minor details in your post:
It’s not actually feasible for us to collect sweatshirt sizes in advance, as many/most people apply and register quite late in the day (applications close two weeks before the event). Given that we were customizing such a large number of sweatshirts, these needed to be ordered several weeks in advance. I do think we could have gotten smaller sizes on the whole though, as most of the leftover sweatshirts were XL or similar.
Alcohol is generally done “on consumption” at EAG events (though I don’t know whether this was exactly the case this time) — meaning that CEA is only billed for the alcohol that is consumed, and any bottles of wine not used are not charged to us. Compared to most non-EA events, our attendees actually drink much less than average, and the caterers/beverage staff are always quite surprised by this. I think there’s a benefit to there being alcohol available (as it makes some people less socially anxious). An alternative to our current setup could be something like “each person gets two drinks”, but then we’d need to track this and it’s unlikely to come out to a significantly lower cost anyway. Another option is having people pay for their own drinks, but that seems unfair to those with less financial stability.
The security staff on site was fairly necessary as the venue had multiple entrances/exits and was adjacent to a public park frequented by tourists. Without the security team, members of the public would have just flooded into the venue (which would be bad for several reasons). We selected the venue in question because it had a bunch of outside space and we weren’t sure what the COVID situation was going to be so far in advance. Additionally, for a crowd of ~1500 people, there just aren’t that many suitable venues anyway (most are too small, some are too big).
At an event of ~1500 people, “staff taking away cups” is more of a cleaning exercise rather than a way to pamper attendees. Without this, the trash buildup would likely be substantial.
No problem! I probably won’t be able to respond to your later points, just because the answers would be complicated and I’d have to go into a lot of detail re how I think about EAG. But to answer some of your other questions:
1. I don’t have concrete data on the counterfactual likelihood of connections, but I expect that it’s not that high (very strong confidence that it’s <50% of connections). There’s no obvious way for many of these people to connect virtually, other than attending a virtual EA conference, and I think there are also strong benefits to meeting in person (as well as the possibility of group discussions and meetups). My rough guess would also be that people in general are less interested in virtual conferences than in-person ones, meaning that there are a bunch of counterfactual connections here.
2. The org that said they’d gotten a minimum of $1.25 million worth of value from connections they’ve made at EAG(x)’s was a global health and development org. I don’t know exactly who said that they would trade $5 million in donations for the contacts they made at EAGxBoston, but my guess is that this was someone working in a longtermist/x-risk field (someone on my team told me about this feedback; I didn’t receive it directly myself).
Hi Scott — I work for CEA as the lead on EA Global and wanted to jump in here.
Really appreciate the post — having a larger, more open EA event is something we’ve thought about for a while and are still considering.
I think there are real trade-offs here. An event that’s more appealing to some people is more off-putting to others, and we’re trying to get the best balance we can. We’ve tried different things over the years, which can lead to some confusion (since people remember messaging from years ago) but also gives us some data about what worked well and badly when we’ve tried more open or more exclusive events.
We’ve asked people’s opinion on this. When we’ve polled our advisors including leaders from various EA organizations, they’ve favored more selective events. In our most recent feedback surveys, we’ve asked attendees whether they think we should have more attendees. For SF 2022, 34% said we should increase the number, 53% said it should stay the same, and 14% said it should be lower. Obviously there’s selection bias here since these are the people who got in, though.
To your “...because people will refuse to apply out of scrupulosity” point — I want to clarify that this isn’t how our admissions process works, and neither you nor anyone else we accept would be bumping anyone out of a spot. We simply have a specific bar for admissions and everyone above that bar gets admitted (though previous comms have unfortunately mentioned or implied capacity limits). This is why the events have been getting larger as the community grows.
I wanted to outline the case for having an admissions process and limiting the size of the event, which is roughly:
We host different events for different purposes. EAG is intended as a more selective event for people who mostly already have a lot of context on EA and are taking significant action based on EA principles. The EAGx conference series (which will serve nearly 5000 unique attendees across the different events this year) is intended to reach a broader, newer-to-EA audience.
EAG is primarily a networking event, as one-to-one conversations are consistently reported to be the most valuable experiences for attendees. I think there’s less value in very new folks having such conversations — a lot of the time they’re better off learning more about EA and EA cause areas first (similar to how I should probably learn how ML works before I go to an ML conference).
Very involved and engaged EAs might be less eager to come to EAG if the event is not particularly selective. (This is a thing we sometimes get complaints about but it’s hard for people to voice this opinion publicly, because it can sound elitist). These are precisely the kinds of people we most need to come — they are the most in-demand people that attendees want to talk to (because they can offer mentorship, job opportunities, etc.).
We think that some of our most promising newer attendees would also have a worse experience if the event were fully open.
Using an admissions process lets us try to screen out applicants who have caused problems at past events or who seem likely to cause problems.
I don’t think this is really what your post is about, but I wanted to clarify: EAG exists to make the world a better place, rather than serve the EA community or make EAs happy. This unfortunately sometimes means EAs will be sad due to decisions we’ve made — though if this results in the world being a worse place overall, then we’ve clearly made a mistake.
I agree it’s hard to identify promising people reliably, but I don’t think it’s impossible to get some signal here. I do think our admissions process could improve though, and we adjust the process every year. We’re currently in the process of revisiting the application/admissions process with the aim of identifying promising people more reliably — though of course it’s hard to make this perfect.
“The conference is called “EA Global” and is universally billed as the place where EAs meet one another, learn more about the movement, and have a good time together.” It’s possible we should rename the event, and I agree this confusion and reputation is problematic, but I would like to clarify that we don’t define the event like this anywhere (though perhaps we used to in previous years). It’s now explicitly described as an event with a high bar for highly engaged EAs (see here). We also have the EAGx conference series, which is more introductory and has a lower bar for admissions. If someone is excited to learn more about EA, they’d likely be better suited to an EAGx event (and they’d be more likely to get accepted, too).
Having different levels of access to the conference app seems like it might worsen rather than improve the problem of some people feeling like second-class citizens.
Regarding the specific volunteer case you mentioned, I’m not exactly sure what the details were here and it’s not something anyone on the team recalls. It does sound like something that easily could have happened — just perhaps a few years ago. FWIW, as of 2019, all volunteers had to meet the general bar for admission.
I think I would also be in favor of other more specialized conferences, such as those on AI safety or global health, but these are unlikely to be things we’ll have capacity to run at the moment (though I encourage people to apply for CEA event support and run events like these).
Thanks again for the post, hope these points are helpful!
Thanks for the thoughts here — a lot of what’s going on is just that our website is pretty out of date, and we’re in the process of refreshing/updating it currently. We’re also going to make some slight edits to our front page ~now to make things a bit clearer.
I really liked this post and the model you’ve introduced!
With regards to your pseudomaths, a minor suggestion: the product notation could represent how agentive our actor is. This would let us take into account impact that is negative (i.e., harmful processes) by multiplying the product notation by another factor capturing the sign of the action, so that the change in impact is proportional to the product of these two terms.
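To make the suggestion concrete (purely a sketch — the symbols $a_i$ and $s$ are my own invention here, not notation from your post):

```latex
% Sketch of the suggested decomposition:
%   \prod_i a_i -- "agentiveness": how effectively the actor
%                  executes actions in accordance with their goals
%   s \in [-1, 1] -- sign/alignment factor: whether those goals
%                    point towards good or harmful outcomes
\Delta I \;\propto\; s \cdot \prod_{i} a_i
```

The nice property is that a highly agentive actor ($\prod_i a_i$ large) with misaligned values ($s < 0$) comes out as strongly net-negative, which matches the intuition that competence and alignment are independent axes.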