I really liked this post, and I felt like I understood both characters’ views super well. I feel like this type of writing often makes one character a strawman, but I didn’t feel that at all in this case. Great job!
OllieBase
I say this at EAGx events and in various posts, but I still don’t think I say it enough: running EAGx events is a huge amount of work, and most of this work is done by dedicated and hard-working EA community members and national group staff. My colleagues and I support these teams, but I think we get too much credit.
I’m continuously impressed by EAGx teams; their thoughtfulness, their focus on impact and the sheer amount of effort they put into ensuring these events go well (and they do). There’s not been a team I haven’t enjoyed working with.
I think there’s more I could do to make working on EAGx events more enjoyable/efficient/worthwhile, but at the very least I want to be extremely open about my (and CEA’s) gratitude towards these people for all they do, have done and will do.
Applications to EAGxBerlin, EAGxAustralia and EAGxPhilippines are open, and Berlin closes tonight. (Adding this because these teams would probably rather I promote their events than just thank them)
This needs to be discussed internally, but I think a better description is Cooperative with EA (CEA)
Congratulations to the EA Project For Awesome 2024 team, who managed to raise over $100k for AMF, GiveDirectly and ProVeg International by submitting promotional/informational videos to the project.
There’s been an effort to raise money for effective charities via Project For Awesome since 2017, and it seems like a really productive effort every time. Thanks to all involved!
I’ve only been at CEA for the last ~quarter of Max’s tenure but it’s hard to overstate how much I’ve appreciated Max’s humility, his warm nature, his receptivity to feedback and how much he values and appreciates CEA staff in return. I’m really sad to see you go Max—we’ll miss you!
This is such devastating news and a huge loss to this community. Sebastian was deeply kind, thoughtful and selfless. He volunteered at several EAGs, and was always generous with his time; my colleagues and I learned of his passing because he was providing input on a project as a favour to us.
Whenever I saw him, once he'd told me about all the projects he was driving forward, his face would light up as he described the new things his kids had learned or started saying.
I feel very lucky to have known him, and he will be sorely missed by so many.
Thanks for sharing this. I’ve not been following your work closely, but running a new org with very ambitious goals must be challenging, and I appreciate you acknowledging and sharing your mistakes so far. It would be surprising if you hadn’t made a few mistakes at this point. Good luck!
I agree with the thrust of the argument here but I think your estimate for the size of the EA-aligned graduate pool is far too large.
I helped run the group at Warwick (a top-10 UK university) for a couple of years. In each year I was on the committee, I would have been surprised (and very happy) if more than 5 graduates identified as ‘committed EAs’. I would also say that Warwick has one of the more active groups outside of Oxbridge. My Fermi estimate for the size of the EA grad pool each year would therefore be something more like:
Oxford and Cambridge: 200 (this strikes me as high but I’ll defer to you)
~10 other unis with active groups: 50
~10 other unis with small groups: 20 (most uncertain about this number)
10-20% of those graduates actually applying for non-technical EA roles seems about right, so I think the number is more like 25-50. The resulting ratio is still undesirable, so I’ve no doubt there are many grads out there having difficulty getting hired, which is saddening and made all the more visceral by the recent post.
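For transparency, here is the arithmetic behind the Fermi estimate, as a minimal sketch. All the per-group numbers are the guesses above, and the 10-20% application rate is an assumption; the exact result (27-54) rounds to roughly the 25-50 quoted.

```python
# Rough Fermi estimate of the EA-aligned graduate pool per year.
# All figures are guesses from the comment above, not data.
grads = {
    "Oxford and Cambridge": 200,
    "~10 other unis with active groups": 50,
    "~10 other unis with small groups": 20,
}
total = sum(grads.values())  # 270 graduates per year

# Assume 10-20% of them apply for non-technical EA roles.
low, high = int(total * 0.10), int(total * 0.20)
print(total, low, high)  # 270 27 54
```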
This doesn’t really respond to the thrust of what you’re saying here, but just responding to:
there are no clear guidelines regarding appropriate and inappropriate behavior at different types of EA events
I wanted to check that you’re aware that at least EA Global and EAGx events require all attendees to agree to our code of conduct. To save readers a click, it is currently:
___
At EA Global and social events associated with EA Global, you agree to:
Respect the boundaries of other participants.
Look out for one another and try to help if you can.
Adhere to national and local health and safety regulations, as well as any additional policies we institute for EA Global.
This is a professional learning and networking event. These behaviors don’t belong at EA Global or related events:
Unwanted sexual attention, or sexual harassment of any kind.
Using the event app to request meetings for romantic or sexual reasons.
Offensive, disruptive, or discriminatory actions or communication.
We understand that human interaction is complex. If you feel able, please give each other the benefit of explaining behavior you find unwelcome or offensive.
If you’re asked to stop a behavior that’s causing a problem for someone, we expect you to stop immediately.
By submitting this form, you confirm that you will adhere to this Code of Conduct, which applies at the conference and all related social events.
You can contact us at hello@eaglobal.org if you have any questions.
All our conferences have at least one community contact person, whose role is to be available for personal or interpersonal problems that come up. Feedback can also be left anonymously on the event survey, or on the community health team’s anonymous contact form.
___
It seems plausible to me that this isn’t sufficient, and we’re open to input on how these could be improved.
This seems like a reasonable ask; good luck with it! I can’t help myself, unfortunately.
However, I did bounce off the clickbaity title of this post. I wouldn’t like an EA forum where I had to open each link to work out whether it was worth reading or taking action on. I much prefer posts which try to make transparent what the ask is. In this case, I think “Call on U.S. legislators to protect farmed animal welfare” would’ve been more transparent and possibly even more attractive to some.
This seems basically right to me. That said, I thought I’d share some mild pushback, because there are incentives against disagreeing with EA funders (not getting $) and so, when uncertain, it might be worth disagreeing publicly, if only to set some kind of norm and elicit better pushback.
My main uncertainty about all this, beyond what you’ve already mentioned, is that I’m not sure it would’ve been good to “lock in our pitch” at any previous point in EA history (building on your counterargument that “we’re still highly uncertain about which strategies are best from an EA perspective, which is a big part of why truth-seeking and patience are important”).

For example, what if EAs in the early 2010s had decided to stop explaining the core principles of EA, and instead made an argument like:
Effective charities are, or could very plausibly be, very effective.
Effective charities are effective enough that donating to them is a clear and enormous opportunity to do good.
The above is sufficient to motivate people to take high-priority paths, like earning to give. We don’t need to emphasise more complicated things like rigorous research, scope sensitivity, and expected value-based reasoning.
This argument probably differs from yours in important respects, but it illustrates the point. If we had started using the above argument instead of explaining the core principles of EA, it might have taken a lot longer for the EA movement to identify x-risks/transformative tech as a top priority. This all seems pretty new in the grand scheme of things, so I guess I expect our priorities to change a lot.
But then again, things haven’t changed that much recently, so I’m convinced by:
many EA-first and longtermist-first people are, in practice, primarily concerned about imminent x-risk and transformative technology, have been that way for a while, and (I think) anticipate staying that way.
Thanks for writing this!
Just to say that the CEA events team has seen this. We’ve actually already implemented a few of these things since you drafted the post—e.g. EAGxCambridge is specifically for people in the UK and has a hard application deadline (this was in part thanks to the draft of this post you sent me!). We now also have an FAQ section on every event page (instead of one FAQ tucked away on the site).
I’ll reply in more detail soon.
Thanks for writing this up!
Like ThomasW, I think this reads like the “Berkeley take on things” (which you do acknowledge, thanks), and if you lived in a different EA hub, you’d say different things. Being in Oxford, I’d say many are focused on longtermism, but not necessarily AI safety per se.

Claim 2, in particular, feels a little strong to me. If the claim were “Many EA leaders believe that making AI go well is one of our highest priorities, if not the highest priority”, I think this would be right.
I also think a true background claim I wish was here is “there are lots of EAs working on lots of different things, and many disagree with each other. Many EAs would disagree with several of my own claims here”.
cool effort amigo
Ollie here from the CEA events team, thanks for this nudge. We’re planning on sharing an update with respect to our costs here later this year. You can also see my recent sequence about the costs of EAGx and how we prioritise among events (this doesn’t cover EA Global though).
I’m pretty excited about picnics.
Picnics, or more specifically, free, inclusive events which take place outdoors, probably with cheap or bring-your-own food, seem like a great EA community event format:
They’re cheap—venue and food are often the most expensive line items for events, but this format radically reduces the cost for both.
They help attendees connect—connections are one of the key sources of value from EAG/x events, and picnics help people connect without any frills.
They’re easy to scale—we see increasing returns to scale for EA community-building events, and picnics allow you to reach a lot of people without much additional work per attendee (assuming you choose a large enough park).
They’re relaxed—no admissions, no stages, no microphones, soft grass and hopefully sun. Seems like a great environment to meet other people in.
They’re good for the COVID-cautious—no masks required!
Obviously, this isn’t my idea: EA NYC and EA Oxford held them recently and they seemed well-attended, and there’s another one in SF this weekend. I just wanted to give this idea a shout-out. There could be value in something like an “EA picnic day” where a tonne of EA groups host a picnic on the same day, one in every major city.
Next month, two EAGx events are happening in new locations: Austin and Copenhagen!
Applications for these events are closing soon:
Apply to EAGxAustin by this Sunday, March 31
Apply to EAGxNordics by April 7
These conferences are primarily for people who are at least familiar with the core ideas of effective altruism and are interested in learning more about what to do with these ideas. We’re particularly excited to welcome people working professionally in the EA space to connect with others nearby and provide mentorship to those new to the space.
If you want to attend but are unsure about whether to apply, please err on the side of applying!
If you’ve applied to attend an EA Global or EAGx event before, you can use the same application for either event.
I often hear (and sometimes think) that EA is still “mostly students” and that this means we need to do more outreach to “actual adults”. I checked, and 45% of my Twitter followers (EA-heavy, I think) thought the average EA was 25 or younger.
If EAG attendance is anything to go by, this picture seems basically false. The median EAG attendee is 28.2 years old (mean 29.2). EAGx is not that far behind, with a mean of 27. The average age of the 2022 EA survey respondent was 26.
Thanks for this piece, I really enjoyed it.
I want to hold out that Eli sweeping the offices is more truly heroic than Eli chasing after the biggest project or the most prestigious role or the highest status research area.
I also admire this orientation, props to Eli.
I note that you think the orientation is more important than the action but I do think that doing some marginally helpful task for an EA org is now slightly overrated by the community. I’d want to make salient the much larger class of unheroic yet valuable actions which one can take outside of the professional EA community, such as:
Making progress on some unexciting but helpful research question (like this post on coal seam fires).
Building the EA community where they are, especially if it’s outside of the US or the UK.
Pursuing a career in a more niche or risky cause area, even if it proves not to pay off.
I have a lot of respect for people who do/are doing the above, especially when they know it probably won’t secure them a place in the history books.
I’m disappointed that much of this document attacks the people who’ve accused you of harmful actions rather than focusing on disputing the evidence they provided (I appreciate that you also do the latter). I also really bounce off the distraction tactics at play here, where you encourage the reader to turn their attention back to the world’s problems. It doesn’t seem like you’ve reflected carefully and calmly on this situation; I don’t see many places where you admit to making mistakes, and it doesn’t seem like you’re willing to take ownership of this situation at all.
I don’t have time to engage with all the evidence here, but even if I came away convinced that none of the original claims provided by Ben were backed up, I would still feel really uneasy about Nonlinear: uneasy about your work culture, uneasy about how you communicate and argue, and alarmed at how forcefully you attack people who criticise you.