Hi! I work on the EA Global team and I post a lot of my thoughts on Twitter :)
frances_lorenz
Hello :) I currently work as an Events Associate on the EA Global (EAG) team, a subset of the Events team. I joined in January 2023 (with no prior events experience). I’m incredibly excited for the team to expand, so I thought I might share a bit about my experience so far, for anyone who’s unsure whether to apply.
What I love about working on the team:
I think there’s an implicit motto of, “take the serious stuff seriously and otherwise have fun.” We use charitable funding to run events with the goal of helping others do good in the world, in alignment with effective altruism and its principles, and that’s something the team takes very seriously. But events are also fun!! My colleagues are low-ego, encouraging, supportive, and relaxed; I genuinely adore spending time with them, and the production room at an event is non-stop jokes. Amy (head of the team) usually brings her baby (Charley) and I get to hold him while monitoring Slack. During EAG, I might need to be on-site 14 hours a day, but the atmosphere is so nice that it doesn’t really feel like work.
I’m deeply motivated by how tangible the product is. You can see the result of your work right in front of you: people all around, often happy or excited and chatting. It’s energetic and lovely.
It can be difficult to find experienced mentors in EA — the team’s managers are experts in events, incredibly supportive, and very present / hands-on. At the same time, we’re still a smaller team, so everyone takes on significant responsibility and ownership. I’ve really thrived in this environment, developing skills rapidly.
What I find difficult:
There’s a “sprinty” nature to the work; the lead-up to events is hectic. I expect this to improve with new hires, but to some degree it’s probably unavoidable: some tasks can’t be done beforehand, we have a lot of attendees to support, and the event’s timing is rigid, so everything has to be ready on time. As a result, EA Global takes over most of my brain space in the few weeks before. It’s incredibly difficult not to constantly check Slack/email, and at night I’m usually ticking through tasks in my head; there are tons of little things to track.
I live in the UK, and I find traveling to the US twice a year for EA Global hard (though it is also a huge privilege). Some roles on the team involve more travel, and there’s typically at least one additional US trip required per year for a company retreat. I struggle with insomnia; I’ve found that the jet lag throws me off for weeks or can trigger a full insomnia episode. I also find flying a bit scary. Flying with my team helps. I’m pretty sure my manager prefers to be alone on flights, but she lets me choose a seat next to her and yap for like 7 hours straight, which I really appreciate.
y’all really are experts :’)
Hey Patrick! My name is Frances and I work on the EA Global team :) About two weeks before the event, we’ll send an email inviting everyone to our conference app (Swapcard). Swapcard will have the event agenda and allow you to book meetings with other attendees. If you have any further questions, please email hello@eaglobal.org and we’ll be very happy to help.
Hey Vasco, thanks for the question! This is an idea we’ve looked into quite a bit. There are some unresolved considerations (e.g. whether it makes sense for CEA to run an event like this), but the idea is still on our radar.
80,000 Hours has a great 2018 article on Operations management roles, which includes a ‘How to assess your fit’ section (I’ll link to it at the bottom of this take). Having worked on the EA Global team for a year now, here are two important traits I would add for assessing fit:
1) Good at task-switching. I think it’s pretty crucial that task-switching isn’t super costly for you and that you can do it relatively quickly. Otherwise, I imagine many Ops roles will be quite tiring/frustrating. It might be particularly emphasised in my role, but as an anecdote: in the lead-up to an event, my days consist of working through maybe 10+ small-to-medium planned tasks, with a ton of small, unplanned tasks in between (i.e. monitoring Slack/email and responding if a message takes priority). I once mentioned this to two friends and they instinctively said, “I’m really sorry,” so I suspect reactions to this are a useful fit heuristic.
2) Responsive. This one is from a conversation with my team, and I concur—it’s really standout if you can respond to people quickly. This goes hand-in-hand with task-switching (i.e. when someone messages you, how costly is it for you to stop what you’re doing and respond?). It also necessitates being calibrated on how long tasks take (I’ll explain) and not hating messaging people. The level of responsiveness necessary and how often you get pinged will vary by role. I’m guessing for most Ops roles, a day or two of response time is great. For some, you’ll generally need to respond within the same working day (i.e. within minutes or hours). Whether necessary or not, I think achieving this is a huge asset to any team (assuming your other work doesn’t suffer and you’re prioritising well). It means you’re: 1) quickly unblocking others; and 2) relieving the message-sender of the mental load of tracking their own request.

As a note on mental load, over-communication is almost always best in Ops roles. You might open a message and think, “I can’t get to this until tomorrow”—it’s useful to train the habit of saying that rather than just making a note to yourself. Your coworkers are then relieved of tracking the request (though crucially, it’s important to meet the timeline you set or communicate changes). In an ideal world, your co-workers never track the tasks/requests they send because you’re handling that (i.e. responding quickly or providing timelines and updates automatically).
80,000 Hours article: https://80000hours.org/articles/operations-management/#how-to-assess-fit
I’ll commit to not commenting more now unless I’ve gotten something really wrong or it’s really necessary or something :’)
Yeah, I don’t necessarily mind an informal tone. But the reality is, I read [edit: a bit of] the appendix doc and I’m thinking, “I would really not want to be managed by this team and would be very stressed if my friends were being managed by them. For an organisation, this is really dysfunctional.” And not in an “understandably risky experiment gone wrong” kind of way, which is how some people are thinking about this, but in a “systematically questionable judgement as a manager” way. Although there may be good spin-off convos around “how risky orgs should be” and such. And maybe the point of this post isn’t to say, “Nonlinear did a reasonably sufficient job managing employees and can expect to do so in the future,” but rather, “I feel slandered and lied about and I want to share my perspective.”
But you see how they provide approximately no additional evidence, right? Photos give no account of how long someone was away or not, etc. Basically, these photos can exist in both Alice/Chloe’s world and your world. One of them is just Alice sitting on a beach chair? And on the second point, I don’t believe the claim was that the environment was materially poor (please tell me if I’m wrong).
I think this comment will be frustrating for you and is not high quality. Feel free to disagree; I’m including it because I think it’s possible many people (or at least some?) will feel wary of this post early on, and it might not be clear why. In my opinion, including a photo section was surprising and came across as near-completely misunderstanding the nature of Ben’s post. It is going to make it a bit hard to read any further with even-handed consideration (edit: for me personally, but I’ll just take a break and come back or something). Basically, without any claim on what happened, I don’t think anyone suspects “isolated or poor environment” to mean “absence of group photos in which [the claimed] isolated person is at a really pretty pool or beach doing pool yoga.” And if someone is psychologically distressed, whether you believe this to be a misunderstanding or maliciously exaggerated, it feels like a really icky move to start posting pictures that add no substance, even with faces blurred, with the caption “s’mores”, etc.
What a fantastic post, thank you so so much for writing this.
1. I don’t often get to hear from people in EA who have deeply committed to one path to impact and have long-term experience with it. It’s incredibly valuable to hear from someone who has built up so much context around the path and can describe it in different phases, rather than the shorter stints I more often hear about (which are valuable in their own way of course, but more common).

2. Yeah, I’ve been involved since 2019-ish and never considered earning-to-give, yet I distinctly noticed and remember the tonal shift against it that seemed to crop up out of nowhere (partly because I wasn’t consciously following EtG advice at all, so when ideas around it reached me I was like, oh, the vibe is negative now?). It felt distinctly negatively valenced rather than just a neutral “we no longer recommend this.” I imagine this did feel like suddenly being “turned on,” and I appreciate you bringing attention to that experience. I’m pretty sad to hear that.
THANK YOU thank you for all the money you and your wife have given.
Hey Jonny, thanks so much for pointing that out, that’s my bad!! I’ve replaced the link with hopefully a more helpful resource :D
Oh that’s totally okay, thanks for clarifying!! And good to get more feedback because I was/am still trying to collect info on how accessible this is
This is really good to know, thank you!! I’m thinking we hit more of a ‘familiar with some technical concepts/lingo’ accessibility level, rather than being accessible to people who truly have little or no familiarity with the field/concepts.
Curious if that seems right or not (maybe some aspects of this post are just broadly confusing). I was hoping this could be accessible to anyone, so I’ll have to try to hit that mark better in the future.
Luke, thank you for always being so kind :)) I very much appreciate you sharing your thoughts!!
“sometimes people exclude short-term actions because it’s not ‘longtermist enough’”
That’s a really good point on how we see longtermism being pursued in practice. I would love to investigate whether others are feeling this way; I have certainly felt it myself in AI Safety. There’s some vague sense that current-day concerns (like algorithmic bias) are not really AI Safety research, although I’ve talked to some who think addressing these issues first is key to building towards alignment. I’m not even totally sure where this sense comes from, other than that fairness research is really not talked about much at all in safety spaces. Glad you brought this up, as it’s definitely important to field/community building.
Do you think that’s a factor of: how many places you could apply for longtermist vs. other cause area funding? How high the bar is for longtermist ideas vs. others? Something else?
Thank you, I really appreciate the breadth of this list, it gives me a much stronger picture of the various ways a longtermist worldview is being promoted.
Yeah, absolutely! Happy to go through posts offering career advice, how one might implement the advice, if there are any other perspectives to consider, etc.
I would really encourage having a low bar for sending people our way; we’re very happy to talk to anyone! But generally, we offer coaching to those trying to get into the AI Safety field (e.g. undergrads looking for research positions, software engineers or research scientists looking for work in the field, independent researchers or community-builders interested in applying for funding). We’re also happy to talk people through AI Safety career-related decisions (e.g. whether or not to go to graduate school, choosing between positions, etc.)
This is great advice :) I already mentioned this below, but for people in similar positions, please do consider booking a coaching call with AI Safety Support: https://www.aisafetysupport.org/. We have experience helping people navigate the AI Safety field and can also connect you to others.
Good idea :) thank you!
Hey there! I work on the EA Global team, thanks for the question :) At EAG London, each floor of the venue will have an all gender bathroom. For future reference, our team can always be reached by emailing hello@eaglobal.org (forum questions usually get flagged to us, but we don’t actively monitor the forum).