EAG talks are underrated IMO

Underrated is relative.[1] My position is something like “most people should consider going to >1 EAG talk,” not “most people should spend most of their EAG in talks.” This probably applies most to people who are kind of like me (been involved for a while, already have a strong network, don’t need to do 1-1s for their job).

There’s a meme that 1-1s are clearly the most valuable part of EAG(x) and that you shouldn’t really go to talks. (See e.g. this, this, and this; they don’t say exactly this, but I think they push in the direction of the meme.)

I think EAG talks can be really interesting and are underrated. It’s true that most of them are recorded and you could watch them later but I’m guessing most people don’t actually do that.[2] It also takes a while for them to be uploaded.

I still think 1-1s are pretty great, especially if you

  • are new and don’t know many people yet (or otherwise mostly want to increase the number of people you know), or

  • have a very specific thing you’re trying to get out of EAG, and talking to lots of people seems like the right way to achieve it.

I’m mostly writing this post because I think the meme is really strong in some parts of the EA community. I can imagine that some people would feel bad for attending talks because it doesn’t feel “optimal.”[2] If you feel like you need permission, I want to give you permission to go to talks without feeling bad. Another motivation is that I recently attended my first set of EAG talks in years (I was doing lots of 1-1s for my job before) and was really surprised by how great they were. (That said, they were a bit hit or miss.) I had somehow assumed that talks and other prepared sessions would give me ~nothing.

edit: Someone mentioned in person that they think EAG talks have just gotten much better recently, so I may simply have started going again at a lucky time. If you went in the past and were disappointed, now may be a good time to try again.

  1. ^

    See also the rule of equal and opposite advice (1, 2) although I haven’t actually read the posts I linked.

  2. ^

    My best guess is that people in EA are more biased towards taking actions that are part of a collectively “optimal” plan for [generic human with willpower and without any other properties] than taking actions that are good given realistic counterfactuals.