Two additional possible reasons:
Many people in the EA community believe it is easier to get a job at an EA organisation than it really is. People working at EA organisations, sometimes in senior positions, were surprised when they heard I didn’t get an offer (from another organisation). I’d guess around half the organisations I applied to were “surprised about the very strong field of applicants”. Past messaging about talent constraints probably also plays a role. As a result, career advice in the EA community can be overly optimistic, to the point where more than one person seriously encouraged me to apply for the COO position at OpenPhil (a position which went to the person who led operations for Hillary Clinton’s election campaign(!)). At least a year ago, when I was talking to dozens of people for career advice, I got the impression that it should be comparatively easy to get hired by an EA organisation.
This one is weirdly specific and only a minor point (so this comment should not be misconstrued as listing “the two main reasons people apply for (too) many positions at EA organisations”). I don’t know if this applies to many people, but I got quite a few heavily personalised invitations to apply for positions. I think I *heavily* over-weighted these as evidence that I would have a good chance in the application process. By now I see these invitations as very weak evidence at best, but when I got my first ones, I thought they meant I was halfway there. This was of course naive (and of course I wouldn’t think it meant anything if I got a personal letter from a for-profit company). But I am not alone in that. I recently talked to a friend who said, “By the way, I got a job offer now. Well, not really a job offer, but it is really close.” All they had gotten was a *very* personalised, well-written invitation to apply. But I would guess quite a few people had gotten one (me included). One easy way for EA organisations to avoid inducing this undue optimism would be to state transparently how many people they send personalised invitations to.
...
(PS: Your points 1 and 2 applied to me very much, but I didn’t get the impression that points 3-5 were the case (I didn’t think people consistently recommended EA orgs over other options))
Thanks for sharing your comment about personalized invitations, that’s interesting. At Open Phil, almost all our personalized invitations (even to people we already knew well) were only lightly personalized. But perhaps a noticeable fraction of people misperceived that as “high chance you’ll get the job if you apply,” or something. The Open Phil RA hiring committee is discussing this issue now, so thanks for raising it.
It sounds like this issue is at least fairly straightforward to address: in subsequent rounds OpenPhil could just include a blurb that more explicitly clarifies how many people they’re sending emails to, or something similar.
(I’ll note that this is a bit above and beyond what I think they are obligated to do. I received an email from Facebook once suggesting I apply to their lengthy application process, and I’m not under any illusions that it gave me more than a 5-10% chance of getting the job. But the EA world sort of feels like it’s supposed to be more personal, and I think including that sort of metadata would make for better overall information-and-resource flow.)
FWIW: I think I know of another example along these lines, although only second hand.
I didn’t think people consistently recommended EA orgs over other options
Interesting, thank you for this data point. My speculation was partly based on recently having talked to people who told me something like “you’re the first one [or one of very few among many] who doesn’t clearly recommend that I choose <EA org> over <some other good option>”. It’s good to know that this isn’t what always happens.
I have quantitative data on that :-)
I asked 10 people for career advice in a semi-structured way (I sent them the same document listing my options and asked them to provide rankings). These were all people I would say rank somewhere between “one of the top cause prioritization experts in the world” and “really, really knowledgeable about EA and very smart”.
6 out of 10 thought that research analyst at OpenPhil would be my best option. But after that, there was much less consensus on the second-best option (among my remaining three top options): 3.5 people rated management at an EA organisation highest, 3 rated biosecurity highest, and 3.5 rated an MSc in ML (with the aim of doing AI safety research) highest.
Of course, YOU were one of these ten people, so that might explain some of it :-).
I had many more informal discussions, and I didn’t think there was strong consensus either.
(Let me know if you need more data, I have many spreadsheets full of analysis waiting for you ;-) )