>I also think that (some) EA orgs do more to filter for competence than most companies; I’ve applied to many conventional jobs, and been accepted to some of those, but none of them put me through work testing anywhere near as rigorous and realistic as CEA’s.
I want to push back here based on recent experience. I recently applied for a job at CEA and was essentially told that my background and experience were a perfect fit and that I aced the application, but that I was not ‘tightly value aligned’ and thus would not be getting the role.
CEA certainly had an extensive, rigorous application process. They just proceeded to disregard the results of that process. I expect they will hire someone with demonstrably less competence and experience for the role I applied for, but who is more ‘value aligned’.
I would normally hesitate to air this sort of thing in public, but I feel this point needs to be pushed back against. As far as I can tell, this sort of thing is fairly endemic within EA orgs: there seems to be a strong preference for ideological purity over competence. I’ve heard similar stories from others. My small sample suggests not just that this happens, but that it happens *openly*.
To relate this to the OP’s main thesis: this is a problem other fields (for instance, politics) have already confronted, and we know how it plays out. It’s fairly easy to spot a political campaign staffed with ‘true believers’ as opposed to one with seasoned, hardened campaign veterans. True-believer campaigns crash and burn at a much higher rate, controlling for other factors, because they don’t value basic competence. A common feature of long-time pols who win over and over is that they don’t staff for agreement; they staff for experience and the ability to win.
EA is going to learn the same lesson at some point; it’s just a matter of how painful that learning is going to be. There are things EA as a movement is simply not good at, and it would be far better off bringing in non-aligned outsiders with extensive experience than hiring internally and trying to reinvent the wheel.
Hi Jeremiah. I was the hiring manager here, and I think there’s been something of a misunderstanding: I don’t think this is an accurate summary of why we made the decision we did. It feels weird to discuss this in public, but I consent to you publishing the full rejection email we sent, if you would like.
I don’t particularly feel it would be a valuable use of anyone’s time to get into a drawn-out public back-and-forth where we both nitpick the implications of various word choices. I’ll just say that if your intention was to communicate something other than “We prefer a candidate who is tightly value aligned”, then there was a significant failure of communication, and you shouldn’t have used the phrase ‘tightly aligned’ in the same sentence as the rejection.
If the issue is that CEA communicated poorly or you misunderstood the rejection, I agree that’s not necessarily worth getting into. But you’ve made a strong claim about how CEA makes decisions, based on the contents of a message whose author is willing to make it public. It looks to me like you essentially have two choices:
1. Agree to make the message public, or
2. Accept that onlookers will interpret this as an admission that your claim was exaggerated.
I’m strongly downvoting the parent comment for now, since I don’t think it should be particularly visible. I’ll reverse the downvote if you release the rejection letter and it is as you’ve represented it.
I’m sorry you had such a frustrating experience. The work of yours I’ve seen has been excellent, and I hope you find a place to use your skills within EA (or keep crushing it in other places where you have the chance to make an impact).
Some hiring processes definitely revolve around value alignment, but I also know a lot of people hired at major orgs who didn’t have any particular connection to EA, and who seem to just be really good at what they do. “Hire good people, alignment is secondary” still feels like the most common approach based on the processes I’ve seen and been involved with, but my data is anecdata (and may be less applicable to meta-focused positions, I suppose).