I think that most Forum users would agree with most of what you've written here, and I don't see much in the post that I consider controversial (maybe the peer review claim?).
I gave the post an upvote, and I'm glad someone wrote down all these sensible points in one place, but I found the conventional wisdom and the vibe of "this will surprise you" to be an odd mix. (Though I might be more normie relative to the EA community than I realize?)
*****
As is the nature of comments, these will mostly be pushback/nitpicks, but the main thrust of the post reads well to me. You generally shouldn't take Forum posts as seriously as peer-reviewed papers in top journals, and you shouldn't expect your philosophy friends to take the place of a strong professional network unless there are unusual circumstances in play.
(That said, I love my philosophy friends, and I'm glad I have a tight-knit set of interesting people to hang out with whenever I'm in the right cities; that's not an easy thing to find these days. I also think the people I've met in EA are unusually good at many conventional virtues: honesty, empathy, and curiosity come to mind.)
If you want to be sure that a job is done well, don't hire an EA fresh out of college. Hire someone with a strong track record that a conventional HR department would judge as demonstrably competent. Companies hire people all the time who are not "aligned" with them, i.e. not philosophically motivated to maximize the company's profit, and it works out fine.
I've seen this debate play out many times online, but empirically, it seems to me like EA-ish orgs with a lot of hiring power (large budgets, strong brands) are more likely than other EA-ish orgs to hire people with strong track records and relevant experience.[1]
My guess is that other EA-ish orgs would also like to hire such people in many cases, but find them to be in short supply!
People with a lot of experience and competence command high salaries and can be very selective about what they do. Naively, it seems hard to convince such people to take jobs at small charities with low budgets and no brand recognition.
When I put up a job posting for a copywriting contractor at CEA, my applicants were something like:
60% people with weak-to-no track records and no EA experience/alignment
30% people with weak-to-no track records and some EA experience/alignment
10% people with good track records (some had EA experience, others didn't)
Fortunately, I had a bunch of things going for me:
CEA was a relatively established org with a secure budget; we gave a reliable impression and I could afford to pay competitive rates.
Writing and editing are common abilities that are easy to test.
This meant I had plenty of people to choose from (10% was 20 applicants), and it was easy to see whether they had skills to go with their track records. I eventually picked an experienced professional from outside the EA community, and she did excellent work.
A lot of jobs aren't like this. If you're trying to find someone for "general operations", that's really hard to measure. If you're trying to find a fish welfare researcher, the talent pool is thin. I'd guess that a lot of orgs just don't have many applicants a conventional HR department would love, so they fall back on young people with good grades and gumption who seem devoted to their missions (in other words, "weak-to-no track records and some EA experience/alignment").
*****
I also think that (some) EA orgs do more to filter for competence than most companies; I've applied to many conventional jobs, and been accepted to some of those, but none of them put me through work testing anywhere near as rigorous and realistic as CEA's.
*****
If "grown-ups" had been involved at the officer level at FTX, I claim fraud probably would not have occurred. I can't say I predicted FTX's collapse, but I didn't know it was being run by people with no experience.
I'm not sure about the "probably" here: there are many counterexamples. Ken Lay and Jeff Skilling are the first who come to mind (lots of experience, lots of fraud).
If you think that FTX was fraud borne of the need to cover up dumb mistakes, that could point more toward "experience would have prevented the dumb mistakes, and thus the fraud". And lack of experience is surely correlated with bad business outcomes...
...but also, a lot of experienced businesspeople drove their own crypto companies to ruin. I grabbed the first non-FTX crypto fraud I found for comparison, and found Alex Mashinsky at the helm. (Not an experienced financial services guy by any means, but he had decades of business experience and was the CEO of a company with nine-figure revenue well before founding Celsius.)
Many small organizations are founded by EAs who would not succeed at being hired to run an analogous non-EA organization of similar scope and size. I (tentatively) think that these organizations, which are sometimes given the outrageously unsubstantiated denomination "effective organization", are mostly ineffective.
As people I've edited can attest, I've been fighting against the "effective" label (for orgs, cause areas, etc.) for a long time. I'm glad to have others alongside me in the fight!
Better alternatives: "Promising", "working in a promising area", "potentially highly impactful", "high-potential"… you can't just assume success before you've started.
On the "would not succeed" point: I think that this is true of EA orgs, and also all other types of orgs. Most new businesses fail, and I'm certain the same is true of new charities. This implies that most founders are bad at running organizations, relative to the standard required for success.
(It could also imply that EA should have fewer and larger orgs, but that's a question too complicated for this comment to cover.)
Open Philanthropy once ran a research hiring round with something like a thousand applicants; by contrast, when I applied to be Stuart Russell's personal assistant at the relatively new CHAI in 2018, I think I was one of three applicants.
"I've seen this debate play out many times online, but empirically, it seems to me like EA-ish orgs with a lot of hiring power (large budgets, strong brands) are more likely than other EA-ish orgs to hire people with strong track records and relevant experience."
Based on speaking to people at EA orgs (and looking at the orgs' staff lists), I disagree with this. When I have spoken to employees at CEA and Open Phil, the people I've spoken to have either (a) expressed frustration about how focused their org is on hiring EA people for roles that seem to not need it or (b) defended hiring EAs for roles that seem to not need it. (I'm talking about roles in ops, personal assistants, events, finance, etc.)
Maybe I agree with your claim that large EA orgs hire more "diversely" than small EA orgs, but I disagree with what I read as your implication (that large EA orgs do not prioritize value-alignment over experience). I read this as your implication because the point you're responding to isn't focused on large vs. small orgs.
I could point to specific teams/roles at these orgs which are held by EAs even though they don't seem like they obviously need to be held by EAs. But that feels a little mean and targeted, like I'm implying those people are not good for their jobs or something (which is not my intent for any specific person). And I think there are cases for wanting value-alignment in non-obvious roles, but the question is whether the tradeoff in experience is worth it.
You generally shouldn't take Forum posts as seriously as peer-reviewed papers in top journals
I suspect I would advise taking them less seriously than you would advise, but I'm not sure.
It could also imply that EA should have fewer and larger orgs, but that's a question too complicated for this comment to cover
I think there might be a weak conventional consensus in that direction, yes. By looking at the conventional wisdom on this point, we don't have to deal with the complexity of the question; that's kind of my whole point. But even more importantly, perhaps fewer EA orgs that are not any larger; perhaps only two EA orgs (I'm thinking of 80k and OpenPhil; I'm not counting CHAI as an EA org). There is not some fixed quantity of people that need to be employed in EA orgs! Conventional wisdom would suggest, I think, that EAs should mostly be working at normal, high-quality organizations/universities, getting experience under the mentorship of highly qualified (probably non-EA) people.
I suspect I would advise taking them less seriously than you would advise, but I'm not sure.
The range of quality in Forum posts is… wide, so it's hard to say anything about them as a group. I thought for a while about how to phrase that sentence and could only come up with the mealy-mouthed version you read.
But even more importantly, perhaps fewer EA orgs that are not any larger.
Maybe? I'd be happy to see a huge number of additional charities at the "median GiveWell grantee" level, and someone has to start those charities. It doesn't have to be people in EA (maybe the talent pool is simply too thin right now), but there's plenty of room for people to create organizations focused on important causes.
(But maybe you're talking about meta orgs only, in which case I'd need a lot more community data to know how I feel.)
Conventional wisdom would suggest, I think, that EAs should mostly be working at normal, high-quality organizations/universities, getting experience under the mentorship of highly qualified (probably non-EA) people.
I agree, and I also think this is what EA people are mostly doing.
When I open Swapcard for the most recent EA Global, and look at the first 20 attendees alphabetically (with jobs listed), I see:
Seven people in academia (students or professors); one is at the Global Priorities Institute, but it still seems like "studying econ at Oxford" would be a good conventional-wisdom thing to do (I'd be happy to yield on this, though)
Six people working in conventional jobs (this includes one each from Wave and Momentum, but despite being linked to EA, both are normal tech companies, and Wave at least has done very well by conventional standards)
One person in policy
Six people at nonprofit orgs focused on EA things
Glancing through the rest of the list, I'd say it leans toward more "EA jobs" than not, but this is a group that is vastly skewed in favor of "doing EA stuff" compared to the broad EA community as a whole, and it's still not obvious that the people with EA jobs/headed for EA jobs are a majority.
(The data gets even messier if you're willing to count, say, an Open Philanthropy researcher as someone doing a conventionally wise thing, since you seem to think OP should keep existing.)
Overall, I'd guess that most people trying to maximize their impact with EA in mind are doing so via policy work, earning-to-give,[1] or other conventional-looking strategies; this just gets hidden by the greater visibility of people in EA roles.
I'd love to hear counterarguments to this; I've held this belief for a long time, it feels uncommon, and there's a good chance I'm just wrong.
This isn't a conventional way to use money, but the part where you earn money is probably very conventional (get professional skill, use professional skill in expected way, climb the ladder of your discipline).
This is very high-quality. No disputes, just clarifications.
I don't just mean meta-orgs.
I think working for a well-financed grantmaking organization is not outrageously unconventional, although I suspect most lean on part-time work from well-respected academics more than OpenPhil does.
And I think 80k may just be an exception (a minor one, to some extent), borne out of an unusually clear gap in the market. I think some of their work should be done in academia instead (basically whatever work it's possible to do), but some of the very specific stuff like the jobs board wouldn't fit there.
Also, if we imagine an Area Dad from an Onion Local News article, I don't think his skepticism would be quite as pronounced for 80k as for other orgs like, e.g., an AI Safety camp.
Yeah, I'm not sure that people prioritizing the Forum over journal articles is a majority view, but it is definitely something that happens, and there are currents in EA that encourage this sort of thinking.
I'm not saying we should not be somewhat skeptical of journal articles. There are huge problems in the peer-review world. But forum/blog posts, and what your friends say, are not more reliable. And it is concerning that some elements of EA culture encourage you to think that they are.
Evidence for my claim, based on replies to some of Ineffective Altruism's tweets (an account that makes a similar critique):
1: https://twitter.com/IneffectiveAlt4/status/1630853478053560321?s=20 (look at the replies in this thread)
2: https://twitter.com/NathanpmYoung/status/1630637375205576704?s=20 (look at all the various replies in this thread)
(If it is inappropriate for me to link to people's Twitter replies in a critical way, let me know. I feel a little uncomfortable doing this, because my point is not to name and shame any particular person. But I'm doing it because it seems worth pushing back against the claim that "this doesn't happen here." I do not want to post a name-blurred screenshot because I think all replies in the thread are valuable information, not just the replies I share, so I want to enable people to click through.)
>I also think that (some) EA orgs do more to filter for competence than most companies; I've applied to many conventional jobs, and been accepted to some of those, but none of them put me through work testing anywhere near as rigorous and realistic as CEA's.
I want to push back here based on recent experience. I recently applied for a job at CEA and was essentially told that my background and experience were a perfect fit and that I aced the application, but that I was not "tightly value aligned" and thus would not be getting the role.
CEA certainly had an extensive, rigorous application process. They just proceeded to not value the results of that application process. I expect they will hire someone with demonstrably less competence and experience for the role I applied for, but who is more "value aligned".
I would normally hesitate to air this sort of thing in public, but I feel this point needs to be pushed back against. As far as I can tell, this sort of thing is fairly endemic within EA orgs: there seems to be a strong, strong preference for valuing ideological purity as opposed to competence. I've heard similar stories from others. My small sample size is not just that this happens, but that it happens *openly*.
To relate this to the OP's main thesis: this is a problem other areas (for instance, politics) have already seen and confronted, and we know how it plays out. It's fairly easy to spot a political campaign staffed with "true believers" as opposed to one with seasoned, hardened campaign veterans. True Believer campaigns crash and burn at a much higher rate, controlling for other factors, because they don't value basic competence. A common feature of long-time pols who win over and over is that they don't staff for agreement; they staff for experience and ability to win.
EA's going to learn the same lesson at some point; it's just a matter of how painful that learning is going to be. There are things EA as a movement is simply not good at, and they'd be far better off bringing in non-aligned outsiders with extensive experience than hiring internally and trying to re-invent the wheel.
Hi Jeremiah. I was the hiring manager here, and I think there's been something of a misunderstanding: I don't think this is an accurate summary of why we made the decision we did. It feels weird to discuss this in public, but I consent to you publishing the full rejection email we sent, if you would like.
I don't particularly feel it would be a valuable use of anyone's time to get into a drawn-out public back-and-forth debate where we both nitpick the implications of various word choices. I'll just say that if your intention was to communicate something other than "We prefer a candidate who is tightly value aligned", then there was a significant failure of communication, and you shouldn't have specifically used the phrase "tightly aligned" in the same sentence as the rejection.
If the issue is that CEA communicated poorly or you misunderstood the rejection, I agree that's not necessarily worth getting into. But you've made a strong claim about how CEA makes decisions based on the contents of a message that its author is willing to make public. It looks to me like you essentially have two choices:
Agree to make the message public, or
Accept that onlookers will interpret this as an admission that your claim was exaggerated.
I'm strongly downvoting the parent comment for now, since I don't think it should be particularly visible. I'll reverse the downvote if you release the rejection letter and it is as you've represented.
I'm sorry you had such a frustrating experience. The work of yours I've seen has been excellent, and I hope you find a place to use your skills within EA (or keep crushing it in other places where you have the chance to make an impact).
Some hiring processes definitely revolve around value alignment, but I also know a lot of people hired at major orgs who didn't have any particular connection to EA, and who seem to just be really good at what they do. "Hire good people, alignment is secondary" still feels like the most common approach based on the processes I've seen and been involved with, but my data is anecdata (and may be less applicable to meta-focused positions, I suppose).