I think that most Forum users would agree with most of what you’ve written here, and I don’t see much in the post that I consider controversial (maybe the peer-review claim?).
I gave the post an upvote, and I’m glad someone wrote down all these sensible points in one place, but I found the conventional wisdom and the vibe of “this will surprise you” to be an odd mix. (Though I might be more normie relative to the EA community than I realize?)
*****
As is the nature of comments, these will mostly be pushback/nitpicks, but the main thrust of the post reads well to me. You generally shouldn’t take Forum posts as seriously as peer-reviewed papers in top journals, and you shouldn’t expect your philosophy friends to take the place of a strong professional network unless there are unusual circumstances in play.
(That said, I love my philosophy friends, and I’m glad I have a tight-knit set of interesting people to hang out with whenever I’m in the right cities; that’s not an easy thing to find these days. I also think the people I’ve met in EA are unusually good at many conventional virtues — honesty, empathy, and curiosity come to mind.)
If you want to be sure that a job is done well, don’t hire an EA fresh out of college. Hire someone with a strong track record that a conventional HR department would judge as demonstrably competent. Companies hire people all the time who are not “aligned” with them, i.e. not philosophically motivated to maximize the company’s profit, and it works out fine.
I’ve seen this debate play out many times online, but empirically, it seems to me like EA-ish orgs with a lot of hiring power (large budgets, strong brands) are more likely than other EA-ish orgs to hire people with strong track records and relevant experience.[1]
My guess is that other EA-ish orgs would also like to hire such people in many cases, but find them to be in short supply!
People with a lot of experience and competence command high salaries and can be very selective about what they do. Naively, it seems hard to convince such people to take jobs at small charities with low budgets and no brand recognition.
When I put up a job application for a copywriting contractor at CEA, my applicants were something like:
60% people with weak-to-no track records and no EA experience/alignment
30% people with weak-to-no track records and some EA experience/alignment
10% people with good track records (some had EA experience, others didn’t)
Fortunately, I had a bunch of things going for me:
CEA was a relatively established org with a secure budget; we came across as reliable, and I could afford to pay competitive rates.
Writing and editing are common abilities that are easy to test.
This means I had plenty of people to choose from (10% was 20 applicants), and it was easy to see whether they had skills to go with their track records. I eventually picked an experienced professional from outside the EA community, and she did excellent work.
A lot of jobs aren’t like this. If you’re trying to find someone for “general operations”, that’s really hard to measure. If you’re trying to find a fish welfare researcher, the talent pool is thin. I’d guess that a lot of orgs just don’t have many applicants a conventional HR department would love, so they fall back on young people with good grades and gumption who seem devoted to their missions (in other words, “weak-to-no track records and some EA experience/alignment”).
*****
I also think that (some) EA orgs do more to filter for competence than most companies; I’ve applied to many conventional jobs, and been accepted to some of those, but none of them put me through work testing anywhere near as rigorous and realistic as CEA’s.
*****
If “grown-ups” had been involved at the officer level at FTX, I claim fraud probably would not have occurred. I can’t say I predicted FTX’s collapse, but I did know it was being run by people with no experience.
I’m not sure about the “probably” here: there are many counterexamples. Ken Lay and Jeff Skilling are the first who come to mind — lots of experience, lots of fraud.
If you think that FTX was fraud borne of the need to cover up dumb mistakes, that could point more toward “experience would have prevented the dumb mistakes, and thus the fraud”. And lack of experience is surely correlated with bad business outcomes...
...but also, a lot of experienced businesspeople drove their own crypto companies to ruin — I grabbed the first non-FTX crypto fraud I found for comparison, and found Alex Mashinsky at the helm. (Not an experienced financial services guy by any means, but he had decades of business experience and was the CEO of a company with nine-figure revenue well before founding Celsius.)
Many small organizations are founded by EAs who would not succeed at being hired to run an analogous non-EA organization of similar scope and size. I (tentatively) think that these organizations, which are sometimes given the outrageously unsubstantiated denomination “effective organization”, are mostly ineffective.
As people I’ve edited can attest, I’ve been fighting against the “effective” label (for orgs, cause areas, etc.) for a long time. I’m glad to have others alongside me in the fight!
Better alternatives: “Promising”, “working in a promising area”, “potentially highly impactful”, “high-potential”… you can’t just assume success before you’ve started.
On the “would not succeed” point: I think that this is true of EA orgs, and also all other types of orgs. Most new businesses fail, and I’m certain the same is true of new charities. This implies that most founders are bad at running organizations, relative to the standard required for success.
(It could also imply that EA should have fewer and larger orgs, but that’s a question too complicated for this comment to cover.)
Open Philanthropy once ran a research hiring round with something like a thousand applicants; by contrast, when I applied to be Stuart Russell’s personal assistant at the relatively new CHAI in 2018, I think I was one of three applicants.
“I’ve seen this debate play out many times online, but empirically, it seems to me like EA-ish orgs with a lot of hiring power (large budgets, strong brands) are more likely than other EA-ish orgs to hire people with strong track records and relevant experience.”
Based on speaking to people at EA orgs (and looking at the orgs’ staff lists), I disagree with this. When I have spoken to employees at CEA and Open Phil, the people I’ve spoken to have either (a) expressed frustration about how focused their org is on hiring EA people for roles that seem to not need it or (b) defended hiring EAs for roles that seem to not need it. (I’m talking about roles in ops, personal assistants, events, finance, etc.)
Maybe I agree with your claim that large EA orgs hire more “diversely” than small EA orgs, but what I read as your implication (large EA orgs do not prioritize value-alignment over experience), I disagree with. I read this as your implication since the point you’re responding to isn’t focusing on large vs. small orgs.
I could point to specific teams/roles at these orgs which are held by EAs even though they don’t seem like they obviously need to be held by EAs. But that feels a little mean and targeted, like I’m implying those people are not good for their jobs or something (which is not my intent for any specific person). And I think there are cases for wanting value-alignment in non-obvious roles, but the question is whether the tradeoff in experience is worth it.
You generally shouldn’t take Forum posts as seriously as peer-reviewed papers in top journals
I suspect I would advise taking them less seriously than you would advise, but I’m not sure.
It could also imply that EA should have fewer and larger orgs, but that’s a question too complicated for this comment to cover
I think there might be a weak conventional consensus in that direction, yes. By looking at the conventional wisdom on this point, we don’t have to deal with the complexity of the question—that’s kind of my whole point. But even more importantly, perhaps fewer EA orgs that are not any larger; perhaps only two EA orgs (I’m thinking of 80k and OpenPhil; I’m not counting CHAI as an EA org). There is not some fixed quantity of people that need to be employed in EA orgs! Conventional wisdom would suggest, I think, that EAs should mostly be working at normal, high-quality organizations/universities, getting experience under the mentorship of highly qualified (probably non-EA) people.
I suspect I would advise taking them less seriously than you would advise, but I’m not sure.
The range of quality in Forum posts is… wide, so it’s hard to say anything about them as a group. I thought for a while about how to phrase that sentence and could only come up with the mealy-mouthed version you read.
But even more importantly, perhaps fewer EA orgs that are not any larger.
Maybe? I’d be happy to see a huge number of additional charities at the “median GiveWell grantee” level, and someone has to start those charities. Doesn’t have to be people in EA — maybe the talent pool is simply too thin right now — but there’s plenty of room for people to create organizations focused on important causes.
(But maybe you’re talking about meta orgs only, in which case I’d need a lot more community data to know how I feel.)
Conventional wisdom would suggest, I think, that EAs should mostly be working at normal, high-quality organizations/universities, getting experience under the mentorship of highly qualified (probably non-EA) people.
I agree, and I also think this is what EA people are mostly doing.
When I open Swapcard for the most recent EA Global, and look at the first 20 attendees alphabetically (with jobs listed), I see:
Seven people in academia (students or professors); one is at the Global Priorities Institute, but it still seems like “studying econ at Oxford” would be a good conventional-wisdom thing to do (I’d be happy to yield on this, though)
Six people working in conventional jobs (this includes one each from Wave and Momentum, but despite being linked to EA, both are normal tech companies, and Wave at least has done very well by conventional standards)
One person in policy
Six people at nonprofit orgs focused on EA things
Glancing through the rest of the list, I’d say it leans toward more “EA jobs” than not, but this is a group that is vastly skewed in favor of “doing EA stuff” compared to the broad EA community as a whole, and it’s still not obvious that the people with EA jobs/headed for EA jobs are a majority.
(The data gets even messier if you’re willing to count, say, an Open Philanthropy researcher as someone doing a conventionally wise thing, since you seem to think OP should keep existing.)
Overall, I’d guess that most people trying to maximize their impact with EA in mind are doing so via policy work, earning-to-give,[1] or other conventional-looking strategies; this just gets hidden by the greater visibility of people in EA roles.
I’d love to hear counterarguments to this — I’ve held this belief for a long time, it feels uncommon, and there’s a good chance I’m just wrong.
This isn’t a conventional way to use money, but the part where you earn money is probably very conventional (get professional skill, use professional skill in expected way, climb the ladder of your discipline).
This is very high-quality. No disputes, just clarifications.
I don’t just mean meta-orgs.
I think working for a well-financed grantmaking organization is not outrageously unconventional, although I suspect most lean on part-time work from well-respected academics more than OpenPhil does.
And I think 80k may just be an exception (a minor one, to some extent), borne out of an unusually clear gap in the market. I think some of their work should be done in academia instead (basically whatever work it’s possible to do), but some of the very specific stuff like the jobs board wouldn’t fit there.
Also, if we imagine an Area Dad from an Onion Local News article, I don’t think his skepticism would be quite as pronounced for 80k as for other orgs like, e.g., an AI Safety camp.
Yeah, I’m not sure that people prioritizing the Forum over journal articles is a majority view, but it is definitely something that happens, and there are currents in EA that encourage this sort of thinking.
I’m not saying we should not be somewhat skeptical of journal articles. There are huge problems in the peer-review world. But forum/blog posts, and what your friends say, are not more reliable. And it is concerning that some elements of EA culture encourage you to think that they are.
Evidence for my claim, based on replies to some of Ineffective Altruism’s tweets (an account that makes a similar critique):
1: https://twitter.com/IneffectiveAlt4/status/1630853478053560321?s=20 (look at the replies in this thread)
2: https://twitter.com/NathanpmYoung/status/1630637375205576704?s=20 (look at all the various replies in this thread)
(If it is inappropriate for me to link to people’s Twitter replies in a critical way, let me know. I feel a little uncomfortable doing this, because my point is not to name and shame any particular person. But I’m doing it because it seems worth pushing back against the claim that “this doesn’t happen here.” I do not want to post a name-blurred screenshot because I think all replies in the thread are valuable information, not just the replies I share, so I want to enable people to click through.)
>I also think that (some) EA orgs do more to filter for competence than most companies; I’ve applied to many conventional jobs, and been accepted to some of those, but none of them put me through work testing anywhere near as rigorous and realistic as CEA’s.
I want to push back here based on recent experience. I recently applied for a job at CEA and was essentially told that my background and experience were a perfect fit and that I aced the application, but that I was not ‘tightly value aligned’ and thus would not be getting the role.
CEA certainly had an extensive, rigorous application process. They just proceeded to not value the results of that application process. I expect they will hire someone with demonstrably less competence and experience for the role I applied for, but who is more ‘value aligned’.
I would normally hesitate to air this sort of thing in public, but I feel this point needs to be pushed back against. As far as I can tell this sort of thing is fairly endemic within EA orgs—there seems to be a strong, strong preference for valuing ideological purity as opposed to competence. I’ve heard similar stories from others. My small sample size is not just that this happens, but that it happens *openly*.
To relate this to the OP’s main thesis—this is a problem other areas (for instance, politics) have already seen and confronted, and we know how it plays out. It’s fairly easy to spot a political campaign staffed with ‘true believers’ as opposed to one with seasoned, hardened campaign veterans. True Believer campaigns crash and burn at a much higher rate, controlling for other factors, because they don’t value basic competence. A common feature of long-time pols who win over and over is that they don’t staff for agreement, they staff for experience and ability to win.
EA’s going to learn the same lesson at some point, it’s just a matter of how painful that learning is going to be. There are things EA as a movement is simply not good at, and they’d be far better off bringing in non-aligned outsiders with extensive experience than hiring internally and trying to re-invent the wheel.
Hi Jeremiah. I was the hiring manager here, and I think there’s been something of a misunderstanding: I don’t think this is an accurate summary of why we made the decision we did. It feels weird to discuss this in public, but I consent to you publishing the full rejection email we sent, if you would like.
I don’t particularly feel it would be a valuable use of anyone’s time to get into a drawn-out public back-and-forth debate where we both nitpick the implications of various word choices. I’ll just say that if your intention was to communicate something other than “We prefer a candidate who is tightly value aligned,” then there was a significant failure of communication, and you shouldn’t have specifically used the phrase “tightly aligned” in the same sentence as the rejection.
If the issue is that CEA communicated poorly or you misunderstood the rejection, I agree that’s not necessarily worth getting into. But you’ve made a strong claim about how CEA makes decisions based on the contents of a message whose author is willing to make it public. It looks to me like you essentially have two choices:
Agree to make the message public, or
Onlookers interpret this as an admission that your claim was exaggerated.
I’m strongly downvoting the parent comment for now, since I don’t think it should be particularly visible. I’ll reverse the downvote if you release the rejection letter and it is as you’ve represented.
I’m sorry you had such a frustrating experience. The work of yours I’ve seen has been excellent, and I hope you find a place to use your skills within EA (or keep crushing it in other places where you have the chance to make an impact).
Some hiring processes definitely revolve around value alignment, but I also know a lot of people hired at major orgs who didn’t have any particular connection to EA, and who seem to just be really good at what they do. “Hire good people, alignment is secondary” still feels like the most common approach based on the processes I’ve seen and been involved with, but my data is anecdata (and may be less applicable to meta-focused positions, I suppose).