I’m glad CEA is sharing this data, but I wish the post focused more on the important question posed in the title: Is it still hard to get a job in EA? I think the data suggests quite strongly that the answer is “yes”, but the authors don’t seem to take an explicit stance on the issue and the Summary of the post arguably suggests the opposite conclusion.
Here’s why I think the data implies EA jobs are still hard to get:
Only a (very?) small percentage of applications led to a job. The report describes a 2.4% hiring rate for Core jobs (itself a low figure), but I think a more accurate number would be 1.85% = 7 hires / (54 applications per role * 7 roles) – see the sketch below these points. The 2.4% figure is “Weighted by the number of applicants in each stage”, but I’m not actually sure what that means, and in any case the raw number of hires/applicants makes the most sense to me. I also think it’s important to include the EOI numbers; these roles received just as many applicants as Core jobs, but led to no hires. (For the same reason, I’d find it valuable to include data from Operations and other roles that were excluded from this analysis, as they all help inform the question of whether EA jobs are hard to get.) Including EOI applications, we get a hiring rate of 1.1% of total applications.
If “Ashby hires 4% of applicants, compared to 2% at CEA”, it seems like quite a stretch for the Summary to say “the probability of an applicant receiving an offer was similar to industry averages.” At the very least, I’d like to see the figures included in the Summary so readers could draw their own conclusions. Also, while Ashby hired 4% of applicants, they gave job offers to 8% of applicants; the latter figure is more relevant to the question at hand. CEA’s analysis implies that their offer rate and hire rate were the same (please correct me if I’m misinterpreting), which itself suggests that EA jobs are hard to get. If the offer rate in the Ashby survey is 4x CEA’s offer rate (8% vs. 2%), that’s a very sizeable difference.
The Product Manager jobs included in the Ashby survey were all jobs at “high-growth tech companies”, which are probably more competitive than the same role at “all tech companies” or “all industries”. So I’d interpret the benchmarking data as saying that CEA’s Product Manager role was more competitive (much more competitive if CEA’s offer rate equals its hire rate) than an unusually competitive subset of private sector Product Manager jobs.
I agree with the comments from Vaidehi and Denkenberger about why this analysis may understate the availability of EA jobs.
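For concreteness, here is a minimal sketch of the hire-rate arithmetic in the first point above. The figures (7 Core roles, ~54 applications per role, 7 hires) are the ones quoted from the report; the EOI applicant count is not given in this thread, so that part is left as a commented placeholder.

```python
# Hire-rate arithmetic for the points above, using the figures quoted from the report.
core_roles = 7
apps_per_core_role = 54      # average applications per Core role
core_hires = 7

core_apps = core_roles * apps_per_core_role     # 378 applications in total
core_hire_rate = core_hires / core_apps         # 7/378 ≈ 1.85%
print(f"Core hire rate: {core_hire_rate:.2%}")  # -> 1.85%

# Including EOI applications: the EOI applicant count isn't given in this thread, so it
# is left as a placeholder. (The quoted 1.1% overall figure would imply roughly
# 7 / 0.011 - 378 ≈ 250 EOI applications.)
# eoi_apps = ...
# overall_hire_rate = core_hires / (core_apps + eoi_apps)   # the ~1.1% figure
```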
Hey, thanks for your comment.
There are a few different ways to look at the probability of being hired. As you suggest, one would be to take the total number of hires and divide it by the total number of applicants, across all recruitment. We chose not to do this here because the EOIs are substantially different from the Core roles (in having a higher bar for progression, etc.), which would make an overall figure less useful. (The CEA website does emphasise the difference between main roles and EOIs, so it is something prospective applicants are made aware of when applying.)
When we “weight by the numbers of applicants in each stage”, this just means that we’re taking the average across applicants, and not across roles. (Worked example: two roles, A and B, each hired one person. Role A has 100 people in stage 1, with probability of success 1/100 = 1%; Role B has 10 people in stage 2, with probability of success 1/10 = 10%. The probability of success when weighting by applicants is (1%*100 + 10%*10)/110 = 2/110 ≈ 1.8%, but when averaging across roles it is (1% + 10%)/2 = 5.5%.)
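For concreteness, here is a minimal runnable version of the same worked example (the role names and counts are the hypothetical ones from the example above, not CEA’s actual figures):

```python
# Worked example above: Role A has 100 people at its stage (success prob 1%),
# Role B has 10 people at its stage (success prob 10%); each role hired one person.
people = {"A": 100, "B": 10}   # number of applicants at the relevant stage of each role
hires  = {"A": 1,   "B": 1}

# Weighting by applicants: total hires divided by total applicants across both roles.
weighted = sum(hires.values()) / sum(people.values())                   # 2/110 ≈ 1.8%

# Averaging across roles: the mean of each role's own success probability.
across_roles = sum(hires[r] / people[r] for r in people) / len(people)  # (1% + 10%)/2 = 5.5%

print(f"Weighted by applicants: {weighted:.1%}")      # 1.8%
print(f"Averaged across roles:  {across_roles:.1%}")  # 5.5%
```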
Regarding the industry comparison, as you mention there are ways in which CEA might be more selective than industry and other ways in which CEA might be less selective. As Ben mentions in an earlier comment, we probably don’t have solid enough evidence to call it in one direction or another.
EOIs are substantially different from the Core roles (in having a higher bar for progression, etc.), which would make an overall figure less useful.
If EOIs are hard to get, that seems relevant to the question of whether EA jobs are hard to get since EOIs are quite sought after (as many applicants as core jobs despite less chance of getting hired). But since AFAIK CEA is the only EA org that has EOIs, I can certainly see the case for excluding them from the sample.
we’re taking the average across applicants, and not across roles.
100% agree this is the right methodology. But I still think 1.85% is the relevant number (number of hires/number of applicants). From your answer to Khorton, it sounds like your 2.4% figure excludes the Core job you didn’t hire for (which seems to have gotten more applicants than the average core job). I don’t understand that decision, and think it makes it harder to answer the question of whether EA jobs are hard to get.
Regarding the industry comparison, as you mention there are ways in which CEA might be more selective than industry and other ways in which CEA might be less selective. As Ben mentions in an earlier comment, we probably don’t have solid enough evidence to call it in one direction or another.
Can you provide CEA’s offer rate, for the PM role and for core jobs overall? Hire rate really isn’t the best measure for determining whether jobs are hard to get.
FWIW, I’m not sure why Ben thinks hires as a “percent of applicants who get to the people ops interview stage” (the only stage where CEA is more likely to hire, and not an apples-to-apples comparison since CEA has a work trial before it and Ashby doesn’t) is the right metric. He suggests he likes that metric as a way to exclude low-quality applicants, but the better way to do that is to look at hires (or ideally offers) as a percent of people who make it past the initial screen (which is more restrictive for Ashby than CEA). CEA hires 1 in 28 people who make it past the first screen; the Ashby sample hires 1 in 12 (and makes offers to 2 in 12).
Hey, apologies that it has taken us so long to get back to you on this.
From your answer to Khorton, it sounds like your 2.4% figure excludes the Core job you didn’t hire for (which seems to have gotten more applicants than the average core job). I don’t understand that decision, and think it makes it harder to answer the question of whether EA jobs are hard to get.
Thanks for pointing this out! You’ve shed light on an important point. The 2.4% figure can be thought of as “the probability of being hired, conditional on clearing a hiring bar” and the 1.85% figure is the “probability of being hired at all”; on reflection I agree that the latter would be more useful in this case. I’ve updated the post to reflect this.
Can you provide CEA’s offer rate, for the PM role and for core jobs overall?
For the PM role there was only one offer made (to the one hire), so a rate of 1/52=1.9%.
For core jobs overall, on average there was just one offer made for each[1]. The average number of applications was 53.7, so the average offer rate for core roles is 1/53.7 = 1.9%.
Of the 7 Core roles, one role made two offers, and one other role made zero offers, so this averages out at one offer per role.
Thanks for updating the post and providing the offer rate data! As I mentioned in my response to Ben, I think CEA’s much lower offer rates relative to those in the Ashby survey and CEA’s 100% offer acceptance rate are strong evidence that EA jobs are hard to get.
Sorry for my slow response here, I missed the notification about your comment.
If EOIs are hard to get, that seems relevant to the question of whether EA jobs are hard to get since EOIs are quite sought after
I think maybe we just didn’t explain well what EOIs are. As an example: we had a product manager EOI; once we opened a full hiring round for PMs we contacted all the people who filled out the EOI and said “hey, are you still looking for a PM position?” and then moved the ones who said “yes” into the PM hiring round.[1]
I’m not sure why Ben thinks hires as a “percent of applicants who get to the people ops interview stage” (the only stage where CEA is more likely to hire, and not an apples-to-apples comparison since CEA has a work trial before it and Ashby doesn’t) is the right metric
My conclusion was: “in some ways CEA is more selective, and in other ways we are less; I think the methodology we used isn’t precise enough to make a stronger statement than ‘we are about the same.’”
I don’t think any one of these comparison points is the “right metric” – they all have varying degrees of usefulness, and you and I might disagree a bit about their relative value, but, given their contradictory conclusions, I don’t think you can draw strong conclusions other than “we are about the same”.
Sometimes exceptional candidates are hired straight from an EOI; the example I give is specific to that role. I think in retrospect we should have just left EOIs off, as the data was more confusing than helpful.
Thanks for clarifying how the EOIs work, I had a different impression from the OP.
I still strongly disagree with the following statement:
in some ways CEA is more selective, and in other ways we are less; I think the methodology we used isn’t precise enough to make a stronger statement than ‘we are about the same.’
Which are the ways in which CEA is less selective? You mentioned in a previous comment that “we hire a substantially greater percent of applicants who get to the people ops interview stage” and I cited that interpretation in my own comment, but on closer look I don’t think that’s right (or perhaps I’m confused by what you mean by this?). As the OP states, “At CEA 1 in 7 of those reaching a people ops interview get hired, compared to 1 in 5 at Ashby”, which would have CEA as more selective. And if you look at offer rate (more relevant to the question of selectivity than hire rate) the difference is quite big (1 in 7 at CEA vs. 2 in 5 at Ashby, or 14% vs. 40%, a difference of 2.8x).
The difference is even bigger if you compare offer rate as a percentage of all applicants (2% for CEA vs. 9% at Ashby, or 4.5x) or as a percentage of applicants that passed an initial screen to weed out obviously unqualified applicants (4% for CEA vs. 17% for Ashby, or 4.7x). I also think it is quite notable that CEA’s offer acceptance rate across all roles was 100% vs. 50% for PM roles in the Ashby survey.
This data shows clear, consistent, and large differences all suggesting that CEA is much more selective than the industry benchmark (which itself is likely highly selective since it looks at high-growth tech companies). Unless I’m missing some counter-evidence, I think that should be the conclusion. And I think the OP’s summary of the industry benchmark exercise is extremely misleading: “We compared our Product Manager hiring round to industry benchmark data. We found that the probability of an applicant receiving an offer was similar to industry averages.” If CEA made an offer to 1 of 52 applicants, and the benchmark survey had an offer rate of ~1 in 12, how is that similar?
Thanks, yeah, sorry – there is a greater change in the percentage drop-off for Ashby at the on-site → hired stage, but because we start with a smaller pool we are still more selective. 1 in 7 versus 1 in 5 is the correct comparison.
This data shows clear, consistent, and large differences all suggesting that CEA is much more selective than the industry benchmark
I guess I’m flattered that you trust the research we did here so much, but I think it’s very much not clear:
The number of applicants we get is very heavily influenced by how widely we promote the position, if the job happens to get posted to a job aggregator site, etc. To take a concrete example: six months ago we hired for a PM and got 52 applicants; last month we opened another PM position which got on to some non-EA job boards and got 113 applicants. If we hire one person from each round, I think you will say that we have gotten more than twice as selective, which is I guess kind of true, but our hiring bar hasn’t really changed (the person who we hired last time would be a top candidate this time).
I don’t really know what Ashby’s candidate pool is like, but I would guess their average applicant has more experience than ours – for example: none of our final candidates last round ever even had the job title “product manager” before, though they had had related roles, and in the current round neither of the two people at the furthest round in the process have ever had a PM role. I would be pretty surprised if Ashby’s final rounds were consistently made up of people who had never been PMs before.[1]
The conclusion of this post was “Overall, CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large” and that still seems basically right to me: 1⁄7 vs. 1⁄5 is more selective, but well within the margin of error given how much uncertainty I have.
I think the OP’s summary of the industry benchmark exercise is extremely misleading
Thanks – I just cut that sentence since my inability to communicate my view even with our substantial back-and-forth makes me pessimistic about making a summary.
In general, I would guess that CEA’s applicants have substantially less experience than their for-profit counterparts, as EA is quite young, but our applicants are more impressive given their age. E.g. we get a lot of college student applicants, but those students are going to prestigious universities.
The conclusion of this post was “Overall, CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large” and that still seems basically right to me: 1⁄7 vs. 1⁄5 is more selective, but well within the margin of error given how much uncertainty I have.
The 1⁄7 vs. 1⁄5 comparison is based on hire rate, but offer rate is more relevant to selectivity (if you disagree, could you explain why?) The difference in offer rate is 14% for CEA vs. 40% for Ashby; I’d be quite surprised if this large difference is still within your margin of error given your uncertainty.
Stepping back, the key question is whether EA jobs are still hard to get. As you note, any single perspective from CEA’s recruiting data will be imperfect. But they’ll be imperfect in different ways. For example, looking at offer rate as a percentage of applicants could be distorted by large numbers of clearly unqualified applicants, but this can be avoided by looking at offers as a percentage of people who made it past an initial screen or people who made it to a people ops interview. And problems with Ashby as a benchmark don’t apply if you’re assessing CEA’s selectivity in an absolute sense (or relative to benchmarks other than Ashby). If you look across a variety of perspectives and they all tell the same story, that’s probably the right story.
When I look at the different perspectives on CEA’s recruitment data, they all tell the story that jobs at CEA are (very) hard to get. I don’t see any metrics that suggest the opposite is true, or even that it’s a close call or ambiguous in any way. The perspectives I find compelling (individually, but more so collectively) are:
In an absolute sense, CEA’s offer rates are extremely low, whether one looks at offer rate as a percentage of applicants (1.9%), as a percentage of people who passed a first screen (4.7%) or those who passed a second screen (9.5%). If all you knew about a job was that less than 1 in 50 applicants got an offer, and that less than 1 in 20 people who passed an initial screen got an offer, wouldn’t you consider that job highly selective? I sure would.
CEA’s offer rates are in the same ballpark as the notoriously hyper-selective McKinsey. McKinsey hires ~1% of applicants (vs. ~2% for CEA). If you pass an initial resume screen at McKinsey, you have a ~12.5% chance of getting an offer, which is actually higher than the offer rate for CEA applicants who make it past an initial (~5%) or even a secondary (~10%) screen.
As another way of contextualizing CEA’s selectivity without relying on the Ashby data: even if you pass CEA’s initial screen, your expected success rate of ~5% is about the same as the acceptance rate for someone applying to Harvard.
CEA is much more selective than the Ashby benchmark (which itself likely captures selective jobs). That’s true whether you look at offer rates for all candidates (4.7x higher offer rate for Ashby) or offer rates after unqualified people had been screened out (4.5x higher offer rate for Ashby after an initial screen, and 2.8x higher at the people ops/onsite stage). Side note: Given the magnitude and consistency of the finding that CEA is much more selective than Ashby, I do think it belongs in the OP’s Summary section (though it could certainly include a note or footnote regarding how Ashby is an imperfect reference group).
Across all of CEA’s core jobs there was an offer acceptance rate of 100%. This strongly suggests that these candidates, who were presumably very strong applicants, were not choosing between many attractive EA job offers.
There’s very strong evidence that in the relatively recent past, EA jobs were quite hard to get. So I think there’s a high burden of proof for anyone arguing that this situation has changed. I don’t think the data presented in this post or the comments meets that burden of proof; in fact, I think that data all strongly supports the notion that EA jobs are still hard to get.
offer rate is more relevant to selectivity (if you disagree, could you explain why?)
I think it’s pretty uncontroversial that our applicants are more dedicated (i.e. more likely to accept an offer). My understanding of Ashby is that it’s used by a bunch of random tech recruiting agencies, and I would guess that their applicants have ~0 pre-existing excitement about the companies they get sent to.
I don’t see any metrics that suggest the opposite is true, or even that it’s a close call or ambiguous in any way.
The statement in the post is “CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large”. This seems consistent with the view that CEA is selective? (It also just implies that Ashby is selective, which is a reasonable thing to believe.[1])
--
As a meta point: I kind of get the sense that you feel that this post is intended to be polemical, like we are trying to convince people that CEA isn’t selective or something. But as you originally said: “the authors don’t seem to take an explicit stance on the issue” – we just wanted to share some statistics about our hiring and, at least as evidenced by that first comment of yours, we were somewhat successful in conveying that we didn’t have particularly strong opinions about whether EA jobs are still hard to get.
This post was intended to provide some statistics about our hiring, because we were collecting them for internal purposes anyway so I figured we might as well publish. We threw in the Ashby thing at the end because it was an easily accessible data point, but to be honest I kind of regret doing that – I’m not sure the comparison was useful for many people, and it caused confusion.
It sounds to me like you think Ashby is selective: “the Ashby benchmark (which itself likely captures selective jobs).”
Re: offer rate vs hire rate, CEA’s applicants are likely applying to other EA jobs they’d also be dedicated to. CEA may well be more attractive than other EA employers, but I don’t think that’s a given and I’m not sure of the magnitude of any difference there might be. Bigger picture, as I mentioned earlier I think any individual metric is problematic and that we should look at a variety of metrics and see what story they collectively tell.
Re: your meta point, the thing I find confusing is that you “didn’t have particularly strong opinions about whether EA jobs are still hard to get.” There’s a bunch of data, and every data point (CEA’s absolute offer rates at each stage, CEA vs. Ashby at each stage, and CEA vs. other benchmarks like McKinsey and Harvard) supports the idea that EA jobs are hard to get. So I don’t really understand why you present a lot of data that all points the same way, yet remain unconvinced by the conclusion they lead to.
Similarly, I find it confusing that you still seem to endorse the claim that “CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large.” CEA has lower offer rates and lower hire rates at each stage of the process. And in almost every case, the difference is quite large (at least 2x). Even in the one comparison where the difference isn’t huge (hire rates at the people ops/onsite stage), it is still a moderate magnitude (Ashby’s rate is 40% higher than CEA’s) despite the fact that CEA required passing 3 screens to get to that stage vs. 2 for Ashby. I think a more reasonable interpretation of that data would be “It’s very likely that CEA is much more selective than Ashby’s customers, though it’s possible the magnitude of this difference is only moderate (and Ashby is not a perfect reference point).”
the thing I find confusing is that you “didn’t have particularly strong opinions about whether EA jobs are still hard to get.”… So I don’t really understand why you present a lot of data that all points the same way, yet remain unconvinced by the conclusion they lead to.
I think I’m largely like “bruh, literally zero of our product manager finalist candidates had ever had the title ‘product manager’ before, how could we possibly be more selective than Ashby?”[1]
Some other data points:
When I reach out to people who seem like good fits, they often decline to apply, meaning that they don’t even get into the data set evaluated here
When I asked some people who are well-connected to PMs to pass on the job to others they know, they declined to do so because they thought the PMs they knew would be so unlikely to want it that it wasn’t worth even asking
I acknowledge that, if you rely 100% on the data set presented here, maybe you will come to a different conclusion, but I really just don’t think the data set presented here is that compelling.
As mentioned, our candidates are impressive in other ways, and maybe they are more impressive than the average Ashby candidate overall, but I just don’t think we have the evidence to confidently say that.
It sounds like there are two separate things going on:
1. Jobs at CEA are very hard to get, even for candidates with impressive resumes overall.
2. CEA finds it hard to get applicants who have particular desirable qualities, like previous experience in the same role.
Hmm, if we are still talking about comparing CEA versus Ashby, I’m not sure this carves reality at the joints: it’s certainly true that people with zero experience have an uphill battle getting hired, but I don’t think CEA is unusual in this regard. (If anything, I would guess that we are more open to people with limited experience.)
Sorry, I’m not sure I understand what your point is. Are you saying that my point 1 is misleading because having any relevant experience at all can be a big boost to an applicant’s chances of getting hired by CEA, and any relevant experience isn’t a high bar?
Yeah, job experience seems like a major difference between CEA and Ashby. I’d guess that salary could be quite different too (which might be why the CEA role doesn’t seem interesting to experienced PMs).
It sounds like one of the reasons why EA jobs are hard to get (at least for EA candidates) is that EA candidates (typically young people with great academic credentials and a strong understanding of EA, but relatively little job experience) lack the experience some roles require. To me this suggests that advising (explicitly or implicitly) young EAs that the most impactful thing they can do is direct work could be counterproductive, and that it might be better to emphasize building career capital.