On the topic of feedback… At Triplebyte, where I used to work as an interviewer, we would give feedback to every candidate who went through our technical phone screen. I wasn’t directly involved in this, but I can share my observations—I know some other EAs who worked at Triplebyte were more heavily involved, and maybe they can fill in details that I’m missing. My overall take is that offering feedback is a very good idea and EA orgs should at least experiment with it.
Offering feedback was a key selling point that allowed us to attract more applicants.
As an interviewer, I was supposed to be totally candid in my interview notes, and also completely avoid any feedback during the screening call itself. Someone else in the company (who wasn’t necessarily a programmer) would lightly edit those notes before emailing them—they wanted me to be 100% focused on making an accurate assessment, and leave the diplomacy to others. My takeaway is that giving feedback can likely be “outsourced”—you can have a contractor / ops person / comms person / intern / junior employee take notes on hiring discussions, then formulate diplomatic but accurate feedback for candidates.
My boss told me that the vast majority of candidates appreciated our feedback. I never heard of any candidate suing us, even though we were offering feedback on an industrial scale. I think candidates occasionally got upset, but the feedback team mostly insulated me from that unless they thought it would be valuable for me to hear—they wanted my notes to stay candid.
Jan writes: “when evaluating hundreds of applications, it is basically certain some errors are made, some credentials misunderstood, experiences not counted as they should, etc. - but even if the error rate is low, some people will rightfully complain, making hiring processes even more costly.” I think insofar as you have low confidence in your hiring pipeline, you should definitely be communicating this to candidates, so they don’t over-update on rejection. At Triplebyte, we had way more data to validate our process than I imagine any EA org has. But I believe that “our process is noisy and we know we’re rejecting good candidates” was part of the standard apologetic preamble to our feedback emails. (One of the worst parts of my job was constant anxiety that I was making the wrong call and unfairly harming a good candidate’s career.)
Relatedly… I’m in favor of orgs taking the time to give good feedback. It seems likely worthwhile as an investment in the human capital of the rejectee, the social capital of the community as a whole, and improved community retention. But I don’t think feedback needs to be good to be appreciated—especially if you make it clear when your feedback is low-confidence. As a candidate, I’m often asking which hoops I need to jump through to get a particular sort of job. If part of hoop-jumping means dealing with imperfect interviewers who aren’t getting an accurate impression of my skills, I want to know that so I can demonstrate my skills better.
But I also think that practices that help you give good feedback are quite similar to practices that make you a good interviewer in general. If your process doesn’t give candidates a solid chance to demonstrate their skills, that is something you should fix if you want to hire the best people! (And hearing from candidates whose skills were, in fact, judged inaccurately will help you fix it! BTW, I predict if you acknowledge your mistake and apologize, the candidate will get way less upset, even if you don’t end up hiring them.) A few more examples to demonstrate the point that interviewing and giving feedback are similar competencies:
Concrete examples are very useful for feedback. And I was trained to always have at least one concrete example to back up any given assessment, to avoid collecting fuzzy overall impressions that might be due to subconscious bias. (BTW, I only saw a candidate’s resume at the very end of the interview, which I think was helpful.)
Recording the interview (with the candidate’s consent), so you can review it as needed later, is another thing that helps with both objectives. (The vast majority of Triplebyte candidates were happy to have their interview recorded.)
Using objective, quantifiable metrics (or standard rubrics) makes your process better, and can also give candidates valuable info on their relative strengths and weaknesses. (Obviously you want to be diplomatic, e.g. if a candidate really struggled somewhere, I think we described their skills in that area as “developing” or something. We’d also give them links to resources to help them level up on that.)
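To illustrate, here’s a minimal sketch of what a rubric like this might look like in code. The thresholds, labels, and resource URLs are all hypothetical placeholders I made up, not Triplebyte’s actual rubric:

```python
# Hypothetical rubric sketch: map a 1-5 interviewer score per area to a
# diplomatic, candidate-facing label, with resource links for weak areas.
# Thresholds, labels, and URLs are invented for illustration.

RUBRIC = [(4.0, "strong"), (3.0, "solid"), (0.0, "developing")]

RESOURCES = {  # placeholder URLs, not real resources
    "algorithms": "https://example.com/algorithms-practice",
    "system_design": "https://example.com/system-design-intro",
}

def label_for(score: float) -> str:
    """Return the label for the highest threshold the score meets."""
    for threshold, label in RUBRIC:
        if score >= threshold:
            return label
    return "developing"  # unreachable with a 0.0 floor, kept for safety

def summarize(scores: dict[str, float]) -> list[str]:
    """Turn raw per-area scores into candidate-facing summary lines."""
    lines = []
    for area, score in scores.items():
        label = label_for(score)
        line = f"{area}: {label}"
        # Only attach a leveling-up resource for areas that need work.
        if label == "developing" and area in RESOURCES:
            line += f" (suggested resource: {RESOURCES[area]})"
        lines.append(line)
    return lines

print("\n".join(summarize({"algorithms": 2.5, "system_design": 4.2})))
```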
At Triplebyte, we offered feedback to every candidate regardless of whether they asked for it. I once suggested to my boss that we should make it opt-in, because that would decrease the time cost on our side and also avoid offending candidates who didn’t actually want feedback. IIRC my boss didn’t really object, but the change wasn’t deemed high priority. I’d suggest that organizations building a process from scratch make feedback opt-in.
BTW if any EA hiring managers have questions for me I’m happy to answer here, via direct message, or on a video call. I interviewed both generalist software engineers (tilted towards backend web development) and machine learning engineers.
I was one of the people who edited interview notes and sent other feedback to Triplebyte candidates; I certify that everything John said here is correct, even re: the parts of the process he wasn’t directly involved in, and I endorse his takeaways. This comment is more a response to John than it is a response to the OP, but hopefully people will still find it useful.
Feedback emails were about 25% of my job. As a team, we sent maybe 50 feedback emails on an average day (not totally sure here, numbers fluctuated a lot and also it was two years ago).
One of the things that made it possible to give good feedback at scale was that Triplebyte had a well-oiled, standardized process. Every candidate took much the same interview, which meant that we could largely build our emails out of pre-existing blocks — e.g., telling a candidate that we were impressed with their code quality but they could have been faster, or mentioning specific knowledge areas where they could improve and linking to relevant resources. I doubt the same could be done at most EA orgs, except maybe those hiring programmers.
The process of editing interviewers’ raw feedback became pretty quick and easy after a while (edit out the swearing and anything mean, change some keywords, bam), although sometimes one of us would slip up and that wasn’t great, lol. So yeah I agree that this is a job that could pretty easily be offloaded to a decent communicator who was familiar with the interview process. We did write some of our own content if we felt it was needed (e.g. writing nice things if we knew the candidate was struggling personally), and we used our judgment to tone down the harshness (e.g. if someone needed improvement in every single area we tested, we would focus on just a few areas rather than sending a super long email telling them they were bad at everything).
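For readers who want to picture the “pre-existing blocks” assembly described a couple of paragraphs up, here’s a minimal sketch. The block text, area names, and the cap on improvement areas are all invented for illustration, not Triplebyte’s actual content or code:

```python
# Hypothetical sketch of block-based feedback assembly.

# Pre-written blocks keyed by (interview area, rubric level).
FEEDBACK_BLOCKS = {
    ("code_quality", "strong"): "We were impressed by how clean and readable your code was.",
    ("code_quality", "developing"): "Your solutions worked, but could be structured more clearly.",
    ("speed", "strong"): "You worked through the problems quickly and confidently.",
    ("speed", "developing"): "Working through timed practice problems could help you move faster.",
    ("debugging", "developing"): "We'd suggest practicing systematic debugging techniques.",
}

MAX_IMPROVEMENT_AREAS = 3  # avoid a long email saying "you were bad at everything"

def compose_feedback(results: dict[str, str]) -> str:
    """Assemble a feedback email from pre-written blocks.

    `results` maps each standardized interview area to a rubric level.
    Positive blocks are always included; "developing" blocks are capped
    so the email focuses on just a few areas for improvement.
    """
    positives, improvements = [], []
    for area, level in results.items():
        block = FEEDBACK_BLOCKS.get((area, level))
        if block is None:
            continue
        (improvements if level == "developing" else positives).append(block)
    body = positives + improvements[:MAX_IMPROVEMENT_AREAS]
    return "\n\n".join(body)

print(compose_feedback({"code_quality": "strong", "speed": "developing"}))
```

The cap on “developing” blocks mirrors the judgment call described above: focus on a few areas rather than sending a super long email covering every weakness.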
There was also huge variation in quality between the notes of different interviewers; some would write long, detailed, encouraging messages to the candidates, while others would write like one sentence. So if EA orgs choose to go down this road, they need to make sure to give the feedback-giver enough to work with.
Another thing is that we were explicitly assessing only candidates’ technical abilities, and not whether we wanted them to join our team. That meant that all rejections were of the form “you need to brush up on X skills”, and we never had to reference a candidate’s personality or energy levels or whatever. That probably helps a ton re: protection from lawsuits. (I had never thought of that before, huh.)
For those interested in Triplebyte’s approach, there’s also Kelsey Piper’s thoughts on why and how the company gives feedback, and why others don’t.
This is great to hear and an interesting read, thank you for sharing!