I tried doing this a while back. Some things I think I worried about at the time:
(1) disheartening people excessively by sending them scores that seem very low/brutal, especially if you use an unusual scoring methodology
(2) costing yourself more time than it seems at first, because (a) you find yourself needing to add caveats or manually hide some info to make it less disheartening to people, and (b) people ask you follow-up questions
(3) exposing yourself to some sort of unknown legal risk by saying something not-legally-defensible about the candidate or your decision-making.
(1) turned out to be pretty justified, I think; e.g. at least one person expressed upset/dissatisfaction at being told this info.
(2) definitely happened too, although it maybe didn’t cost all that many hours in the grand scheme of things.
(3) we didn’t get sued, but who knows how much we increased the risk by.
Jamie, I’ve been contemplating writing up a couple of informal “case study”-type reports on different hiring practices. My aim would be to let EA orgs learn how several different orgs do hiring, to highlight some best practices, and generally to encourage organizations to improve their methods. How would you feel about writing up a summary, or having a call with me, to help me understand how you tried giving feedback and which specific aspects caused challenges?
Unfortunately this was quite a while ago, at the last org I worked at; I don’t have access to the relevant spreadsheets, email chains, etc. anymore, and my memory is not the best, so I don’t expect to be able to add much beyond what I wrote in the comment above.
This is something I would be interested in seeing! A lot of EA orgs already have public info on their hiring process (at least in a structural sense). I’d be more curious about what happens under the hood, ‘scoring methodologies’ in particular.
Regarding “disheartening people”: I once got feedback for a hiring round where the organization shared the scores I got, and even shared scoring info for the other (anonymized) candidates. It was the best and most accurate feedback I have ever been given.
I scored very low, much lower than I had expected. Of course I felt sad and frustrated. I wish I had known more details about their scoring methodology, and part of me says it was an unfair process because they weren’t clear on what I would be evaluated on. But I draw an analogy to getting rejected from anything else (such as a school application or a romantic partner): it sucks, but you get over it eventually. I felt bad for a day or two, and then the feelings of frustration faded away.
Okay, I definitely see those concerns! Unknown legal risk seems like a good reason not to release scores, especially since orgs often hire in many different countries at the same time, each with potentially different laws.
For me personally, getting a rejection vs. getting a rejection and being told I had the lowest score among all applicants probably wouldn’t make much of a difference; it might even save me time spent on future applications for similar positions. But on that note, maybe just releasing quartile placements would be a better, less brutal alternative?
I think a general, short explainer of the scoring methodology used for a hiring round could/should be released to the applicants, if only for transparency’s sake. So explainer + raw scores, with no ranking, might be another alternative?
Maybe I am misguided in my idea that ‘this could be a low-time-cost way of making sure all applicants get a somewhat better sense of how good/bad their applications were.’ I have, after all, only ever been on the applicant side of things, and the current system does seem to work fine at generating good hires.