As a former applicant for many EA org roles, I strongly agree! I recall spending on average 2-8 times longer on some initial applications than was estimated by many job ads.
As someone who just helped drive a hiring process for Giving What We Can (for a Research Communicator role) I feel a bit daft having experienced it on the other side, but not having learned from it. I/we did not do a good enough job here. We had a few initial questions that we estimated would take ~20-60 minutes, and in retrospect I now imagine many candidates would have spent much longer than this (I know I would have).
Over the coming month or so I’m hoping to draft a post with reflections on what we learned from this, and how we would do better next time (inspired by Aaron Gertler’s 2020 post on hiring a copyeditor for CEA). I’ll be sure to include this comment and its suggestion (having a link at the end of the application form where people can report how long it actually took to fill the form in) in that post.
Might there be a way to time submissions? Some tests I have taken for prospective employers are timed, meaning candidates get, e.g., only 1 hour both to see the questions and to answer them. This might also reduce bias in recruitment: someone with a full-time job and caretaker responsibilities may not have the luxury of spending six times the estimated time on an application, while someone in a more privileged position can spend even longer than that.
In the hiring round I mentioned, we did time submissions for the work tests, and my impression is that the way we did so worked out fairly well. Having a timed component for the initial application is also possible, but it might require more of an ‘honour code’ system, as setting up a process that allows verification of the time spent is a pretty big investment for the first stage of an application.
Yes, there are ways to time submissions, and (from my perspective) they aren’t particularly difficult to find or to use. I suspect that any organization not using them isn’t held back by a lack of timing tools; more likely it has chosen not to devote resources to improving this process, hasn’t thought of it, or simply hasn’t bothered.
A second thought I had is that timed responses might also benefit the hiring organization, for two reasons. First, at work you do not have 4 hours to polish an email to a potential donor; you have 10 minutes, because you have a mountain of other important things to do. A strictly timed assessment is therefore likely to give a more realistic view of expected performance on the job. Second, timed responses make for a more apples-to-apples comparison, so you are more likely to select the best candidates rather than the candidates with the most free time and/or the largest network of educated family and friends willing to help polish responses.
We had a few initial questions that we estimated would take ~20-60 minutes, and in retrospect I now imagine many candidates would have spent much longer than this (I know I would have).
Michael, I’m wondering if more transparency would have helped here? As a simplistic example, there is a big difference between these two questions:
Tell us about a time when you took initiative in a work context.
and
Tell us about a time when you took initiative in a work context. We are specifically looking for candidates that have done this in relation to people management, can describe the process and the results/impact, and can demonstrate taking initiative by doing something fairly innovative.
I’m not sure I follow what you mean by transparency in this context. Do you mean being more transparent about what exactly we were looking for? In our case we asked for <100 words on “Why are you interested in this role?” and “Briefly, what is your experience with effective giving and/or effective altruism?” and we were just interested in seeing whether applicants’ interest/experience aligned with the skills, traits and experience we listed in the job descriptions.
I mean transparency in the sense of how the answers are assessed/evaluated. This basically gives candidates a little bit more guidance and structure.
An analogy that I like to use is rather silly, but it works: I might ask a candidate to describe how physically fit he is, and he tells me how much weight he can lift and how fast he can run. But it turns out that I’m actually interested in flexibility and endurance rather than power and speed, so I reject him because he didn’t demonstrate either. It is true that he described his physical fitness and that I assessed him on his physical fitness, but it’s also true that the information he offered and what I wanted to assess were very different.
I don’t have any particularly strong views, and would be interested in what others think.
Broadly, I feel like I agree that more specificity/transparency is helpful, though I don’t feel convinced that it’s not also worth asking at some stage in the application an open-ended question like “Why are you interested in the role?”. Not sure I can explain/defend my intuitions here much right now but I would like to think more on it when I get around to writing some reflections on the Research Communicator hiring process.
I just want to say that I love seeing this kind of thing on the EA Forum, and it is so different from most other parts of the internet: I have a proposal or a suggestion, and it doesn’t quite mesh with what you think/feel. Neither of us have great justifications or clear data, and rather than ad hominems or posturing or some type of ‘battle,’ there is simply a bit of exchange and some reflection.
I really like that your response was reflective/pensive, rather than aggressive or defensive. Thanks for being one of the people that makes the internet ever-so-slightly better than it otherwise would be. ☺
I’m looking forward to reading a post with reflections on lessons learned. :)
What do you think about Joseph’s thoughts on those types of questions here: https://forum.effectivealtruism.org/posts/4towuFeBfbGn8hJGs/amber-dawn-s-shortform?commentId=2N7JqCYzyt7FHCti2