But the 2016 update found that they were actually almost identical; as I understand it, the revision comes from a change in statistical technique (the handling of range restriction).
Regardless of whether structured or unstructured interviews are actually better, the fact that the result you get from the academic literature depends on a fairly esoteric statistical question highlights how difficult it is to extract meaning from this research.
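For context, the "range restriction" correction at issue is, in its simplest form (Thorndike's Case II, for direct restriction), a single formula. Here is a minimal sketch with made-up numbers showing why the choice matters: the same observed interview validity can look very different depending on how much restriction you assume. The function name and the example values are illustrative, not taken from the papers discussed.

```python
import math

def correct_for_range_restriction(r_restricted: float, u: float) -> float:
    """Thorndike Case II correction for direct range restriction.

    r_restricted: correlation observed in the range-restricted (hired) sample
    u: ratio of predictor SD in the applicant pool to SD among those hired
       (u = 1 means no restriction; larger u means heavier restriction)
    """
    return (u * r_restricted) / math.sqrt(1 + (u**2 - 1) * r_restricted**2)

# Illustrative numbers: the corrected validity grows with the assumed
# restriction ratio, so the assumption drives the headline result.
observed = 0.25
print(correct_for_range_restriction(observed, u=1.0))  # no restriction assumed
print(correct_for_range_restriction(observed, u=2.0))  # heavy restriction assumed
```

Because unstructured and structured interviews may plausibly suffer different degrees of restriction in the underlying studies, changing how this correction is applied can reorder which method appears more valid.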
For anyone who is curious, I have a bit of an update. (low priority, feel free to skip with no hard feelings)
It has puzzled me that a finding not supported by theory[1] didn't get more press in the psychology world. It appears to have made almost no impact. I would expect that either A) there would be a bunch of articles citing it and claiming that we should change how we do hiring, or B) there would be some articles refuting the findings. I recently got the chance to ask some Industrial-Organizational Psychologists what is going on here. Here is my little summary and excerpts of their answers:
A more recent meta-analysis has re-asserted that structured interviews are more predictive.
The paper is a book chapter (which hasn’t gone through the peer-review process), and thus professional researchers don’t give it much consideration.
The paper is an exercise in methodological minutiae that’s not terribly useful or practical. (“what are you going to do with the results of that paper, even assuming they’re right on everything? Are you REALLY going to tell people that using an unstructured interview is fine? Just throw best practices, legal defensibility, etc. out the window?”[2])
There's a time and a place for brute-force empiricism, but this is not one of them, especially when it informs high-stakes HR practices. "It really makes no theoretical sense to me why unstructured, which by definition has so much noise introduced to it, would be more predictive than structured. I really would need a WHY and HOW to buy that argument, not just an application of statistical corrections."
"That's crazy that they would say unstructured interviews are better. That's like choosing a Rorschach test over an MMPI."
My takeaway is that (roughly speaking) I didn't have enough domain knowledge and context to properly place and weigh that paper (thus supporting the claim that "the average hiring manager can't really get much useful information from these kinds of academic reviews").
The finding that unstructured interviews have similar predictive validity to structured interviews was originally published in Rethinking the validity of interviews for employment decision making, and later cited in a 2016 working paper.
This and the other quotes are almost direct quotes, but I edited them to remove some profanity and correct grammar. People were really worked up about the issues with this paper, but I don’t think that kind of harsh language has a place on the EA Forum.
Interesting, thanks for the follow-up, Joseph! It makes sense that other meta-analyses would find different outcomes.