As I suggested in my first comment, you could do the same “by reporting other characteristics which play no role in selection, but which are heavily over-represented in successful applicants”: for example, you could report that >50% of successful applicants are male,[1] white, and live in certain countries, that >90% have liberal political beliefs, and that a disproportionately large number have probably read Harry Potter fan fic.[2] Presumably one could also identify traits which are associated with success only via their association with these other traits, e.g. if most successful applicants have PhDs, and PhDs disproportionately tend to [drink red wine, ski etc.], then successful applicants may also disproportionately have those traits.
Of course, different people can disagree about whether or not each of these is causal. But even if they are all predictive, I imagine we would agree that at least one of these would likely mislead people. For example, having read Harry Potter fan fic is associated with being involved with communities interested in EA-related jobs for largely arbitrary historical reasons.[3]
This concern is particularly acute when we take into account the pragmatics of employers highlighting some specific fact.[4] People typically don’t offer irrelevant information for no reason. So if orgs go out of their way to say “>50% of successful applicants have PhDs”, even with the caveat about this not being causal, applicants will still reasonably wonder “Why are they telling me this?” and many will reasonably infer “What they want to convey is that this is a very competitive position and I should not apply.”
As I mentioned in the footnote of my comment above, there are jobs where this would be a reasonable inference. But I think most EA jobs are not like this.
If one wanted to provide applicants with full, non-misleading information, I think one would need to distinguish which of these cases applies, and provide a full account of the association: why successful applicants often have PhDs, but why this no longer holds once you control for x, y, z. That way (in theory), applicants would be able to know whether, conditional on their meeting the requirements specified in the application (e.g. they can complete the coding test task), the fact that they don’t have a PhD implies anything about their chances of success. But I think that, in practice, providing such an account for any given trait is either very difficult or impossible.[5]
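To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical numbers that are not drawn from any real hiring round) of how a trait that plays no role in selection can still be heavily over-represented among successful applicants, while implying nothing about an individual applicant’s chances once they meet the stated requirement:

```python
# Illustrative simulation (all numbers hypothetical, not from any real hiring data):
# selection depends only on whether an applicant can complete a coding test task,
# but PhD status correlates with completing it, so PhDs end up over-represented
# among hires even though the PhD itself plays no role in selection.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

has_phd = rng.random(n) < 0.20                 # 20% of applicants hold a PhD
pass_prob = np.where(has_phd, 0.60, 0.25)      # PhDs complete the task more often
completes_task = rng.random(n) < pass_prob
hired = completes_task                         # hiring uses ONLY the task result

print(f"PhD share among applicants:   {has_phd.mean():.0%}")         # ~20%
print(f"PhD share among hires:        {has_phd[hired].mean():.0%}")  # ~38%

# Raw success rates differ by PhD status...
print(f"Hire rate, PhD:               {hired[has_phd].mean():.0%}")   # ~60%
print(f"Hire rate, no PhD:            {hired[~has_phd].mean():.0%}")  # ~25%

# ...but conditional on meeting the stated requirement (completing the task),
# not having a PhD implies nothing about an applicant's chance of success.
print(f"Hire rate given task, PhD:    {hired[completes_task & has_phd].mean():.0%}")   # 100%
print(f"Hire rate given task, no PhD: {hired[completes_task & ~has_phd].mean():.0%}")  # 100%
```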
Though in EA Survey data, there is no significant gender difference in likelihood of having an EA job. In fact, a slightly larger proportion of women than men have EA jobs.
None of these reflect real numbers from any actual hiring rounds, though they do reflect general disparities observed in the wider community.
Of course, you could describe a situation where having read Harry Potter fan fic actually serves as a useful indicator of some relevant trait like involvement in the EA community. But, again, I’m not referring to cases like this. Even in cases where involvement in the EA community is of no relevance to the role at all (e.g. all you need to do to be hired is to perform some technical, testable skill, like coding very well), applicants are likely to be disproportionately interested in EA, and successful applicants may be yet further disproportionately interested in EA, even if it has nothing to do with selection.
This can happen if, for example, 50% of the applications are basically spam (e.g. applicants from a large job site who have barely read the job advert and don’t have any relevant skills, but are applying for everything they can click on). In such cases, the subset of applicants who are actually even vaguely relevant will disproportionately be people with an interest in EA, people with degrees, etc.
In some countries there may be a norm of releasing information about certain characteristics, in which case this consideration doesn’t apply for those characteristics, but would for others.
And that is not taking into account the important question of whether all applicants would actually update completely rationally on such information, or whether many would be irrationally inclined to be pessimistic about their chances, and simply conclude that they aren’t good enough to apply if they don’t have a PhD from a fancy institution.