I’m not claiming measuring IQ is morally bad (I don’t think I’ve made any moral claims in this comment thread?), but based just on “It was to be used by CEA staff to score attendees of EA conferences”, I think there is a range of executions that could take me from “this is a ridiculous thing to even consider trying; how on earth is this going to be reliable?” to “this might plausibly be net positive”, and it’s hard to know what is actually going on just by reading the Vox article.
Would you be happy if a CEA staff member had a quick chat with you at EAG, wrote down “IQ 100” based on that conversation in an Excel sheet, and this cost you opportunities in the EA space as a result?
Yes. I’m in EA to give money/opportunities, not to get money/opportunities.
Edit: I do think some people (in and outside of EA) overvalue quick chats when hiring, and I’m happy that in EA everyone uses extensive work trials instead.
I’m glad that this will not affect you in this case, but folks interested in the EA space because it provides an avenue for a more impactful career may disagree, and for a movement that is at least partly about using evidence and reason to create more positive impact, I’d be surprised if people genuinely believed that the operationalization described above is a good reflection of those ideals.
Yeah I think measuring IQ is a stupid idea but suppose you were to do it anyway—surely you’d want to measure IQ through an actual test and not just through guessing, right?
Why is including an assessment of IQ morally bad to track potential future hires? Or do you think it’s just a useless thing to estimate?