Without confidently claiming that this is the case with these organisations, it seems worth flagging that if the sort of hard cutoffs you're talking about don't track talent particularly well, it may be worth it for orgs to pay the cost of reviewing more applications rather than risk some of the few talented people self-excluding. It's notable that I can instantly think of three field leaders in AI safety who either didn't start or didn't finish undergrad.
Having said that, Andy Jones of Anthropic did set a pretty clear bar in his recent post on the need for more engineers in safety: "Could write a substantial pull request for a major ML library."