On the difference between the role we’ve tried to hire for at Open Phil specifically and a typical Security Analyst or Security Officer role, a few things come to mind, though we also think we don’t yet have a great sense of the range of security roles throughout the field. One possible difference is that many security roles focus on security systems for a single organization, whereas we’ve primarily looked for someone who could help both Open Phil and some of our grantees, each of which may have quite different needs. Another possible difference is that our GCR focus in AI and biosecurity leads us to some non-standard threat models, and it has been difficult thus far for us to find experienced security experts who readily adapt standard security thinking to a somewhat different set of threat models.
Re: industry roles that would be particularly good or bad preparation. My guess is that for the GCR-mitigating roles we discuss above (i.e. not just potential future roles at Open Phil), the roles offering better preparation will tend to (a) expose one to many different types of challenges, and different aspects of those challenges, rather than being very narrowly scoped, (b) involve threat modeling of, and defense from, very capable and well-resourced attackers, and (c) require some development of novel solutions (not necessarily new crypto research; this could also just be new configurations of interacting hardware/software systems and user behavior policies and training), among other things.