To provide anonymous (or non-anonymous) feedback, or to get in touch regarding other concerns, please use this form.
cwbakerlee
As highlighted in the job descriptions, what we’re looking for in both skills and experiences varies from role to role (i.e., operations ≠ grantmaking ≠ research). When we evaluate candidates, though, we’re usually asking less “What degree do they have? How many years have they worked in this field?” and more “What have they accomplished so far (relative to their career stage)? Do they have the skills we’re looking for? Will they be a good fit with our team and help meet an ongoing need? Do we have reason to expect them to embody Open Philanthropy’s operating values from day one?”
AMA: Six Open Philanthropy staffers discuss OP’s new GCR hiring round
Agreed, and it’s something biosecurity folks (including some focused on GCBR mitigation) are increasingly thinking about. It’s a longstanding (and evolving) concern, but by no means a solved problem.
Hi more better,
Yeah, I can relate; these sorts of situations can be tough.
I work on the biosecurity & pandemic preparedness team at Open Philanthropy. In the realm of biosecurity at least, I’m happy to be a resource for helping troubleshoot these sorts of issues, including both general questions and more specific concerns. The best way to contact me, anonymously or non-anonymously, is through this short form.
Importantly, if you’re reaching out, please do not include potentially sensitive details of info hazards in form submissions – if necessary, we can arrange more secure means of follow-up communication, anonymous or otherwise (e.g., a phone call).
To build on Linch’s response here:
I work on the biosecurity & pandemic preparedness team at Open Philanthropy. Info hazard disclosure questions are often gnarly. I’m very happy to help troubleshoot these sorts of issues, including both general questions and more specific concerns. The best way to contact me, anonymously or non-anonymously, is through this short form. (Alternatively, you could reach my colleague Andrew Snyder-Beattie here.) Importantly, if you’re reaching out, please do not include potentially sensitive details of info hazards in form submissions – if necessary, we can arrange more secure means of follow-up communication, anonymous or otherwise (e.g., a phone call).
Thanks for the feedback, Vaidehi.
To your first point, I’ve modified the title to signal that this is a linkpost.
To your second point, this list is more concise in that it’s only ~35 items right now, compared to ~60 in Greg’s list and ~90 in Tessa’s. It may be more friendly to newcomers simply because its brevity makes it less overwhelming, and it includes fewer dense governmental reports and academic papers.
But overall I think different lists will work for different people, and for whatever reason this is the presentation that struck me as aesthetically fitting the bill when I made it. Other people may disagree about which format is more useful, and my guess is that ultimately the lists are just complementary.
[Linkpost] Yet another biosecurity reading list
I share in your hope that the attention to biorisks brought about by SARS-CoV-2 will make future generations safer and prevent even more horrible catastrophes from coming to pass.
However, I strongly disagree with your post, and I think you would have done well to heavily caveat the conclusion that this pandemic and other “endurable” disasters “may be overwhelmingly net-positive in expectation.”
Principally, while your core claim might hold water in some utilitarian analyses under certain assumptions, it almost definitely would be greeted with unambiguous opprobrium by other ethical systems, including the “common-sense morality” espoused by most people. As you note (but only in passing), this pandemic truly is an abject “tragedy.”
Given moral uncertainty, I think that, when making a claim as contentious as this one, it’s extremely important to take the time to explicitly consider it by the lights of several plausible ethical standards rather than applying just a single yardstick.
I suspect this lack of due consideration of other mainstream ethical systems underlies Khorton’s objection that the post “seems uncooperative with the rest of humanity.”
In addition, for what it’s worth, I would challenge your argument on its own terms. I’m sad to say that I’m far from convinced that the current pandemic will end up making us safer from worse catastrophes down the line. For example, it’s very possible that a surge in infectious disease research will lead to a rise in the number of scientists unilaterally performing dangerous experiments with pathogens and the likelihood of consequential accidental lab releases. (For more on this, I recommend Christian Enemark’s book Biosecurity Dilemmas, particularly the first few chapters.)
These are thorny issues, and I’d be more than happy to discuss all this offline if you’d like!
Hi Vaipan, I’ll take your question about the ratio of hires between AI safety and biosecurity. In short, no, it wouldn’t be correct to conclude that biosecurity is more talent constrained than AI safety. The number of roles instead reflects our teams’ respective needs at the moment.
And on the “more diverse consideration about GCR” question, note that my team is advertising for a contractor who will look into risks outside biosecurity and AI safety, including nuclear weapons risks. That said, I expect AI safety and biosecurity to remain more highly prioritized going forward.