https://www.ucl.ac.uk/risk-disaster-reduction/study/masters-programmes
Hi, I wonder if anyone has come across this graduate programme and whether they think it translates well into the EA space. There seems to be the option to focus on pandemics, for example. Careers after graduating include policy, PhD study, and consulting, so perhaps I could use the skills learned as leverage when looking into existential risks in the future? Your feedback is appreciated, as I will be applying. Please also recommend anything else if you think it's more relevant to the EA space.
I should note I'm coming in from the 80,000 Hours space.
Thanks!
Hi there,
[Some unofficial thoughts from my own research, done while deciding whether I should do a course like this one to become a civil servant. Other people will come from different perspectives, which could change the conclusions for them.]
I wanted to learn more about global risks, with the aim of working on security policy. I spent several months researching courses and speaking to people at the departments. There are quite a lot that I think could be good. The list below is all London-based, but these seemed to be the best UK options I could find when I was looking last year.
Risk and Disaster Science MSc at UCL
Science & International Security MA at King's College London
Risk Analysis, Disasters and Resilience MSc at King's College London
Security and Resilience: Science and Technology at Imperial College London
For the UCL courses, I found that the UCL Institute for Risk and Disaster Reduction has a big focus on natural risks, so the degrees have comparatively little content on anthropogenic (human-caused) risks - see the unit guide for the course above here. In agreement with Toby Ord and much of the EA community, I think anthropogenic risks are a much larger risk factor, so I felt the UCL course was not well targeted. For example, there appeared to be more content on space weather (which is still important) than on nuclear security (which I think is far more important). I contacted the department at UCL to ask what they thought about their focus, given Toby Ord's arguments, and didn't get a response. Still, there could be something useful in that course. It seemed to me that this group at UCL was more focused on geography and physical risks than on conflict, which I saw as a weakness.
I was much more impressed by the King's College London Department of War Studies, which offers at least a dozen different degrees and, as you might guess from the name, leans much more heavily towards anthropogenic risks, looking at both state and non-state actors. Of all their courses I was most interested in the Science and International Security MA; I've heard that one of the advisors at the Open Philanthropy Project did it, as did a friend I made at EA London, who recommended it. I found out more about the reading list, applied for the course, and was made an offer. But when I went to the open day I didn't find the other students on the course that motivating: I'd been working for about six years, and most of the other people had come straight from university, though there were exceptions.
At the time I was (and still am) interested in lots of things: risks from new technologies, which included things like cybersecurity, encryption, autonomous weapons, and synthetic biology.
I also thought more about what I'd be learning on the course, and I had a big update when I heard this in the 80K interview with Stuart Russell, a hugely influential figure in AI safety.
I realised that most of these courses would probably give me good exposure to the key concepts in these areas, but that, having lurked around the EA Forum, read a few books in these areas, and worked in consulting, I wasn't likely to develop any radically new skills that would differentiate me as a graduate. I also looked up graduates from the courses on LinkedIn (a great trick, this one) and found that they often went into roles more junior than the one I was in at the time. So I decided against the programme, and accepted a job offer to work at CEA instead.
A few months ago, I stumbled across the Security and Resilience: Science and Technology course at Imperial College London. It looks like a really interesting course, but I don't have the technical background to start on it.
I still want to develop technical skills, and I decided the best way to learn them was through doing some coding myself and through the UCL MSc in Computer Science, which I was very pleased to get onto and am looking forward to starting at the end of September. I think this route could have a lot more value for me: the course builds career capital which could be deployed across the many applications of ML, in information security, pandemic modelling, and lots of other things.
Other people will be in different positions to me, and I know several really smart people who've done these courses and got a lot of value out of them. Given where I was in my career, I ultimately decided they weren't worth it for me personally, but others in similar positions might have come to different conclusions.
Thanks so much for this response!
1) It does seem as though pandemics have been part of the agenda for the UCL course since 2020 (and climate change too), but you are right about it being focused more on natural risks. I will email/call their team to clarify this, and to see whether I could veer towards anthropogenic risks if I decide to take the programme.
2) I didn't know about the King's College courses. It's great to know that this seems to be a course recommended by the community, and I will look into it further. The Science and International Security MA looks intriguing at first glance.
3) Congrats on the Comp Sci MSc. MathisKirkBonde (see below) has also suggested the same graduate course as an option. I'm currently waiting to see if I was accepted into a coding bootcamp. If I like it, then I will certainly consider this as an option to pursue in the following academic year.
Thanks—happy to help.
1) You're right that pandemics and climate change are both part of the course. Taking the figures in The Precipice at face value, the biggest risks are unaligned AI and engineered pandemics. From the unit list at UCL, and the biographies of the leaders of the 'natural and anthropogenic risks' unit, Joanna Faure Walker (a geophysicist) and Punam Yadav (who focuses on humanitarian projects), I couldn't see any specific content on weaponisation and conflict, which are the topics I'm more interested in. That's not to say the course isn't valuable, and there is no single EA route, but from my own perspective there's a lot of technical background I'd like to cover. I also couldn't see anything on risks from AI anywhere in the UCL course, which seems like an oversight given today's advances in autonomous weapons.
2) Yes, I've heard good things about the course. Having worked at CEA, I think it's worth dispelling the myth that there are courses recommended by the EA community as a whole; it seems to me that EA is a lot more disparate than people think. And you might disagree with me, decide to do the course anyway, or do many other totally different things!
3) It took me about 2 years to gradually move my interests over from international relations and conflict theory to wanting to study computer science.
Applying to master's courses is quite costly in time and fees: they require a personal statement, references from your undergraduate degree, and something like a £150 application fee, per application.
Bootcamps typically have rolling applications throughout the year, whereas if you did the master's you'd probably start in September 2022, which means applying around November 2021 to June 2022 - quite a while away.
If you're submitting master's applications and considering coding bootcamps at the same time, it seems worth spending a few months first thinking about what you're interested in and what your comparative advantage might be. One way of doing this could be to try some online coding courses, and also to read some IR books (e.g. Destined for War) and maybe write up a summary and review on the Forum. Each of those would probably give you more information about each route and be a cheaper test than submitting applications.
1a) Thanks for the Precipice link, I didn't know they quantified risks like this.
1b) I've just received an email response from admissions stating they have "expertise in digital public health, climate change and catastrophic risk modelling but we don not consider existential risk in the module....it could be suggested as a topic for your MSc dissertation". So +1 regarding no content on weaponisation and conflict.
2) haha yes I’m sure I’ll do many different things!
3) Yes, I applied for the Risk programme because there were only 5 days left to apply and I wanted to reduce future regret. I won't apply to many courses. Thanks for the suggestion about writing a summary - it sounds like a good idea!
I wrote some thoughts on risk analysis as a career path in my shortform here, which might be somewhat helpful. I echo people's concern that this program focuses too heavily on non-anthropogenic risk.
I also know an EA that did this course—I’ll send her details in a PM. :)
At first glance, this strikes me as a very exciting opportunity!
I think it's difficult to find a degree that is more relevant to the EA community as a whole than this one. That said, there are degrees that are more relevant for specific cause areas. For example, an MSc in Computer Science will probably prepare you better for direct work on AI alignment than this degree would.
I would advise you to think about what cause areas you could become excited to pursue, and how this degree will help you do so. I imagine it would be a great fit for quite a few!