1) You’re right that pandemics and climate change are both part of the course. Taking the figures in The Precipice at face value, the biggest risks are unaligned AI and engineered pandemics. From the unit list at UCL, and the biographies of the leaders of the ‘natural and anthropogenic risks’ unit, Joanna Faure Walker (a geophysicist) and Punam Yadav (who focuses on humanitarian projects), I couldn’t see any specific content on weaponisation and conflict, which are the topics I’m more interested in. That is not to say the course is not valuable—and there is no one EA route—but from my own perspective I think there’s a lot of technical background I’d like to cover. I also couldn’t see anything on risks from AI anywhere in the UCL course, which seems like an oversight given current advances in autonomous weapons.
2) Yes I’ve heard good things about the course. Having worked at CEA, I think it’s worth dispelling the myth that there are recommended courses by the whole EA community—it seems to me that EA is a lot more disparate than people think. And you might disagree with me, and decide to do the course anyway—or do many other totally different things!
3) It took me about 2 years to gradually move my interests over from international relations and conflict theory to wanting to study computer science.
Applying to masters courses is quite costly in time and fees—each application requires a personal statement, references from your undergraduate degree, and an application fee of something like £150.
Bootcamps typically have rolling applications throughout the year, whereas if you did the masters you’d probably start in Sep 2022, which means applying around Nov 2021–June 2022—quite a while away.
If you’re submitting masters applications and considering coding bootcamps at the same time, it seems to me worth spending a few months first thinking about what you’re interested in and what your comparative advantage might be. One way of doing this could be to try some online coding courses, and also read some IR books (e.g. Destined for War) and maybe write up a summary and review on the forum. Each of those would probably give you more information about each route and be a cheaper test than submitting applications.
1a) Thanks for The Precipice link, I didn’t know they quantified risks like this.
1b) I’ve just received an e-mail response from admissions stating they have “expertise in digital public health, climate change and catastrophic risk modelling but we don not [sic] consider existential risk in the module....it could be suggested as a topic for your MSc dissertation”. So that’s +1 regarding no content on weaponisation and conflict.
2) haha yes I’m sure I’ll do many different things!
3) Yes, I applied for the Risk programme because there were only 5 days left to apply and I wanted to reduce future regret. I won’t apply to many courses. Thanks for the suggestion about writing a summary—sounds like a good idea!
Thanks—happy to help.