I have added a note to my RAISE post-mortem, which I’m cross-posting here:
Edit November 2021: there is now the Cambridge AGI Safety Fundamentals course, which promises to be successful. It is enlightening to compare that project with RAISE. Why is that one succeeding where this one did not? I’m quite surprised to find that the answer isn’t so much about more funding, more senior people to execute it, more time, etc. They’re simply using existing materials instead of creating their own. This makes it orders of magnitude easier to produce the thing: you can just focus on the delivery. Why didn’t I, or anyone around me, think of this? I’m honestly perplexed. It’s worth thinking about.
Yeah, I also feel confused about why I didn’t have this thought when talking to you about RAISE.
Most proximately, AGI Safety Fundamentals uses existing materials because its format is based on the other EA university programs; and also because I didn’t have time to write (many) new materials for it.
I think the important underlying dynamic here is starting with a specific group of people with a problem, and then making the minimum viable product that solves their problem. In this case, I was explicitly thinking about what would have helped my past self the most.
Perhaps I personally didn’t have this thought back in 2019 because I was still in “figure out what’s up with AI safety” mode, and so wasn’t in a headspace where it was natural to try to convey things to other people.