1) How long was it between your reading the sequences and your starting research on the value alignment problem?
2) What portion of your time will now be spent on technical research? Also, what is Eliezer Yudkowsky spending most of his work-time on? Is he still writing up introductory stuff like he said in the HPMOR author notes?
3) Are there any unstated prerequisites for researching the value alignment problem that aren't in MIRI's research guide? E.g., Real Analysis or particular kinds of programming ability.