Collection of sources that are highly relevant to the idea of the Long Reflection
The Precipice—Toby Ord, 2020 (particularly chapter 7 and some of its endnotes)
Toby Ord on the 80,000 Hours Podcast − 2020
Will MacAskill on the 80,000 Hours Podcast − 2018
Will MacAskill on the AI Alignment Podcast − 2018 (see also Rohin Shah’s summary and commentary)
Cause prioritization for downside-focused value systems—Lukas Gloor, 2018 (I think)
AI Alignment Podcast: An Overview of Technical AI Alignment in 2018 and 2019 with Buck Shlegeris and Rohin Shah—FLI, 2020
Research agenda—Global Priorities Institute, 2019
Toby Ord’s interview with the LA Review of Books − 2020 (mostly repeats things from The Precipice)
Crucial questions for longtermists—Michael Aird (me), work-in-progress (this contains a few questions related to the Long Reflection, and links to a doc with some more relevant sources)
This comment exchange between Lukas Gloor and Michael Aird (me) − 2020
This post I’m commenting on (though it largely just quotes some of the above sources)
I’m also working on some other relevant posts, which I could share drafts of on request
(This comment differs from the post in three ways: I'll keep this comment up to date; it just lists sources without including quotes; and it omits some of the less relevant sources, since there's now more work on the Long Reflection than there was when I made this post.)