I’m interested in hearing more about your thoughts on the Long Reflection. How likely is it to happen by default? How likely is it to produce a good outcome by default? What kind of things do you see as useful for making it more likely to happen and more likely to produce a good outcome? Anything else you want to say about it? Will you be writing it up somewhere in the near future (in which case I could just wait for that)?
The GPI Research Agenda references “Greg Lewis, The not-so-Long Reflection?, 2018” but I’m unable to find it anywhere.
ETA: I’ve been told that Greg’s article is currently in draft form and not publicly available, and that both Toby Ord’s and Will MacAskill’s upcoming books will include some discussion of the Long Reflection.