The long reflection as I remember it doesn’t have much to do with AGI destroying humanity, since on most timelines we expect the question of AGI to be resolved within the next century or two, whereas the long reflection was something Toby envisaged taking multiple centuries. The same probably applies to whole brain emulation.
This seems like quite an important problem for the long reflection case—it may be so slow a scenario that none of its conclusions will matter.