The purpose of preserving alignment research is not to get back to AI as quickly as possible, but to make it more likely that, when we eventually climb the tech tree again, we are able to align advanced AIs. Even if we have to reinvent a large number of technologies, having alignment research ready represents a (slightly non-standard) form of differential technological development, rather than simply an acceleration of the overall recovery.