Agreed, very important in my view! I’ve been meaning to post a very similar proposal with one important addition:
Anthropogenic causes of civilizational collapse are (arguably) much more likely than natural ones, and these anthropogenic causes are enabled by technology. If we preserve an unbiased sample of today’s knowledge, or even just the knowledge we consider to have been most important, it may steer the next cycle of our civilization right into the same kind of catastrophe again. And if we make the information particularly durable, we may even steer all future cycles of our civilization into the same kind of catastrophe.
The selection of the information needs to be very carefully thought out. Maybe only information on thorium reactors rather than uranium ones; only information on clean energy sources; only information on proof of stake; only information on farming low-suffering food; no prose or poetry that glorifies natural death or war; etc.
I think this is also something that none of the existing projects takes into account.
My comment from another thread applies here too:
Relatedly, see this post about continuing AI Alignment research after a global catastrophic risk (GCR).
Very good!