Good to see more people thinking about this, but the vocabulary you say is needed already exists—look for things talking about “Global Catastrophic Risks” or “GCRs”.
A few other notes:
It would help if you embedded the images. (You just need to copy the image address from imgur.)
"with a significant role played by their ." ← ?
"the ability for the future of our civilisation to deviate sufficiently from our set of values as to render this version of humanity meaningless from today's perspective, similar to the ship of Theseus problem." ← I don't think that's a useful comparison.
Thanks for the comment! I am aware of the GCRI and GCRs, but I don't see the term used much, and (in both cases) it seems to get conflated with X-risks. I haven't addressed this at all in the piece, though, so I will add an edit.
Thanks for catching the typo. I’ve been trying to embed the images but it hasn’t been working, so I’m contacting support for help.
The analogy I was making was that socially held values are liable to change and (usually) improve over time. A change in any single value might not disqualify a future civilisation from being counted as valuable by us today, but at some point in the future there may be sufficient drift to make that claim. This may happen gradually and piecemeal, as in the ship of Theseus. The full thought experiment also mentions restoration of the rotting parts and asks whether the restored vessel is also the ship of Theseus, similar to a Renaissance period.
These are not the same thing. A GCR is just anything that's bad on a massive scale; civilization doesn't have to collapse.
There are a variety of definitions, but most of the GCR literature is in fact concerned with collapse risks. See Nick Bostrom’s book on the topic, for example, or Open Philanthropy’s definition: https://www.openphilanthropy.org/research/cause-reports/global-catastrophic-risks/global-catastrophic-risks