Interstellar civilization operating on technology indistinguishable from magic
‘Indistinguishable from magic’ is a huge overbid. No-one’s talking about FTL travel. There’s nothing in current physics that prevents us building generation ships given a large enough economy, and there are a number of options consistent with known physics for propelling them, some of which have already been developed, others of which are tangible but not yet in reach, others of which get pretty outlandish.
I don’t see why nukes and pandemics and natural disaster risk should be approximately constant per planet or other relevant unit of volume for small groups of humans living in alien environments
Pandemics seem likely to be relatively constant. Biological colonies will have strict atmospheric controls, and might evolve (naturally or artificially) to be too different from each other for a single virus to target them all even if it could spread. Nukes aren’t a threat across star systems unless they’re accelerated to relativistic speeds (and then the nuclear-ness is pretty much irrelevant).
the risk of human extinction (as opposed to significant near-term utility loss) from pandemics, nukes or natural disasters is already zero
I don’t know anyone who asserts this. Ord and other longtermists think it’s very low, though not because of bunkers or vaccination. I think that the distinction between killing all and killing most people is substantially less important than those people (and you?) believe.
the AGI that destroys humans after they acquire interstellar capabilities is no more speculative than the AI that destroys humans next Tuesday
“Indistinguishable from magic” is an Arthur C. Clarke quote about “any sufficiently advanced technology”, and I think you’re underestimating the complexity of building a generation ship and keeping it operational for hundreds, possibly thousands of years in deep space. Propulsion is pretty low on the list of problems if you’re skipping FTL travel, though you’re not likely to cross the galaxy with a solar sail or a 237 mN thruster using xenon as propellant. (FWIW I actually work in the space industry and spent the last week speaking with people about projects to extract oxygen from lunar regolith and assemble megastructures in microgravity, so it’s not like I’m just dismissing the entire problem space here)
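To put a rough number on why that thruster class doesn’t get you there: a back-of-envelope sketch, assuming a (generously light) 1,000-tonne ship, constant thrust, no deceleration, and ignoring propellant mass entirely (the Tsiolkovsky rocket equation would make this far worse). The ship mass and trip model are my illustrative assumptions, not figures from the discussion above.

```python
import math

THRUST_N = 0.237                  # the 237 mN thruster mentioned above
SHIP_MASS_KG = 1.0e6              # assumed: a very light 1,000-tonne ship
LIGHT_YEAR_M = 9.4607e15
DISTANCE_M = 4.37 * LIGHT_YEAR_M  # Alpha Centauri, ~4.37 light-years

accel = THRUST_N / SHIP_MASS_KG   # ~2.4e-7 m/s^2

# Idealised kinematics: d = 0.5 * a * t^2  =>  t = sqrt(2d / a)
trip_s = math.sqrt(2 * DISTANCE_M / accel)
trip_years = trip_s / (365.25 * 24 * 3600)

peak_v = accel * trip_s           # sanity check: stays non-relativistic

print(f"acceleration: {accel:.2e} m/s^2")   # ~2.37e-07 m/s^2
print(f"trip time:    {trip_years:,.0f} years")
print(f"peak speed:   {peak_v / 2.998e8:.4f} c")
```

Even under these best-case assumptions the trip takes on the order of tens of thousands of years at well under 1% of lightspeed, which is the point: propulsion choices for interstellar distances are a different regime from station-keeping ion thrusters.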
I think that the distinction between killing all and killing most people is substantially less important than those people (and you?) believe.
I’m actually in agreement with that point, but more due to putting more weight on the first 8 billion than on the orders of magnitude more hypothetical future humans. (I think in a lot of catastrophe scenarios technological knowledge and ambition rebounds just fine eventually, possibly stronger)
This is an absurd claim.
Why is it absurd? If humans can solve the problem of sending a generation ship to Alpha Centauri, an intelligence smart (and malevolent) enough to destroy 8 billion humans in their natural environment surely isn’t going to be stymied by the complexities involved in sending some weapons after them or transmitting a copy of itself to their computers...