Even just a few decades ago, a longtermist altruist would not have thought of risk from AI or synthetic biology, and wouldn’t have known that they could have taken action on them.
Minor point, but I think this is unclear. On AI see e.g. here. On synbio I’m less familiar, but I’m guessing someone more than a few decades ago was able to think thoughts like “Once we understand cell biology really well, it seems like we might be able to engineer pathogens much more destructive than those served up by nature.”
+1. I don’t know the intellectual history well but the risk from engineered pathogens should have been apparent 4 decades ago in 1975 if not (more likely, IMO) earlier.
A fairly random sample of writing on the topic:
Jack London’s 1910 short story “An Unparalleled Invasion” [CW: really racist] imagines genocide through biological warfare and the possibility that a “hybridization” between pathogens created “a new and frightfully virulent germ” (I don’t think he’s suggesting the hybridization was intentional but it’s a bit ambiguous).
The possibility of engineering pathogens was seriously discussed 4 decades ago at the Asilomar Conference in 1975.
There’s a 1982 sci-fi book by a famous writer where a vengeful molecular biologist releases a pathogen engineered to be GCR-or-worse.
In 1986, a U.S. Defense Department official was quoted saying “The technology that now makes possible so-called ‘designer drugs’ also makes possible designer BW.”
In 2000 (admittedly just 2 decades ago), ~x-risk from engineered pathogens was explicitly worried about in “Why the Future Doesn’t Need Us.”
Szilard anticipated nuclear weapons (and launched a large and effective strategy to get the liberal democracies to acquire them ahead of totalitarian states, although with regret), and was also concerned about germ warfare (along with many of the anti-nuclear scientists). See this 1949 story he wrote. Szilard seems very much like an agenty, sophisticated anti-xrisk actor.
Plus the Soviet bioweapons program was actively working to engineer pathogens for enhanced destructiveness during the 70s and 80s using new biotechnology (and had been using progressively more advanced methods through the 20th century).
Huh, thanks for the great link! I hadn’t seen that before, and had been under the impression that though some people (e.g. Good, Turing) had suggested the intelligence explosion, no-one really worried about the risks. Looks like I was just wrong about that.
Great post!