Seth is a very smart, formidably well-informed and careful thinker—I’d highly recommend jumping on the opportunity to ask him questions.
His latest piece in the Bulletin of the Atomic Scientists is worth a read too. It’s on the “Stop Killer Robots” campaign. He shares Stuart Russell’s (and others’) view that autonomous weapons are a bad road to go down, and also presents the campaign as a test case for existential risk—a pre-emptive ban on a dangerous future technology:
“However, the most important aspect of the Campaign to Stop Killer Robots is the precedent it sets as a forward-looking effort to protect humanity from emerging technologies that could permanently end civilization or cause human extinction. Developments in biotechnology, geoengineering, and artificial intelligence, among other areas, could be so harmful that responding may not be an option. The campaign against fully autonomous weapons is a test-case, a warm-up. Humanity must get good at proactively protecting itself from new weapon technologies, because we react to them at our own peril.”
http://thebulletin.org/stopping-killer-robots-and-other-future-threats8012