There’s obviously lots I disagree with here, but at bottom, I simply don’t think it’s the case that economically transformative AI necessarily entails singularity or catastrophe within 5 years in any plausible world: there are lots of imaginable scenarios compatible with the ground rules set for this exercise, and I think assigning accurate probabilities amongst them and relative to others is very, very difficult.
“Necessarily entails singularity or catastrophe”, while definitely correct, is a substantially stronger statement than the one I made. To violate the stated terms of the contest, an AGI need only contradict the premise of “transforming the world sector by sector”. An AGI would not transform things gradually, nor confine its effects to specific portions of the economy. Its impact would be broad-spectrum and immediate. Some narrow sectors would be rendered unrecognizable almost at once, and virtually every sector would be transformed drastically by five years in, and almost certainly by two years in.
An AGI with any capacity for self-improvement will not wait that long. The timescale will be months, not years, and probably weeks, not months. Even a ‘soft’ takeoff would still be faster than five years. These rules mandate not a soft takeoff, but no takeoff at all.