I thought the whole argument here was "how much value do we lose if (presumably misaligned) AI takes over."
I think the key question here is: compared to what? My position is that we lose a lot of potential value both from delaying AI and from having unaligned AI, but it's not a crazy-high reduction in either case. In other words, the two outcomes are pretty comparable in terms of lost value.
Ranking the options in rough order (taking up your offer to be quantitative):
Aligned AIs built tomorrow: 100% of the value from my perspective
Aligned AIs built in 100 years: 50% of the value
Unaligned AIs built tomorrow: 15% of the value
Unaligned AIs built in 100 years: 25% of the value
Note that I haven’t thought about these exact numbers much.
What drives this huge drop? A naive utility calculation would put it very close to 100%. (Do you mean "aligned AIs built in 100 years, if humanity still exists by that point", which includes extinction risk before 2123?)
I attempted to explain the basic intuitions behind my judgement in this thread. Unfortunately, it seems I did a poor job. For the full explanation you'll have to wait until I write a post, if I ever get around to it.
The simple, short, and imprecise explanation is: I don't value humanity as a species nearly as much as I value the people who currently exist, (something like) our current communities and relationships, our present values, and the existence of sentient and sapient life having positive experiences. Much of this will be gone after 100 years.