I think a good TV series could be one where each episode ends with AI Alignment failing and the world ending, but there is a gradual shift toward getting closer to solving things (or at least things failing in less obvious ways). Maybe there could be a plot device whereby a superintelligent AI is running multiple simulations, and each episode is a simulation run. Perhaps each one could start further back in history, with the Alignment problem becoming well known and acted upon earlier and earlier. Or there is somehow one character who is common to all simulation runs (episodes) and accumulates knowledge about what doesn’t work. But generally, the world ending every episode is there to emphasise how difficult the problem is. Perhaps there could also be some Dying with Dignity involved. Maybe in the 7th season or something they will finally get it right (coinciding with similar developments happening in the real world? [One can hope.]). Or there could be a series with each episode depicting a crucial consideration for why alignment might happen by default, or become moot.
Needs some AGI x-risk stuff. I liked neXt.
I will check out neXt, thanks. I like the idea of reboots, very Edge Of Tomorrow.