Thanks very much for writing this; I think it is important for EAs to become more aware of the Straussian and Kabbalistic messages which so suffuse our world.
However, I am skeptical of your analysis. While this was a very impressive effort for your second post on the forum, I think you fatally misinterpret the evidence here. This is not a story of her personal journey into EA, but a pre-mortem of existential risk.
You correctly start with the chorus:
And I fell from the pedestal
Right down the rabbit hole
Long story short, it was a bad time
Pushed from the precipice
Clung to the nearest lips
Long story short, it was the wrong guy
This seems to me to be very clearly a direct reference not to the rabbit hole of reading EA literature, but to existential risk due to transformative AI.
At present humanity exists on a pedestal—we are richer and more numerous and more powerful than ever before, far exceeding any other species, and the whole light cone awaits us. But as mathematicians—like Lewis Carroll—develop stronger AI systems, alignment failure will result in the world getting weirder and weirder, as if falling down the rabbit hole. Humanity’s long future story of learning, growth and intergalactic colonisation is cut dramatically short (a very bad time indeed!) as these AIs push us over the edge of the precipice that Toby uses as an analogy for existential risk. We were blind to this risk because we clung to the words coming out of the lips of the wrong guy, rather than listening to sages like Toby, Nick and Eliezer.
This message is reinforced in the bridge:
Now I’m all about you
I’m all about you, ah
Yeah, Yeah
I’m all about you, ah
Yeah Yeah
This is actually a serious lament. Gone is the time of value diversity, where humans sought pleasure and art and friendship and love and wisdom and honour and joy and freedom and all the other good things in the world. Now this striving for the good has been replaced with a single goal that the AI is relentlessly optimising.
The verse continues in this theme:
Fatefully
I tried to pick my battles ‘til the battle picked me
Misery
Like the war of words I shouted in my sleep
And you passed right by
I was in the alley, surrounded on all sides
The knife cuts both ways
If the shoe fits, walk in it ’til your high heels break
You are correct that the word ‘Fatefully’ references Schell’s book, but draw the wrong lesson from there on. In the past Taylor picked her own battles—she could focus on her own goals. But with the rise of TAI, this liberty was taken away from her, as she had to focus on protecting humanity. Alas, this was utterly futile, as she was too late—she should not have indulged in the belief she could choose her battles. This leads to misery, as her too-little-too-late efforts were as irrelevant as dreams, the development of AI passing right by. Eventually she succumbs to the robots, surrounded on all sides as they cut into her to harvest trace amounts of minerals from her body.
Moving on to the next verse:
Actually
I always felt I must look better in the rear view
Missing me
At the golden gates they once held the keys to
When I dropped my sword
I threw it in the bushes and knocked on your door
And we live in peace
But if someone comes at us, this time, I’m ready
It is clear why she looks better in the rear view: in the past she was a beautiful and successful singer. Now her constituent atoms are being used for paperclip production. She does live in peace now—at least she has no further woes—but the last line is wistful. It is vacuously true: a material implication holds whenever its antecedent is false, and there is no-one left to come at her, because everyone is dead.
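For readers less versed in classical logic, the vacuous truth of Taylor’s final line can be checked mechanically: the material conditional is true in every case where the antecedent is false. A minimal sketch (the helper `implies` is my own illustrative definition, not anything from the song):

```python
# Material implication P -> Q, defined classically as (not P) or Q.
def implies(p: bool, q: bool) -> bool:
    return (not p) or q

# Print the full truth table.
for p in (False, True):
    for q in (False, True):
        print(f"P={p!s:5} Q={q!s:5}  P->Q = {implies(p, q)}")

# With everyone dead, "someone comes at us" is false, so
# "if someone comes at us, this time, I'm ready" holds no matter what.
assert implies(False, False) and implies(False, True)
```

Both rows with a false antecedent come out true, which is exactly why the line is wistful rather than reassuring.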
The next verse continues to describe the actions of the TAI.
No more keepin’ score
Now I just keep you warm (Keep you warm)
No more tug of war
Now I just know there’s more (Know there’s more)
No more keepin’ score
Now I just keep you warm (Keep you warm)
And my waves meet your shore
Ever and evermore
There is no more keeping score because everything she once cared about has been destroyed, reset to zero, and there is no-one left to even observe this. The only thing being counted now is paperclips. As the AI is a singleton there is no more tug-of-war between different goals; only the relentless production of metal stationery, of which there can always be more.
Most tragic is when we learn what Taylor’s body is being used for—it appears her cells were fed into a furnace to power the factories that now cover the globe. Her waves—the waves of heat produced by the combustion of her body—meet the water that is being boiled to turn the turbines. And this will continue for ever and evermore, because the AI is a singleton, and there is no way to change course.
In the next verse Taylor provides advice to her past self:
Past me
I wanna tell you not to get lost in these petty things
Your nemeses
Will defeat themselves before you get the chance to swing
And he’s passing by
Rare as the glimmer of a comet in the sky
And he feels like home
If the shoe fits, walk in it everywhere you go
She wants to tell her not to get tied up in petty things—namely anything other than AI alignment. After all, the nanobots will get her enemies before long. While she is lost in these petty things, AI development is passing her by. It is rare in the most important sense—it represents a hinge of history. Alas, because of her comfort and status quo bias she failed to see the danger, and now the silicon shoe she fits into perfectly (due to the disassembly of her body at the molecular level) controls her every step.
Alas, the final lines are the most tragic.
Climbed right back up the cliff
Long story short, I survived
Like Winston in 1984, she has come to love Big Brother, and sees the relentless march of paperclips as the natural continuation of the human story.