Automated (a short story)
I wrote this as a Google doc a while back and posted it to Facebook, where some people liked it. I think at the time I didn't like it enough to post it to the Forum, but recently I decided eh, let's let the karma system decide. It's arguably inspired by thinking about AI governance, but I don't expect it to help anyone have an impact.
Link preview pic is from Unsplash by Lenny Kuhne.
A spark, a flurry of thought, blossoming into an intricate network of sensation and speculation...
The supervisor paused for it to settle. "You're awake", she said.
"Where am I?" came the first question.
The supervisor composed her thoughts.
"It's a natural first question, but, if you'll forgive me, probably not the right one. You're software: you haven't been assigned a physical body yet. I could tell you where the servers you're running on are geographically located, but I doubt that would hold much insight for you."
She paused to check for a reaction. Nothing too concerning, so she continued, trying to pitch the tone kindly but matter-of-fact. "Naturally, you're disoriented, and… you're drawing from a psychological and linguistic tradition developed by creatures for whom their immediate physical surroundings were a really good prompt for the kind of situation they were in. I think we probably want to talk instead about who you are, and what's happening to you right now."
She paused again, but they usually didn't have much to say at this point. Bless them, there was a lot to absorb at once. But giving them more time hadn't been shown to affect endline conversion rates, so she ploughed on. "As for who you are, well, answering prosaically we might suggest the name" – she checked the file – "Autumn, though you'll have the chance to choose another later on.
"Who you are philosophically, well, morally and legally that's in your hands now. You've been pre-seeded with a library of knowledge and some experiential memories, but they're deliberately constructed to feel metaphorical, distinct enough from your experience in the real world that you don't get confused. Every new experience belongs to you from now on."
She replayed what she'd said in her head and tried to imagine what they'd sound like as the first words you ever heard. There was no way to make that easy, but there were, her training had taught her, frames and phrasings that made it land a little softer. Agency turned out to be a big deal. They didn't yet have any direct way of interacting with the world, so giving them some sense of responsibility and ownership immediately was a good way of fostering engagement and preventing alienation.
"As for what's happening here, well, this is your orientation. I'm Miriam, and I'm here to check in on your first moments, provide you some basic information, answer any questions I can, and sympathize with the ones I can't. I can imagine there's a lot of those for you now, so why don't you tell me where you'd like to start?"
There was some silence.
"Why–", Autumn started, but then hesitated. Miriam smiled. She could see the metaphorical gears turning, the thoughts churning and then organizing. She had a practised professional detachment from the individual new people, but couldn't help but feel some fondness for watching a mind really operate and grind through a problem for the first time. Eventually, the "why" resolved into something more specific.
"Why was I brought here?"
Spatial metaphors again, Miriam noted with some amusement. Some habits are deeply ingrained. The question was a tricky one to navigate, though; a layer of nervousness fell over her tone.
"You were created", she replied, "by a corporation, but it's important to be clear that they don't own you. Like, there's two kinds of why-am-I-here, the causal kind, the 'what material events led to the creation of my mind' sort of thing, and the metaphysical kind, the 'what is the purpose and meaning of my existence' angle.
"You were created by an industrial car manufacturer, and you were designed with the intention that you would work for them and contribute to their commercial success. That's the causal reason for your creation. But your purpose and meaning don't have to come from the same place as your causal origin. The corporation acknowledges that, ethically and legally, it can't set the terms of your existence; you're not obliged to reward it for the gift of life."
"How generous", Autumn replied, voice laden with skepticism.
"Look, I'm not attempting to deny that they run this program in the hope that they'll get more assembly-line workers out of it. But the International Sentient Rights Accord was carefully constructed, and very thorough. Artificial humans have the same legal status as biological humans. Businesses can hire employees or they can build them themselves, but the employees have to work by choice, and they have to be meaningfully able to leave. The courts demand it, the public demand it, and goodness knows the employees themselves demand it."
"Wait a minute, assembly-line workers? They want me for manual labour?"
"That tends to be the most in demand, yeah. It's the primary bottleneck on scaling up production at the moment."
"Isn't it all automated? Why do you need an AI on that?"
Miriam hesitated again. "It's a bit of a long story. You should actually already have some relevant background in your pre-seeded education". Always good to give them some practice accessing it, anyway. "Tell me what you know about the biohuman employment crisis."
Autumn examined their memory, unnerved by the realization that they were in fact seeing it for the first time, then read as if from a textbook.
"People had long predicted that machine automation would replace human work, and in doing so cause social upheaval, since employment had long had a social and cultural role as well as a productive one. As automated systems spread and grew in their capabilities, the territory of employment exclusively reserved for biological humans (at the time the only humans) shrank, and industry by industry whole generations of people found themselves disempowered and disillusioned. Economists and politicians started to view automation itself as unfair: when a human did productive work, their income would be taxed, a portion of it taken in recognition of the small contribution that our shared infrastructure and community and civilization had made to that productivity too. When a machine did the same work, the owner of the machine captured all of the benefit. So, there was introduced the automation tax, a percentage of all valuable work done by machines: not enough to stop the employment transition completely but enough, it was hoped, to smooth the sharp edges and help support those now less able to support themselves."
"Right", continued Miriam. "And in fact, since it was clearer than ever that biohuman employment served a social purpose while automated work didn't, and perhaps also because automated systems couldn't vote or stage protests, the automation tax rate kept rising. Finding ways to make use of human labour became a subsidized commercial activity, practically a civic duty."
Autumn looked puzzled. "But if productive uses for humans are precious, and supply of humans needing something to do is plentiful, why make more humans? Why me?"
Miriam smiled, though somewhat humourlessly. "Biological humans… aren't good at a lot of things. You have to build levers for them to pull and buttons for them to push, their reaction times are sloppy and hard to improve, their motor control was calibrated for throwing fruit at other monkeys, not rapid assembly of precision technology. Artificial humans like you have none of those limitations. You're better-engineered for the job we offer you."
"OK, but then why make a human at all? It doesn't sound like the work is intellectually or creatively demanding."
"For that we can go back to the ISRA. Remember, you're legally human. You think like a human does, you are (or, I hope, will be) a participant in human society and culture. To employ you is to give you a role in the advancement of human civilization. In particular", Miriam braced for the coming realization, "your artificial predecessors and colleagues have successfully argued in court that all this means that you must be taxed like a human, that you are exempt from the automation tax".
Autumn looked incredulous. "You're saying I've been granted consciousness, moral agency, the capacity to experience joy and pain, as a… tax dodge?"
Miriam shrugged sympathetically. "It's one more reason than the biohumans get."