On some level, I get all of this, and try to think about it to inspire myself, but ultimately I find it very hard to actually be comforted by this sort of thing. I’m sort of hesitant to write about why, partly because I hope other people can find comfort in this without me Eeyoring all over the comments, and partly because, like everything I write about this topic, it is going to make me sound very weird in some ways. But it is important to how I, and maybe some other people, relate to this weird big thing when we look at it too long.
Adding these details makes everything feel even less real. There’s of course a little rational basis for this:
https://www.readthesequences.com/Burdensome-Details
But it feels even harder to grasp than I think even this suggests. There was a post on here a little while ago, which I can’t quite manage to find, that basically argued that the genre of prediction AGI risk belongs to looks really suspicious and rapture-y in a way that should maybe raise alarm bells. I’m not so sure about this; I think this genre of prediction is something of an attractor for many people, and suspicious because of it, but even that isn’t the end of the world (no pun intended). I would kind of rather that some group of smart, dedicated people occasionally throw their life’s work at a false-alarm rapture (at least if the dynamics of the relevant groups they are in are otherwise healthy and reasonable) than, well, this:
https://astralcodexten.substack.com/p/heuristics-that-almost-always-work
But this genre does make it harder for people like me to viscerally feel anything about the event at all. The most I can usually manage is seeing it as like, some sort of big digital asteroid headed for us, not as an unborn god.
When I actually do take AGI seriously emotionally, my reactions are really weird. I’m probably not alone in this, but over the last few months, starting with Yudkowsky’s doom post and escalating with all of the capabilities developments that got released, this risk suddenly felt emotionally real to me in a way it never did before, and mostly this was just devastating, probably in part because it is much easier for me to feel the digital asteroid than the unborn god. It has led me to realize that I have some really strange feelings about the whole thing.
One interesting thing I have sometimes seen people point out is that, selfishly, you really ought to hope that AGI comes sooner: if it comes during your lifetime, you have a chance of living indefinitely, and if it kills you, well, you would have died only a little while afterwards anyway. So if you hope that it comes later, because that increases the odds that it goes well, it means you are a genuinely altruistic person. I do hope that AGI comes later rather than sooner, but on reflection, this is not an altruistic feeling, just a really, really confusing one.
I would prefer that AGI arrive one day after my natural lifespan ends rather than one day before. Maybe I am just not enough of a techno-optimist transhumanist type in my heart of hearts, but it just seems so horrible to me that I might be there when everything and everyone I love ends, even if there’s a very good chance that won’t be the result and things will instead be great. I think this might be similar to how parents are able to stand the idea that their children will one day die. Maybe it is easier to cope with when they can think to themselves “but I’ll be dead by then anyway”, even if that comfort ultimately isn’t grounded in anything very different: I think if a parent learned they would actually die fifty years later than they expected, they would suddenly be much more scared of their child’s death, even if it were to happen at the same time.
I am also much more saddened by the idea that something like factory farming could be ended by everything ending, rather than by us actually fixing anything, than I am made hopeful by the idea that the future could be great. I really don’t understand any of these feelings; none of them are what I would have predicted, but they really suck, both personally and for my priorities, and I’m perversely grateful that, mostly by purposely reading less and less about AGI recently, I am managing to make it feel more unreal to myself again.
Maybe the takeaway from all of this is that I am personally a real bummer; I’m struggling with some other mental health problems right now anyway that maybe distort how I’m able to relate to this issue. Also, I probably just shouldn’t work on AI safety. But anyway, these are some of the weird ways my reactions to this issue have gone in practice.