I agree that there is something very confused about worries of value drift. I tried to write something up about it before, although that didn’t land so well. Let’s try again.
I keep noticing something is confused when people worry about value drift, because to me it seems they are worried they might learn more and decide they were wrong and now want something different. That to me seems good: if you don’t update and change in the face of new information, you’re less alive and agenty and more dead and static. People often phrase this, though, as a worry that their life will change and they won’t, for example, want to be as altruistic because they are pulled away by other things, but to me this is a kind of confused clinging to what is now and expecting it to forever be. If you truly, deeply care about altruism, you’ll keep picking it in every moment, up until the world changes enough that you don’t.
I think talking in terms of incentives helps make this clearer, in that people may want to be against the world changing in ways that will make it less likely to continue into a future they like. I think it’s even more general, though, and we should be worried about something like “world state listing,” where the world fails to become more filled with what we desire and starts to change at random rather than as a result of our efforts. In this light, worry about value drift is a short-sighted way of noticing one doesn’t want the world state to list.
It seems they are worried they might learn more and decide they were wrong and now want something different… If you truly, deeply care about altruism, you’ll keep picking it in every moment, up until the world changes enough that you don’t.
I don’t object to learning more and realizing that I value different things, but there are a lot of other reasons I might end up with different priorities or values. Some of those are not exactly epistemically virtuous.
As a concrete example, I worry that living in the SF bay area is making me care less about extreme wealth disparities. I witness them so regularly that it’s hard for me to feel the same flare of frustration that I once did. This change has felt like a gradual hedonic adaptation, rather than a thoughtful shifting of my beliefs; the phrase “value drift” fits that experience well.
One solution here is, of course, not to use my emotional responses as a guide for my values (cf. Against Moral Intuitions), but emotions are a very useful decision-making shortcut and I’d prefer not to take on the cognitive overhead of suppressing them.
As a concrete example, I worry that living in the SF bay area is making me care less about extreme wealth disparities. I witness them so regularly that it’s hard for me to feel the same flare of frustration that I once did. This change has felt like a gradual hedonic adaptation, rather than a thoughtful shifting of my beliefs; the phrase “value drift” fits that experience well.
This seems to me adequately and better captured by saying the conditions of the world are different in ways that make you respond differently, in ways you wouldn’t have endorsed prior to those conditions changing. That doesn’t mean your values changed, but that the conditions to which you are responding changed such that your values are differently expressed. I suspect your values themselves didn’t change, because you say you are worried about this change in behavior you’ve observed in yourself, and if your values had really changed you wouldn’t be worried.
My values being differently expressed seems very important, though. If I feel as if I value the welfare of distant people, but I stop taking actions in line with that (e.g. making donations to global poverty charities), do I still value it to the same extent?
That said, my example wasn’t about external behaviour changes, so you probably weren’t responding with that in mind.
I’ve inarguably experienced drift in the legibility of my values to myself, since I no longer have the same emotional signal for them. I find the term “value drift” a useful shorthand for that, but it sounds like you find it makes things unclear?
My values being differently expressed seems very important, though. If I feel as if I value the welfare of distant people, but I stop taking actions in line with that (e.g. making donations to global poverty charities), do I still value it to the same extent?
Right, it sounds to me like you identify with your values in some way, like you wouldn’t consider yourself to still be yourself if they were different. That confuses things because now there’s this extra thing going on that feels causally relevant but isn’t, but I’m not sure I can hope to convince you in a short comment that you are not your values, even if your values are (temporarily) you.
Thanks Gordon; this nicely puts into words something I think about this. If a person changes their views as the result of research, introspection, or new information, this could appear, from the perspective of people still holding their former views, to be value drift. Even someone’s process of becoming altruistic could appear this way to people who hold their former values.