Thanks for commenting!
In other words, there seem to be values that are more related to executive functions (e.g. self-control) than to affective states that feel good or bad? That seems like a plausible possibility.
There is a personality scale called the ANPS (Affective Neuroscience Personality Scales) that has been correlated with the Big Five personality traits. Researchers found that conscientiousness wasn’t correlated with any of the six affects the ANPS measures, while each of the other Big Five traits was correlated with at least one of them. So conscientiousness seems related to what you talk about (values that don’t come from affects). But at the same time, there is research on how prone conscientious people are to experiencing guilt, and it found that conscientiousness was positively correlated with guilt-proneness.
So it seems that guilt is an experience of responsibility that differs in some way from the affective states Panksepp talked about. And it’s related to conscientiousness, which in turn could be related to the ethical and philosophical values you talked about and to executive functions.
Hm, I wonder if AIs should have something akin to guilt. That may lead to AI sentience, or it may not.
Bibliography

Fayard, J. V., et al. (2012). Uncovering the affective core of conscientiousness: the role of self-conscious emotions. Journal of Personality. https://pubmed.ncbi.nlm.nih.gov/21241309/

Barrett, F. S., et al. (2013). A brief form of the Affective Neuroscience Personality Scales. Psychological Assessment. https://pubmed.ncbi.nlm.nih.gov/23647046/
Edit: I must say, I’m embarrassed by how much these comments of mine go by “This makes intuitive sense!” logic instead of rigorous reviews of scientific studies. I’m embarrassed that my comments have such a low epistemic status. But I’m glad that at least some EAs found this idea interesting.
Re the edit: you should definitely not feel embarrassed. A forum comment will often be a mix of a few sources and intuition rather than a rigorous review of all available studies. I don’t think that necessarily means low epistemic status, especially when the purpose is to explore an idea rather than, say, to make a call for funding (which would require a higher standard of evidence). Not all EA discussions are literature reviews; otherwise chatting would be so cumbersome!
I’d recommend using your studies to explore these and other ideas! Undergraduate studies are a wonderful time to soak up a ton of knowledge, and I look fondly upon mine—I hope you’ll have a similarly inspiring experience. Feel free to shoot me a pm if you ever want to discuss stuff.