RE: “I am curious, why do you care about Big Things without small things? Are Big Things not underpinned by values of small everyday things?”
Perhaps it has to do with the level of ambition. Let's talk about a particular value to narrow down the discussion. Some people see "caring for all sentient beings" as an extension of empathy. Others see it as a logical extension of a principle of impartiality or equality for all. I think I am more in this second camp. It's not that I care about invertebrate welfare, for example, because I am particularly empathetic towards invertebrates. Most people find bugs a bit icky, particularly under a magnifying glass, which turns off their empathy.
Rather, I care because they are sentient beings capable of suffering, which means the same arguments for why we should care about people (and their wellbeing/interests/preferences) also apply to them. And caring about, say, invertebrate welfare requires using reason in the service of impartiality, which might sometimes lead you to de-prioritize friends and family.
Secondly, I also have a strong curiosity about understanding the universe, society, and so on, which makes me feel like I'm wasting my time in social situations with friends and family when the conversation topics are a bit trivial.
As I repeat a few times throughout the post, I realize I might be a bit of a psychological outlier here, but I hope people can also see why this perspective might be appealing. Most people compartmentalize their views on AI existential risk to a degree that I'm not sure makes sense.