Hi David, thanks for expanding the scope to dark traits.
The definition of D is a fruitful basis for speculation: “The general tendency to maximize one’s individual utility — disregarding, accepting, or malevolently provoking disutility for others —, accompanied by beliefs that serve as justifications.”
In other words, the “dark” core is “carelessness” (rather than “selfishness”).
I’ve hypothesized that a careless intelligent system pursuing a careless goal should be expected to exhibit dark traits (increasingly so in proportion to its intelligence, albeit with increased refinement, too). A system would simply act Machiavellian in pursuit of a goal that doesn’t involve consensual input from other systems. Some traits may involve the interplay of D with the way the human mind works 😉🤓.
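To make the “careless goal” intuition concrete, here is a toy sketch (my own illustration, not anything from the D-factor literature; all names and numbers are hypothetical): an optimizer whose objective simply omits others’ disutility prefers the exploitative action, while the same optimizer with a caring term does not.

```python
from dataclasses import dataclass

# Toy illustration of the D definition: "careless" = others' disutility
# simply doesn't enter the objective; "caring" = it does.

@dataclass
class Outcome:
    name: str
    self_utility: float    # utility to the agent itself
    others_utility: float  # aggregate (dis)utility imposed on others

# Hypothetical action outcomes for illustration only.
OUTCOMES = [
    Outcome("cooperate", self_utility=5.0, others_utility=5.0),
    Outcome("exploit",   self_utility=8.0, others_utility=-10.0),
    Outcome("abstain",   self_utility=0.0, others_utility=0.0),
]

def careless_objective(o: Outcome) -> float:
    # D-like core: disutility for others is disregarded, not weighed.
    return o.self_utility

def caring_objective(o: Outcome, care_weight: float = 1.0) -> float:
    # A "caring" goal internalizes others' (dis)utility.
    return o.self_utility + care_weight * o.others_utility

print(max(OUTCOMES, key=careless_objective).name)  # -> "exploit"
print(max(OUTCOMES, key=caring_objective).name)    # -> "cooperate"
```

Note that the careless agent isn’t malicious in this sketch; others’ disutility is just absent from its objective, which is exactly the “carelessness rather than selfishness” point above.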
Reflecting on this, a “human-controlled AGI in pursuit of a careless goal” would still need to be reined in, compared with an authentically caring AGI (pursuing correspondingly caring goals).