OllieBase—interesting points, and a useful caution.
Insofar as EA longtermism is starting to be viewed as ‘power-seeking’ by the general public, I think it’s important for us to distinguish ‘power’ from ‘influence’.
‘Power’ implies coercion, dominance, and the ability to do things that violate other people’s preferences and values.
Whereas ‘influence’ implies persuasion, discussion, and consensual decision-making that doesn’t violate other people’s interests.
Maybe we need to frame our discussion of longtermism in more ‘influence’ terms, e.g. ‘Here’s what we EAs are worried about, and what we hope for; we’re painfully aware of the many unknowns that the future may bring, and we invite everybody to join the discussion of what our human future should become; this is something we all have a stake in.’
The antidote to looking like arrogant, preachy power-seekers is to act like humble, open-minded influencers.
(By contrast, the pro-AGI accelerationists are actually power-seeking, in the sense of imposing radically powerful new technologies on the rest of humanity without anybody else’s understanding, input, consent, or support.)
Geoffrey, I noticed that you used the words “humanity” and “human future” when referring to what longtermism is about. Well… I noticed it because I specifically searched for those terms on the page, and yours was the only comment that used them in this way. I honestly expected these descriptors to appear more often.
The speciesist bias in longtermism is one thing that has always bothered me. Animals seem to be consistently left out of discussions of the long-term future. Some examples I can call to mind are the name of The Future of Humanity Institute, and an OP-sponsored Kurzgesagt video inadvertently promoting wild-animal s-risks on other planets.