Indeed, the proportion of “doomers” holding those philosophical objections to longtermism should be just as high as it is among those typically considered neartermists.
I don’t think we’ll see this, largely because I expect having high AI x-risk estimates correlates with taking abstract arguments like longtermism seriously.