To me that doesn’t sound very different from “I want a future with less suffering, so I’m going to evaluate my impact based on how far humanity gets toward eradicating malaria and other painful diseases”. That’s consistent with my views, I guess, but it doesn’t sound like most long-termists I’ve met.
Well, it wouldn’t work if you said “I want a future with less suffering, so I’m going to evaluate my impact based on how many paper clips exist in the world at a given time”. Bostrom picks collaboration, technology, and wisdom because he thinks they are the most important indicators of a better future and reduced x-risk. You’re welcome to suggest other parameters for the evaluation function, of course, but not every parameter works. If you read the chess analogy in the link I posted, it will become much clearer how Bostrom is thinking about this.
(If anyone reading this comment knows of evolutions in Bostrom’s thought since this lecture, I would very much appreciate a reference.)
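To make the chess analogy concrete, here’s a rough sketch in Python. The parameter names, weights, and world-state values are my own illustration, not anything from the lecture: the idea is just that, like a chess engine scoring a midgame position it can’t search to checkmate, you score the present world-state by proxies you believe correlate with a good endgame. The function is only as good as the parameters you pick.

```python
# Illustrative sketch of the "evaluation function" idea from the chess analogy.
# All names, weights, and values below are hypothetical, not Bostrom's actual model.

def evaluate_trajectory(state, weights):
    """Score a world-state as a weighted sum of chosen indicators,
    the way a chess engine scores a position it can't search to the end."""
    return sum(weights[k] * state.get(k, 0.0) for k in weights)

# Bostrom-style parameters: proxies believed to track a good long-run future.
bostrom_weights = {"collaboration": 1.0, "technology": 1.0, "wisdom": 1.0}

# A badly chosen parameter: this function happily rewards paper-clip production.
paperclip_weights = {"paperclip_count": 1.0}

world = {"collaboration": 0.6, "technology": 0.8, "wisdom": 0.4,
         "paperclip_count": 1e9}

print(evaluate_trajectory(world, bostrom_weights))   # tracks what we care about
print(evaluate_trajectory(world, paperclip_weights)) # huge score, irrelevant to suffering
```

Both functions are well-defined; the difference is entirely in which parameters were judged to indicate a better future, which is the point about not every parameter working.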