Re WWOTF: You can (and should) think that there are huge amounts of value at stake in the future, and even that there is much more value at stake in the future than in the present century, without thinking that value is linear in the number of happy people. That weakens the case a bit, but nowhere near enough for longtermism not to go through.
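A toy illustration of that point (my own numbers, not anything from the book): suppose value grows with the square root of the number of happy people rather than linearly,

    V(n) = \sqrt{n}
    \frac{V(10^{30})}{V(10^{10})} = \frac{10^{15}}{10^{5}} = 10^{10}

so a future with 10^30 happy lives still comes out ~10^10 times as valuable as a present-century-scale population of 10^10. The astronomical stakes survive quite a lot of diminishing returns.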
Sure, you could have a view on which it’s great to have 10^12 people but no more than that, though that seems like a really weird thing to have written in the stars. Or a view on which all that matters is creating the Machine God, so we haven’t attained any value yet. But that doesn’t seem great either.
Do you have a gloss on a kind of view that threads the needle nicely without being too crazy, even if it doesn’t ultimately withstand scrutiny?
How differently valuable do you think having lots of mostly or entirely identical future lives is, compared with having vastly different positive lives? (Because that would make for a reasonable view on which a more limited number of future people can saturate the possible future value.)
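To make the saturation idea concrete, here is a toy value function of the kind I have in mind (purely illustrative; the cap and the parameter k are my own stand-ins, not anyone’s considered view):

    V(n) = V_{\max}\left(1 - e^{-n/k}\right)

For n much larger than k (say k on the order of 10^12 sufficiently varied lives), V(n) is already within a rounding error of V_max, so adding further near-identical lives contributes almost nothing.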
Bostrom discusses things like this in Deep Utopia, under the label of ‘interestingness’ (where even if we edit post-humans to never be subjectively bored, maybe they run out of ‘objectively interesting’ things to do, and this leads to value not being nearly as high as it could otherwise be). I don’t think he takes a stance on whether or how much interestingness actually matters, but I am only about halfway through the book so far.