In my expected/average future, almost all lives/experiences/utility/etc are in the long-term future.
Okay, so you’re thinking about what an outside observer would expect to happen. (Another approach is to focus on a single action A, and think about how A affects the long-run future in expectation.)
But regardless, the first quote is just about value, not about what we ought to do.
Coming back to this, in my experience the quote is used to express what we should do; it's saying we should focus on affecting the far future, because that's where the value is. It isn't merely pointing out where the value is, with no claim about whether that's actionable.
To give a contrived example: suppose there's a civilization in a faraway galaxy whose potential value is immeasurably larger than our entire potential future, and we could give them ~infinite utility by sending them a single photon. But they're receding from us faster than the speed of light, so there's nothing we can do about it. Here, all of the expected value lies with that civilization, but that has no bearing on how the EA community should allocate its budget.
Taken as an abstract proposition about “what matters most,” the claim isn't necessarily constrained by what we know.
I just don’t think MacAskill/Greaves/others intended this to be interpreted as a perfect-information scenario with no practical relevance.