Why would you discount it like that? I mean, it does make sense to discount the value of a change when it happens later in a person's life, because they do not get to enjoy it for as long. But future people would not enjoy something any less just because they exist in the future, would they? Unless, of course, you are assuming that the future itself is limited.
For example, if a change would apply only to a fraction of people just before civilization ends, then it could make sense to assign it a lower moral value, because fewer people would enjoy it.
But preventing the extinction of humanity is a popular cause in EA, so bringing this up there may not seem to make much sense. Of course it still should. For example, if the future is worsening, say people are becoming increasingly villainous, then they could be discounted more. Or if they become less conscious; I think Jason Schukraft wrote about something related. If you are interested, see the Intensity of Valenced Experience across Species post on the EA Forum.
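If it helps to see the arithmetic, here is a minimal sketch in Python. Everything in it is made up for illustration, the numbers, the generations, the weights, none of it comes from any particular source. The point is just that a lower value for a future change can fall out of how many people enjoy it, or how much their experiences are taken to count, without any pure rate of time preference.

```python
# Toy illustration only: all numbers and scenarios below are made up.
# Total value of a change = sum over generations of
#   (people affected) x (welfare gain per person) x (moral weight of that generation).
# The moral weight can shrink if, say, future people are assumed to be less
# conscious, or the change only reaches people shortly before civilization ends.

def total_value(people_affected, welfare_gain, moral_weight):
    """Sum the value of a change across generations."""
    return sum(n * g * w for n, g, w in zip(people_affected, welfare_gain, moral_weight))

# Scenario A (hypothetical): the change benefits every generation equally,
# and future people count just as much as present ones (no discount).
people = [1e9, 1e9, 1e9, 1e9]        # people affected in each generation
gain = [1.0, 1.0, 1.0, 1.0]          # welfare gain per affected person
equal_weight = [1.0, 1.0, 1.0, 1.0]  # equal moral weight for every generation
print(total_value(people, gain, equal_weight))       # 4e9: full value

# Scenario B (hypothetical): the change only reaches a small fraction of the
# last generation before an assumed end of civilization.
people_late = [0, 0, 0, 1e8]
print(total_value(people_late, gain, equal_weight))  # 1e8: smaller, but only because
                                                     # fewer people enjoy the change,
                                                     # not because they come later

# Scenario C (hypothetical): every generation is reached, but later generations
# are assumed, for the sake of argument, to have less intense experiences,
# so their welfare gains get a lower moral weight.
fading_weight = [1.0, 0.8, 0.6, 0.4]
print(total_value(people, gain, fading_weight))      # roughly 2.8e9: a "discount"
                                                     # driven by what the future is
                                                     # like, not by its date
```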
But anyway, why are you even thinking about this? Are you interested in moral value or the future, or maybe population ethics?
Either way, very cool. But there is a lot of material that probably no one has reviewed in its entirety, so I am not sure who I could even refer you to for this.
Anyway, yeah, a great question though!
Thanks for your submission!