Then they will go have experiences, and regardless of what they experience, if they then choose to "pin" the EV calculation to their own experience, the EV of switching to benefitting non-humans will be positive. So they'll pay 2 pennies to switch back again, and so, entirely predictably, they have lost a penny. This is irrational.
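To make the money-pump structure explicit, here's a minimal sketch in Python. The starting position and the two switch fees are placeholders (the actual payments come from the deal described above); the point is only that a predictable round trip at any net cost leaves the agent holding the position they started with, with less money.

```python
# Minimal sketch of the money pump above; the fees and the starting
# position are hypothetical placeholders, not the exact figures from the deal.
def round_trip(fee_to_switch, fee_to_switch_back):
    wealth = 0.0
    position = "benefit non-humans"   # assumed starting position

    # Before having any experience, the un-pinned EV calculation is taken
    # to favour switching, so the agent pays to switch.
    wealth -= fee_to_switch
    position = "benefit humans"

    # After the experience, pinning the EV calculation to their own
    # experience makes switching back look positive-EV, so they pay again.
    wealth -= fee_to_switch_back
    position = "benefit non-humans"

    # Same position as at the start, strictly less money: a sure loss that
    # was predictable before any experience happened.
    return wealth, position

print(round_trip(0.01, 0.02))  # (-0.03, 'benefit non-humans') with these placeholder fees
```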
You're assuming they will definitely have a human experience (e.g. because they are human) and so switch to benefitting non-humans. If you're assuming that, but not allowing them to assume that themselves, then they're being exploited through asymmetric information or through priors that don't match the situation at hand, not necessarily through irrationality.
If they assume they're human, then they can pin to what they'd expect to experience and believe as a human (even if they haven't experienced it yet themselves), and then they'd just prioritize non-humans from the start and never switch.
But you can instead assume it's actually 50-50 whether you end up as a human or an alien, and you have these two options:
1. At an extra cost of 1 penny, get the human experience or the alien experience, 50% probability each, pin to it, and help the other beings.
2. At no extra cost, flip a coin, with heads for helping humans and tails for helping aliens, and then commit to following through on that, regardless of whether you end up having human experience or alien experience.
I think there's a question of which is actually better. Does 2 stochastically dominate 1? In 1, you find something out and then help the beings you will come to believe it's best to help (although this doesn't seem like a proper Bayesian update from a prior). In 2, if you end up pinning to your own experience, you'll regret prioritizing humans if your experience is human, and you'll regret prioritizing aliens if your experience is alien.
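To make the comparison concrete, here's a minimal sketch with made-up bookkeeping (the 1-penny cost is from the setup above; treating "pinned regret" as a simple yes/no flag, and the function names, are my own simplifications): option 1 always ends with the agent helping whoever their pinned view favours, at an extra penny, while option 2 is free but, if the agent later pins to their own experience, produces regret whenever the coin assigned them to help their own kind, which happens half the time.

```python
import random

PENNY = 0.01

def option_1():
    """Pay 1 extra penny, have a random experience, pin to it, help the others."""
    experience = random.choice(["human", "alien"])
    helped = "aliens" if experience == "human" else "humans"
    pinned_regret = False  # the pinned view always endorses the choice made
    return -PENNY, helped, pinned_regret

def option_2():
    """At no extra cost, commit to a coin flip before having any experience."""
    helped = random.choice(["humans", "aliens"])     # heads: humans, tails: aliens
    experience = random.choice(["human", "alien"])   # independent of the coin
    # If the agent later pins to their own experience, they regret having
    # been assigned to prioritize their own kind.
    pinned_regret = (helped == "humans" and experience == "human") or \
                    (helped == "aliens" and experience == "alien")
    return 0.0, helped, pinned_regret

runs = 100_000
print(sum(option_2()[2] for _ in range(runs)) / runs)  # roughly 0.5: pinned regret half the time
print(sum(option_1()[2] for _ in range(runs)) / runs)  # 0.0: no pinned regret, but always pays the penny
```

Whether avoiding that 50% chance of pinned regret is worth a guaranteed extra penny is exactly the comparison in question.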
See also this comment and this thread.