Then they will go have experiences, and regardless of what they experience, if they then choose to “pin” the EV calculation to their own experience, the EV of switching to benefitting non-humans will be positive. So they’ll pay 2 pennies to switch back again, which means they have predictably, with certainty, lost a penny. This is irrational.
You’re assuming they will definitely have a human experience (e.g. because they are human) and so switch to benefitting non-humans. If you’re assuming that, but not allowing them to assume it themselves, then they’re being exploited through asymmetric information or through priors that don’t match the situation at hand, not necessarily through irrationality.
If they assume they’re human, then they can pin to what they’d expect to experience and believe as a human (even if they haven’t experienced it yet themselves), and then they’d just prioritize non-humans from the start and never switch.
But you can instead assume it’s actually 50-50 whether you end up as a human or an alien, and you have these two options:
1. At an extra cost of 1 penny: get the human experience or the alien experience, 50% probability each, pin to it, and help the other beings.
2. At no extra cost: flip a coin, with heads for helping humans and tails for helping aliens, and then commit to following through on that, regardless of whether you end up having the human experience or the alien experience.
I think there’s a question of which is actually better: does option 2 stochastically dominate option 1? In option 1, you find something out and then help the beings you will come to believe it’s best to help (although this doesn’t seem like a proper Bayesian update from a prior). In option 2, if you end up pinning to your own experience, you’ll regret prioritizing humans if your experience is human, and you’ll regret prioritizing aliens if it’s alien.
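To make the comparison concrete, here’s a minimal Monte Carlo sketch (Python) that scores each option by the agent’s post-pinning values and assumes the payoffs are symmetric between humans and aliens. The numbers VALUE_OTHER, VALUE_OWN and COST, and the helper functions, are hypothetical illustrations I’m introducing, not anything from the thread.

```python
import random

random.seed(0)

# Hypothetical, symmetric payoffs, scored by the agent's post-pinning values;
# none of these numbers come from the original discussion.
VALUE_OTHER = 10.0   # value of helping the kind you come to prioritize after pinning
VALUE_OWN = 4.0      # value of helping your own kind (the one you'd regret prioritizing)
COST = 0.01          # the extra penny paid under option 1

def option_1() -> float:
    # Pay the penny, have one of the two experiences, pin to it, and help the
    # other kind: by your post-pinning lights you always get the better
    # outcome, minus the penny.
    return VALUE_OTHER - COST

def option_2() -> float:
    # Flip a coin up front and commit. If you later pin to your own
    # experience, the commitment only matches your new priorities when the
    # coin happened to land on the other kind.
    coin_picked_other_kind = random.random() < 0.5
    return VALUE_OTHER if coin_picked_other_kind else VALUE_OWN

N = 100_000
samples_1 = sorted(option_1() for _ in range(N))
samples_2 = sorted(option_2() for _ in range(N))

# Empirical first-order stochastic dominance check: option 2 dominates
# option 1 iff every quantile of 2 is at least the matching quantile of 1.
dominates = all(s2 >= s1 for s1, s2 in zip(samples_1, samples_2))
print("Option 2 stochastically dominates option 1:", dominates)
# With these numbers it doesn't: option 2 beats option 1 by a penny half the
# time and falls well short the other half.
```

Under these assumptions, neither option first-order stochastically dominates the other whenever the gap between the two payoffs exceeds the penny; whether something like that holds for the values actually at stake is the open question above.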
See also this comment and this thread.