The desert hitchhiker: My intuition here is that if you are completely rational, you realize that if you don’t believe you will pay later, you won’t get a ride now. In that sense the question feels similar to going to the store: the clerk says you have to pay for that, you say no I don’t, they say yes you do, you say no, really, you can’t make me, and they say yes I can. At that point, you pay if you are rational. The only difference is that in this case you don’t actually have to pay; you just have to convince yourself that you are going to pay.
The same can be said for the firefighting example if you know they have a lie detector. Once you know you can’t lie, this simplifies down to a non-temporal problem, IMO, except that you don’t actually have to change your brain state to make yourself help; you just have to convince yourself that that is the brain state you have.
For Kate the writer, it feels like she isn’t actually being selfish, but rather just not thinking long term. Would she really quit writing, or just not write as much?
Schelling’s answer to armed robbery: Is bluffing irrational? Only when the costs outweigh the gains. If bluffing is rational but you are too scared to bluff, simply change your brain to be less scared :).
The alien virus
I’m confused: so the virus makes us do good things, but we don’t enjoy doing them? Are we being mind-controlled? What does it feel like to have this alien virus?
It seems like the claim is that more selfish = greater potential valence.
Humans are fairly unique in that we are both able to have utility and able to profoundly influence others’ utility. Hence there is an equilibrium past which, as consequentialists, we would need to shift our worldview toward being selfish (though we are not close to that equilibrium, IMO, since future humans plus animals probably have much more potential utility than we do).
If there is one human and one dog in the world, where the dog doesn’t get the virus (and let’s say they live forever), and we say the dog has up to 2 potential units of utility while the human has 0 when unselfish and 2 when selfish, then the virus should regulate my behavior, switching between selfish and unselfish, to maximize the sum of utility. I guess you might run into problems of getting stuck in local optima here, though.
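Just to make that concrete, here is a quick Python sketch of the one-human, one-dog case. The only numbers fixed by the setup are the endpoints (dog up to 2 units; human 0 when unselfish, 2 when selfish); the linear scaling and the saturating response curve for the dog are assumptions I’m making purely for illustration:

```python
# Toy version of the one-human, one-dog example above.
# Fixed by the setup: dog has up to 2 potential units; human gets 0 when
# unselfish and 2 when selfish. Assumed for illustration: utilities scale
# linearly with the fraction of time spent selfish, and the dog's needs
# saturate once the human is unselfish at least half the time.

def human_utility(frac_selfish):
    return 2.0 * frac_selfish  # 0 if always unselfish, 2 if always selfish

def dog_utility(frac_selfish):
    return min(2.0, 4.0 * (1.0 - frac_selfish))  # caps at its 2 potential units

# The "virus" picks the time split that maximizes the sum of the two utilities.
candidates = [i / 100 for i in range(101)]
best_frac = max(candidates, key=lambda f: human_utility(f) + dog_utility(f))
print(best_frac, human_utility(best_frac) + dog_utility(best_frac))
# -> 0.5 3.0  (either pure policy only totals 2.0)
```

Under those (made-up) response curves, the 50/50 split totals 3 units and beats either pure policy, which is basically the switching intuition above.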
Also, I enjoyed the post a lot; thought experiments are always fun.
Right, I had similar thoughts.