This argument makes intuitive sense, and I am no exception to that intuition. However, intuition is not the path to truth; logic is. Unless you can provide a logically grounded reason why an almost-certain loss with a minuscule chance of a huge win is worse than an unlikely loss with a probable win, I can't accept the argument.
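To spell out the expected-value logic behind my position, here's a toy comparison (the probabilities and payoffs are made up by me, purely for illustration):

$$\mathbb{E}[A] = 0.999 \times (-1) + 0.001 \times 10^{6} = 999.001$$

$$\mathbb{E}[B] = 0.9 \times 100 + 0.1 \times (-1) = 89.9$$

Here $A$ is the "almost-certain loss with a minuscule chance of a huge win" and $B$ is the "probable win with an unlikely loss". On bare expected value, $A$ comes out ahead, which is exactly why I'd need a logical rather than merely intuitive reason to rank it lower.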
[Sorry I’ve had to edit the wording of parts of this comment 2 years later because I just can’t have super cringey writing from 16yo me sitting on the most upvoted post on my permanent profile]
Wow, this is exactly the reply I was looking for, and more. Thank you so much!
Since I'm pretty new to philosophy, I believe what you say even though I don't understand it. However, you have given me a ton of invaluable starting points from which I can now begin learning how to answer these kinds of questions myself.
I imagine that at some point in my life I'll use these ideas to reflect seriously on my life goals, since it sounds like utilitarianism, in the form I have always followed, is flawed and will need to be revised or even scrapped entirely.
Once again, thanks so much!
Could you please elaborate on the relevance to Pascal's Wager? I don't see who is "out to get you" in Pascal's Wager.
I see what you're saying, but you'd need to give me a reason to accept your axiom.
Since I'm a moral realist, you'd have to convince me that it is likely to be true, rather than merely convenient.
Well, the only existing evidence about the nature of God, given that one exists, is the set of beliefs billions of people have held over thousands of years. This evidence suggests, however weakly, that God is as they think it is. In the absence of any other evidence, it is therefore more likely that God is as they believe than anything else.
(Especially so when you think about how many people have believed these things and over how much time; surely it’s reasonable to consider the possibility that they are right. [I think I might be talking about “epistemic humility” but I’m not familiar with the terminology])
While I still disagree that the decision is non-binary, you do bring up a possibility I hadn't thought of, which is that NO ACTION could be the BEST ACTION if you think practising the wrong religion makes you more likely to go to hell and less likely to go to heaven.
Although, now that I think about it, that wouldn't imply no action; rather, it implies that you should encourage atheism, promote behaviour generally agreed upon across religions, and possibly convert people from one religion to a more likely one.
The way I see it, the wager IS binary, but the choice is “act as though heaven/hell exists: yes or no”. If you answer “yes”, then of course there are multiple ways to proceed from that point, but that doesn’t mean the wager itself isn’t binary.
If I decide to accept the wager, the next step will be a WHOLE other thing and definitely not binary.
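To make the binary framing concrete, here's the decision matrix I have in mind (the payoff entries are placeholders of my own, just to show the shape of the choice):

$$\begin{array}{l|cc} & \text{heaven/hell real} & \text{not real} \\ \hline \text{act as though real} & +\infty & -c \\ \text{don't} & -\infty & 0 \end{array}$$

For any nonzero credence that heaven/hell is real, the infinite stakes swamp the finite cost $c$ of acting, so the yes/no row choice is the binary part. Which religion to follow, and how, only enters after a "yes", as a separate and definitely non-binary question.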
I mean accepting that the way to do the most good (in terms of expected value) is to prevent as many people as possible from going to hell and cause as many as possible to go to heaven.
As for what this would entail, I have no idea because I’m pretty uninformed when it comes to religion.
Obviously pieces of the Bible can be used to justify any viewpoint, but I think it’s at least worth mentioning this one verse that points directly against the Christian God being evidentialist:
John 20:29
Jesus said to him, "Because you have seen me, you have believed. Blessed are those who have not seen, and have believed."

I see this as saying that doubting your faith by needing evidence is less noble than having full trust in your faith by not requiring evidence. In other words, true faith doesn't need evidence.
I came across this verse when someone pointed it out, displayed at the front of a church, and regardless of its relevance to this conversation, I think it's fascinating, especially since it was considered important enough for this church to place in large writing at its entrance.
Huh, that’s an interesting position that I wish I could agree with, but I just can’t see why beliefs billions of people have had for thousands of years would be less likely to be true than a God who does in fact exist but is totally different from what everyone thought and instead rewards… reason?
Do you think you could elaborate on why this Evidentialist God seems more likely to you?
Ahh yes, your last paragraph is a good point that I hadn't considered. It doesn't convince me that I should reject the wager, but it does mean that I shouldn't take extreme actions that go against most people's moral beliefs in pursuit of these types of wagers.
:( this is not the answer I was hoping for… (I don’t believe in heaven or hell so the prospect of accepting the wager is a bit depressing)
Thanks a lot though for the response and the really helpful link!
It would be great to see stats on how many people identify with longtermism, shorttermism, or neither.
This seems to be a major divide in cause prioritisation, and questions are often raised about how much of the community is longtermist, so this information would be very valuable.
Huh, it's concerning that you say you see standard utilitarianism as wrong, because I have no idea what to believe if not utilitarianism.
Do you know where I can find out more about the "undefined" issue? It is pretty much the most important thing for me to understand, since my conclusion will fundamentally determine my behaviour for the rest of my life, yet I can't find any information on it except your posts.
Thanks so much for your response and posts. They've been hugely helpful to me.
I am not very experienced in philosophy, but I have a question.
You present a problem that needs solving: the funnel-shaped action profiles lead to undefined expected utility. You say that this conclusion means we must adjust our reasoning so that we avoid it.
But why do you assume that this cannot simply be the correct conclusion from utilitarianism? Can we not say that we have taken the principal axioms of utilitarianism and, through correct logical steps, deduced from those axiomatic truths that expected utility is undefined for all our decisions?
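For what it's worth, here is a standard toy construction (my own example, not necessarily the funnel-shaped profiles from your post) showing that an undefined expected utility is a mathematically real possibility. Suppose outcome $n = 1, 2, 3, \ldots$ has probability $p_n = 2^{-n}$ and utility $u_n = (-1)^{n+1} \, 2^{n}/n$. Then

$$\mathbb{E}[U] = \sum_{n=1}^{\infty} p_n u_n = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \tfrac{1}{2} + \tfrac{1}{3} - \cdots$$

This series converges only conditionally, so by the Riemann rearrangement theorem its value depends on the order in which the outcomes are summed; since there is no privileged ordering of outcomes, the expected utility is undefined. (I believe this is the "Pasadena game" of Nover and Hájek.)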
To me, the next step after reaching this point would not be to change my reasoning (which requires assuming that the logical steps applied were incorrect, no?) but rather to reject the axioms of utilitarianism, since they have rendered themselves ethically useless.
I have a fundamental line of ethical reasoning that I would guess is pretty common here. It is this: given what we know about our deterministic (and maybe probabilistic) universe, there is nothing to suggest the existence of such things as good/bad or right/wrong choices, so we come to the conclusion that nothing matters. However, this is obviously useless, and if nothing matters anyway, then we might as well live by a kind of "next best" ethical philosophy that does provide us with right/wrong choices, just in case of the minuscule chance that it is indeed correct.
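Put slightly more formally (my own rough formalisation of the "just in case" step): let $\varepsilon > 0$ be the credence that some ethical theory $T$ is correct, and suppose that if nothing matters every action is worth $0$, while if $T$ is correct then following it is worth $v > 0$. Then

$$\mathbb{E}[\text{follow } T] = \varepsilon v + (1-\varepsilon)\cdot 0 = \varepsilon v > 0 = \mathbb{E}[\text{ignore } T],$$

so following the "next best" theory weakly dominates ignoring it, no matter how minuscule $\varepsilon$ is.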
However, you seem to have suggested that utilitarianism just takes you back to the "nothing matters" situation, which would mean we have to move on to the "next next best" ethical philosophy.
Hmm I just realised your post has fundamentally changed every ethical decision of my life...
I would greatly appreciate it if anyone answered my question, not only the OP. Thanks!
That's true, but I think we need to make the fewest intuition-based assumptions possible. Yitz's suggestion adds an extra assumption ON TOP of expected value theory, so I would need a reason to add that assumption.

Oops, I got mixed up and that response related to a totally different comment. See my reply below for my actual response.