It seems that the best approach to this sort of uncertainty is the probabilistic thinking outlined by Max Harms here.
Rather than looking for certainty of evidence, we should look for sufficiency of evidence to act. Thus, before acting we should not ask “will this do the most good?” but rather “do I have sufficient evidence that this action will likely lead to the most good?” Otherwise, we risk falling into “analysis paralysis” and information bias, the thinking error of demanding too much information before acting.
Why is it better to look for sufficient evidence rather than maximizing expected value (keeping in mind that we can’t take expected value estimates literally)? Or are you just saying the same thing in a different way?
Because asking about sufficient evidence enables us to avoid information bias and analysis paralysis. There are high opportunity costs to not acting, and that is a very dangerous trap to fall into. The longer we deliberate, the more time slips by while we are gathering evidence, and we drift into status quo bias.
I don’t see how information bias would go away if we were only worried about sufficient evidence, and analysis paralysis doesn’t seem to be a problem in our current community. People like Michael and me might be really unsure about these things, but it doesn’t really inhibit our lives (afaik). I at least don’t spend too much time thinking about them, and what time I do spend seems to lead towards robustly better coherence and understanding of the issues.
We might be miscommunicating about information bias. Here is a specific description of information bias: “information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.”
In other words, if we already have sufficient evidence to make a decision, then we shouldn’t worry about acquiring additional evidence, since by definition that extra evidence is irrelevant to the decision. This was in response to Michael’s earlier point that nothing is certain, and the concern about acting under that uncertainty.
Now, this doesn’t mean we can’t think about these issues and try to gain robustly better coherence and understanding of them, as you say. It only speaks to the difference between thinking and acting. If we spend too much time thinking and gathering information, we don’t spend that time acting to advance human flourishing. Thinking is resource-intensive, and we need to treat it as an opportunity cost. It might be a very worthwhile activity, but it’s a trade-off against other worthwhile activities. That’s my whole point.
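To make the “irrelevant evidence” point concrete: in decision theory this is the expected value of information, which is zero whenever the extra evidence could not change which action you would pick. Here is a minimal sketch in Python; the states, actions, prior, and payoffs are all made-up numbers for illustration, not anything from the discussion above.

```python
# A minimal sketch of expected value of perfect information (EVPI).
# All states, actions, and payoffs here are hypothetical illustrations.

def ev(payoffs, prior):
    """Expected value of one action, averaged over states."""
    return sum(prior[s] * payoffs[s] for s in prior)

def evpi(actions, prior):
    """Expected value of perfect information:
    (expected payoff if we could learn the state before choosing)
    minus (payoff of the best action chosen on the prior alone)."""
    best_now = max(ev(a, prior) for a in actions.values())
    best_with_info = sum(
        prior[s] * max(a[s] for a in actions.values()) for s in prior
    )
    return best_with_info - best_now

prior = {"works": 0.7, "fails": 0.3}

# Case 1: new evidence could flip the decision, so it has positive value.
actions = {"fund": {"works": 100, "fails": -20},
           "hold": {"works": 0, "fails": 0}}
print(evpi(actions, prior))  # 6.0 -> gather evidence only if it costs < 6

# Case 2: "fund" is better in every state, so no possible evidence can
# change the choice; the extra information is decision-irrelevant.
actions = {"fund": {"works": 100, "fails": 10},
           "hold": {"works": 0, "fails": 0}}
print(evpi(actions, prior))  # 0.0
```

On this framing, “sufficient evidence” is just the point where the expected value of further information drops below the cost of gathering it, which is the opportunity-cost trade-off described above.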