How sure are you that you are right and the other EA (who has likely also thought carefully about their donations) is wrong, though?
I’m much more confident that I will increase the impact of someone’s donation or spending if they are not in EA, rather than being too convinced of my own opinion and causing harm (through negative side effects, opportunity costs, or lowering the value of their donation).
Personally speaking, if I say I think something is 10x as effective, I mean that as an all-things-considered statement, which includes deferring to the views of others as much as I think is appropriate.
That’s not what I asked: as a percentage, how likely do you think it is that you are right (and that people who value e.g. GHWB over Animal Welfare are wrong)?
It’s the same thing: if I think the expected value of one thing vs. another is 10x, all things considered, then that is what I think the expected value is, already factoring in whatever chance I think there is that I am wrong in various ways (which is left very underspecified here).
For example, let’s say I do a back-of-the-envelope calculation that says ABC is 20x as valuable as XYZ, but I see lots of people disagree with me. Then my estimate of the relative value of ABC vs. XYZ will not be 20x but probably some lower number, which could be 15x, 2x, 0.5x, 0.001x, or even −2x (if it seems ABC is somehow harmful), depending on how uncertain I am and the strength of the evidence provided. That adjustment already attempts to take into account the possibility that my thought process is bad, that my argument is wrong, etc. (One toy way to make such an adjustment concrete is sketched below.)
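To illustrate one possible version of that adjustment (this is my own toy sketch, not anything proposed in the thread): treat the multiplier as log-normal and combine the inside-view estimate with the crowd's implied view by inverse-variance weighting in log space. All numbers and the weighting scheme are illustrative assumptions.

```python
import math

def adjusted_multiplier(inside_x, inside_sigma, outside_x, outside_sigma):
    """Combine an inside-view multiplier (e.g. "ABC is 20x XYZ") with an
    outside view (e.g. what widespread disagreement implies) by
    inverse-variance weighting in log space.

    sigma encodes how uncertain each view is: a larger sigma means that
    view gets less weight in the all-things-considered estimate.
    """
    mu_in, mu_out = math.log(inside_x), math.log(outside_x)
    w_in, w_out = 1 / inside_sigma**2, 1 / outside_sigma**2
    mu_post = (w_in * mu_in + w_out * mu_out) / (w_in + w_out)
    return math.exp(mu_post)

# My BOTEC says 20x, but I'm quite uncertain about it (sigma = 2.0);
# the disagreeing crowd implies roughly 1x and seems better calibrated
# (sigma = 1.0). The all-things-considered multiplier lands far below 20x.
print(adjusted_multiplier(20, 2.0, 1, 1.0))  # ~1.8x
```

Note one limitation of this toy model: a log-normal multiplier can never go negative, so it cannot represent the "−2x if ABC turns out to be harmful" case from the example above; capturing that would need a model that puts probability mass on harm.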
Keep in mind that you’re not coercing them to switch their donations, just persuading them. That means you can use the fact that they were persuaded as evidence that you were on the right side of the argument. Your being too convinced of your own opinion isn’t a problem unless other people are also somehow too convinced of it, and I don’t see why they would be.