This seems likely to be incorrect to me, at least sometimes. In particular I disagree with the suggestion that the improvement on the margin is likely to be only on the order of 5%.
Let’s take someone who moves from donating to global health causes to donating to help animals. It’s very plausible that they think the difference in effectiveness there is a factor of 10, or even more.
They may also think that non-EA dollars are more easily persuaded to donate to global health initiatives than animal welfare ones. In this case, if a non-EA dollar is 80% likely to go to global health, and 20% to animal welfare, then by their own lights the change in use of their dollar was more than 3x as important as the introduction of the extra non-EA dollar.
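To make the arithmetic explicit (my own sketch of the calculation implied above, normalizing a global health dollar to value 1 so an animal welfare dollar is worth 10):

$$\frac{\text{value of switching one EA dollar}}{\text{value of one new non-EA dollar}} = \frac{10 - 1}{0.8 \cdot 1 + 0.2 \cdot 10} = \frac{9}{2.8} \approx 3.2$$

So under those assumptions, redirecting an existing EA dollar is a bit more than 3x as valuable as attracting a fresh non-EA dollar.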
Similarly, if you think animal charities are 10x global health charities in effectiveness, then you think these options are equally good (see the arithmetic sketch after this list):

- Move 10 EA donors from global health to animal welfare
- Add 9 new animal welfare donors who previously weren’t donating at all
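Spelling out that equivalence with the same normalization as before (global health dollar = 1, animal welfare dollar = 10; my arithmetic, assuming each donor gives the same amount and the new donors wouldn’t otherwise donate):

$$10 \times (10 - 1) = 90 = 9 \times 10$$

Ten switched donors and nine brand-new animal welfare donors add the same expected value.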
To me, the first of these sounds way easier.
IIRC studies show it’s easier to motivate people to give more than to shift existing donations.
I think that EA donors are likely to be unusual in this respect: you’re pre-selecting for people who have signed up for a culture of doing what’s best, even when that differs from what they previously thought was best.
I guess also I think that my arguments for animal welfare charities are at their heart EA-style arguments, so I’m getting a big boost to my likelihood of persuading someone by knowing that they’re the kind of person who appreciates EA-style arguments.
How sure are you that you are right and that the other EA (who has also likely thought carefully about their donations) is wrong, though?
I’m much more confident that I will increase the impact of someone’s donation or spending if they are not in EA, rather than being too convinced of my own opinion and causing harm (through negative side effects, opportunity costs, or lowering the value of their donation).
Personally speaking, if I say I think something is 10x as effective, I mean that as an all-things-considered statement, which includes deferring to the views of others however much I think is appropriate.
That’s not what I asked: in percentage terms, how likely do you think it is that you are right (and that people who value e.g. GHWB over Animal Welfare are wrong)?
It’s the same thing: if I think the expected value of one thing vs another is 10x, all things considered, then that is what I think the expected value is, already factoring in whatever chance I think there is that I am various versions of wrong (which is very underspecified here).
For example, let’s say I do a back-of-the-envelope calculation that says ABC is 20x as valuable as XYZ, but I see lots of people disagree with me. Then my estimate of the relative value of ABC vs XYZ will not be 20x, but probably some lower number, which could be 15x or 2x or 0.5x or 0.001x or even −2x (if it seems ABC is somehow harmful), depending on how uncertain I am and the strength of the evidence provided. That adjustment already attempts to take into account the possibility that my thought process is bad, my argument is wrong, etc.
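One toy way to picture that adjustment (my own illustration using geometric pooling, not a method specified above) is to combine an inside-view ratio with an outside-view ratio, weighted by how much you defer:

$$\hat{r} = r_{\text{inside}}^{\,w} \cdot r_{\text{outside}}^{\,1-w}$$

With $r_{\text{inside}} = 20$, $r_{\text{outside}} = 1$ (i.e. others see the two options as comparable), and equal weight $w = 0.5$, the pooled estimate is $\sqrt{20} \approx 4.5$x, well below the naive 20x.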
Keep in mind that you’re not coercing them to switch their donations, just persuading them. That means you can use the fact that they were persuaded as evidence that you were on the right side of the argument. You being too convinced of your own opinion isn’t a problem unless other people are also somehow too convinced of it, and I don’t see why they would be.