Unsurprisingly there is a strong relationship between overall karma and agreement
I’m not sure a strong correlation here is actually so unsurprising, and, even if it is, I can think of two rather different explanations:
Despite the split regular/agreement karma system’s raison d’être being to separate out evaluations of usefulness from evaluations of agreement, people are implicitly biased and, on average, rate comments they agree with as better contributions than they really are.
There’s a positive correlation between what people tend to agree with, and what’s actually true. There’s also a positive correlation between what’s a useful contribution, and what’s true. Therefore, there’s a positive correlation between agreement karma (comments people tend to agree with) and regular karma (comments that make useful contributions).
One could view these two explanations as forming an axis, with (1) being at the “(EA Forum voting is) epistemically unvirtuous” end and (2) being at the “epistemically virtuous” end. I personally suspect voting is closer to virtuous than unvirtuous, but I wouldn’t take this as given, and I also suspect that, in theory at least, there’s room for improvement. (In practice, though, steering voting further in the epistemically virtuous direction may be difficult. I tried thinking for five minutes but couldn’t come up with any potentially promising interventions.)
There’s a 3rd reason, which I expect is the biggest contributor: the number of readers of the post/comment.
I think this 3rd explanation can be decomposed into the first two (and possibly others that I haven’t laid out)?
(For what it’s worth, I agree with a “You don’t have to respond to every comment” norm. So, don’t feel obligated to reply, especially if you think—as may well be the case—that I’m talking past you and/or that I’m missing something and the resulting inferential distance is large.)