I’d assume that forum members don’t notice that the reasoning is bad.
As evidence in favor of this view: at least sometimes, after I post such a comment, the post’s karma starts to go down, suggesting that the comment informed voters about bad reasoning they hadn’t previously noticed. (Possibly this happened in most of the examples above; I wasn’t carefully tracking this and don’t know of any way to check now.)
I’d assume that forum members don’t notice that the reasoning is bad.
Probably yeah, at least in part. Sometimes they may notice it a bit but put insufficient weight on it relative to the fact that they agree with the conclusion. But some may also miss it altogether.
My comment was in response to the claim that “to some extent it’s OK for bad posts to get upvoted”.
Ah, I interpreted that claim as “it’s not a huge priority to prevent bad posts from being upvoted, regardless of how that happens”, rather than “it’s fine for forum members to upvote posts whose conclusions they agree with even if they see that the reasons are bad”.
Hot take: strong upvoting things that lack great reasoning and whose conclusions I disagree with could be good for improving epistemics. At least, I think it gives us an opportunity to demonstrate common thinking processes in EA, and what reasoning transparency looks like, to people newer to the community.[1]
My best guess is that it also makes it more likely that high-quality thinking that diverges from established ideas happens in EA community spaces like the EA Forum.
[1] My reasoning is in a footnote in my comment here.
I’m aware that people on this thread might think my thinking processes and reasoning abilities aren’t stellar,* but I still think my point stands.
*My personal view is that this impression would stem less from me being bad at thinking clearly and more from our views being quite different.
A large inferential distance means it’s harder to diagnose epistemics accurately (but I’m not exactly an unbiased observer when it comes to judging my own ability to think clearly).
This leads me to another hot take: footnotes within footnotes are fun.
Maybe I want silent upvoting and downvoting to be disincentivized (or commenting with reasoning to be more incentivized). Commenting with reasoning is valuable but also hard work.
After 2 seconds of thought, I think I’d be massively in favour of a forum feature where any upvotes or downvotes count for more (e.g. double or triple the karma) once you’ve commented.[1]
Just having this incentive might make more people try to articulate what they think and why they think it. This extra push to stop and think might make some people change their votes even if they don’t end up submitting their comments.
Me commenting on my own comment shouldn’t mean the default upvote on my comment counts for more, though: only the first reply should give the extra voting power. (I’m sure there are other ways to game this that I haven’t thought of yet, but I feel like there’s something salvageable in the idea anyway.)
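Purely to make the proposal concrete, here’s a minimal sketch of how such a weighting rule might look. Everything in it is made up for illustration (the function names, the data shapes, the multiplier); it’s not how the forum’s actual codebase works.

```python
# Hypothetical sketch of the proposed "votes count for more once
# you've commented" rule. All names and numbers are illustrative.

COMMENTER_MULTIPLIER = 2  # e.g. double karma once you've commented


def has_qualifying_comment(voter_id: str, post: dict) -> bool:
    """True if the voter has a comment on the post that should unlock
    the multiplier. To block the self-reply gaming mentioned above,
    a reply to your own comment doesn't count."""
    for comment in post["comments"]:
        if comment["author_id"] != voter_id:
            continue
        parent = comment.get("parent")  # None for top-level comments
        if parent is None or parent["author_id"] != voter_id:
            return True
    return False


def vote_weight(base_weight: int, voter_id: str, post: dict) -> int:
    """Karma contributed by one vote. base_weight is the voter's
    normal vote strength (e.g. higher for a strong vote); it gets
    multiplied once the voter has a qualifying comment on the post."""
    if has_qualifying_comment(voter_id, post):
        return base_weight * COMMENTER_MULTIPLIER
    return base_weight


# A tiny check of the anti-gaming rule:
post = {
    "comments": [
        {"author_id": "alice", "parent": None},                # top-level comment
        {"author_id": "bob", "parent": {"author_id": "bob"}},  # reply to own comment
    ]
}
assert vote_weight(1, "alice", post) == 2  # Alice's vote is doubled
assert vote_weight(1, "bob", post) == 1    # Bob's self-reply doesn't qualify
```

This only sketches the multiplier and the self-reply exclusion; the default self-upvote on your own comments would need its own handling, and no doubt there are other gaming routes, as noted above.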