I disagree that the problem here is groupthink, and I think if you look at highly rated posts, you can’t reasonably conclude that people who criticise the orthodox position will be reliably downvoted. I think the problem here is that some people vote based on tone and some on content, which means that when something is downvoted, different people draw different conclusions about why.
I think the problem here is that some people vote based on tone and some on content,
I hope to encourage more people to instead upvote based on rigor/epistemics/quality on the margin, rather than based on tone or based on agreement (which is some of “content”) or vibe.
EDIT: I also think a surprisingly high number of people upvote low-quality criticisms that have a good tone, which makes me surprised when others assert that the movement is systematically biased against criticisms (“insufficient discernment” would be a fairer criticism, but that’s a mistake, not a bias).
Doubtful if you look at Gideon’s first comment and remember it was downvoted through the floor almost immediately.
Questioning orthodoxy is ok within some bounds (often technical/narrow disagreements), or when expressed in suitable terms, e.g.
(Significant) underconfidence, regardless of expertise and/or lack of expertise among those criticised
Unreasonable assumptions of good faith, even in the face of hostility or malpractice (double standards, perhaps a lesser form of the expectation of a ‘perfect victim’)
Extensive use of EA buzzwords
Huge amounts of extra work/detail that would not be deemed necessary for non-critical writing
Essentially making oneself as small as possible so as not to set off the Bad Tone hair-trigger
This is difficult because knowing what you are talking about, and being lazily dismissed by people you know for a fact know far less than you about a given subject matter, makes one somewhat frustrated.
As several EAs have noted, e.g. weeatquince, this is time-consuming and (emotionally) exhausting, and often results in dismissal anyway.
This is even harder to pull off when questioning sensitive issues like politics, funding ethics, foundational intellectual issues (e.g. the ways in which the TUA uses utterly unsuitable tools for its subject matter due to a lack of outside reading), competence of prominent figures, etc.
I actually think this forms a sort of positive feedback loop, where EAs become increasingly orthodox (and confident in that orthodoxy) due to perceived lack of substantive critiques, which makes making those critiques so frustrating, time-consuming, and low-impact that people just don’t bother. I’ve certainly done it.
Quantitatively, if you look at the top 10 most upvoted posts:
4 are straightforward criticisms (“Free-spending EA might be bad...”, “Bad Omens”, “case against randomista development”, “Critiques of EA”)
4 are partial criticisms (“Long-Termism” vs. “Existential Risk”, “My mistakes on the path to impact”, “EA for dumb people?”, “Are you really in a race?”)
1 (the most upvoted) was a response to criticism (“EA and the current funding situation”)
1 was about the former EAForum head leaving (“Announcing my retirement”)
This is a total of 40-80%, depending on how you count.
(In the next 10 posts, I “only” see 3 posts that are criticisms, but I don’t think that 30% is particularly low either. It does get lower further down however).
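The “40-80%, depending on how you count” figure above can be sketched as follows. The four category counts come from the list in the comment; the strict vs. broad counting rules are my assumed reading of “depending on how you count”:

```python
# Counts of the top 10 most upvoted posts, taken from the list above.
top_10 = {
    "straightforward criticism": 4,  # e.g. "Bad Omens", "Critiques of EA"
    "partial criticism": 4,          # e.g. "My mistakes on the path to impact"
    "response to criticism": 1,      # "EA and the current funding situation"
    "other": 1,                      # "Announcing my retirement"
}

# Strict reading: count only outright criticisms.
strict = top_10["straightforward criticism"]
# Broad reading: also count the partial criticisms.
broad = strict + top_10["partial criticism"]

print(f"{strict / 10:.0%}-{broad / 10:.0%}")  # prints 40%-80%
```

This is just the trivial arithmetic behind the claim: 4/10 on the strict count, 8/10 on the broad one.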
I think this is a non-sequitur in response to A.C.Skraeling’s comment. They said:
Questioning orthodoxy is ok within some bounds (often technical/narrow disagreements), or when expressed in suitable terms
A high percentage of the most upvoted posts of all time being criticism of some sort is perfectly compatible with this.
Here’s a recent case of someone questioning orthodoxy (writing a negative review of WWOTF), not bothering to express it in EA-friendly enough language, and subsequently being downvoted to a trollish level (-12) for it despite their content being much better than that: https://forum.effectivealtruism.org/posts/AyPTZLTwm5hN2Kfcb/book-review-what-we-owe-the-future-erik-hoel
I don’t find this example convincing. I just read the review and found it pretty underwhelming. Take this:
MacAskill’s personal position is to basically throw up his hands, declare that none of the solutions to the problems with utilitarianism look very good, and we should just compromise between various repugnant theories of how to deal with populations, hoping that whatever compromise we take isn’t that bad.
The paragraph is reacting to the following passage in WWOTF:
There is still deep disagreement within philosophy about what the right view of population ethics is. . . Indeed, I don’t think that there’s any view in population ethics that anyone should be extremely confident in.
If you want to reject the Repugnant Conclusion, therefore, then you’ve got to reject one of the premises that this argument was based on. But each of these premises seem incontrovertible. We are left with a paradox. One option is to simply accept the Repugnant Conclusion. . . This is the view that I incline towards. Many other philosophers believe that we should reject one of the other premises instead.
Like all views in population ethics, the critical level view has some very unappealing downsides.
There is still deep disagreement about what the right view of population ethics is. . . Indeed, I don’t think that there’s any view in population ethics that anyone should be extremely confident in.
But MacAskill is here describing problems with rival views in population ethics, not problems with utilitarianism! Because the author (1) conflates the two, (2) mischaracterizes the Repugnant Conclusion (“worlds where all available land is turned into places worse than the worst slums of Bangladesh”), and (3) fails to distinguish the Repugnant Conclusion from standard “repugnant” implications of utilitarianism that have nothing to do with it, he ends up attributing to longtermism a number of “ridiculous” views that do not in fact follow from that position.
Separately, if criticizing WWOTF is considered to be a paradigmatic case of “heterodoxy”, it seems worth mentioning that a recent critical review by Magnus Vinding has been very favorably received (179 karma at the time of writing).
This response completely ignores the main point of my comment.
Separately, if criticizing WWOTF is considered to be a paradigmatic case of “heterodoxy”, it seems worth mentioning that a recent critical review by Magnus Vinding has been very favorably received (179 karma at the time of writing).
Please reread my comment because the whole point was that A.C.Skraeling said that criticism is accepted within some boundaries, or when expressed in suitable terms. You essentially just repeated Linch’s point except that my whole point was that Linch’s point is perfectly compatible with what A.C. Skraeling said.
Regarding Hoel’s review, you seem to have read my point as being that it was particularly good or convincing to EAs, which is incorrect. My point was that it was downvoted to −12, a karma score I associate with trollish posts, despite its content being much better than that, because of the combination of criticizing EA orthodoxy (longtermism, utilitarianism, population ethics, etc.) and not expressing it in a suitable manner. This makes it a decent example of what A.C.Skraeling said. You are free to disagree of course.
Please reread my comment because the whole point was that A.C.Skraeling said that criticism is accepted within some boundaries, or when expressed in suitable terms.
I did misread some parts of your original comment. I thought you were saying that criticizing WWOTF was itself an example of criticism that is beyond the bounds Skraeling was describing. But I now see that you were not saying this. My apologies. (I have crossed out the part of my comment that is affected by this misreading.)
Regarding Hoel’s review, you seem to have read my point as being that it was particularly good or convincing to EAs, which is incorrect.
That is not how I read your point. I interpreted you as saying that the quality of the book review justified higher karma than it received (which is confirmed by your reply). My comment was meant to argue against this point, by highlighting some serious blunders and sloppy reasoning by the author that probably justify the low rating. (-12 karma is appropriate for a post of very low quality, in my opinion, and not just a trollish post.)
Thanks for the retraction.
Regarding the Hoel piece, the fact that you highlighted the section you did, and the way you analyzed it, suggests to me that you didn’t understand what his position was, and didn’t try particularly hard to do so. I don’t think you can truly judge whether his content is very low quality if you don’t understand it. Personally, I think he made some interesting points genuinely engaging with some core ideas of EA, even if I disagree with much of what he said. I completely disagree that his content, separate from its language and tone towards EAs, is anywhere near very low quality, certainly nowhere near −12. If you want to understand his views better, I found his comments replying to his piece on why he’s not an EA illuminating, such as his response to my attempted summary of his position. But we can agree to disagree.
Edit note: I significantly edited the part of this comment talking about Hoel’s piece within a few hours of posting with the aim of greater clarity.
I disagree that the problem here is groupthink, and I think if you look at highly rated posts, you can’t reasonably conclude that people who criticise the orthodox position will be reliably downvoted
The highly rated posts I’ve seen so far, on the topic of X risk in particular, appear to me to typically be a product of groupthink. They’re typically very articulate, very polished in form, and highly informed on details (i.e. good academic style), but they do not escape groupthink.
As evidence, please direct us to the writers here who have been laser-focused on the critical importance of managing the pace of the knowledge explosion. Where are they? If they exist, and I sincerely hope they do (because I don’t have the authority to sell the case), I really would like to meet them.
In my opinion, groupthink is to some immeasurable degree built into the fabric of academia, because academia is a business, and one does not stay in business by alienating one’s clients. Thus, to the degree the academic depends on their salary, they are somewhat imprisoned within the limits set by whoever is signing their paycheck.
Here’s an example.
I can afford to persistently sell a “world without men” idea as an ambitious solution to human violence because nobody owns me; I have nothing of value at stake. Whatever the merits of such a case might be (very clearly debatable), academics can’t afford to make that case, because the group consensus of their community will not tolerate it. And before you protest, know that I’ve already been threatened with banning on this site just for bringing the subject up.
Academia is a business, and is thus governed by fear, and that is the source of groupthink.
If you would, please downvote this post at least 100 times, as I believe I’ve earned it. :-)
I’m looking at your profile: you have almost nothing but downvotes, but I haven’t seen you say anything dumb, just sassy. FWIW, I really like this comment.