I think it’s because you’re making strong claims without presenting any supporting evidence. I don’t know what reading lists you’re referring to; I have doubts that not asking questions is an ‘unspoken condition’ of getting access to funding; and I have no idea what you’re conspiratorially alluding to regarding ‘quasi-censorship’ and ‘emotional blackmail’.
I also feel that the comment doesn’t engage much with the perspective it criticizes (in the sense of trying to see things from that point of view). (I didn’t downvote the OP myself.)
When you criticize a group/movement for giving money to those who seem aligned with their mission, it seems relevant to acknowledge that it wouldn’t make sense to ignore this sort of alignment entirely. There’s an inevitable, tricky tradeoff between movement/aim dilution and too much insularity. It would be fair to claim that EA longtermism sits too far toward one end of that spectrum, but it seems unfair to play up the bad connotations of actions that contribute to insularity, implying that there’s something sinister about having selection criteria at all, without acknowledging that taking at least some such actions is part of the only sensible strategy.
I feel similarly about the remark about “techbros.” If you’re able to work with rich people, wouldn’t it be wasteful not to do it? It would be fair to claim that the rich people in EA use their influence in ways that… what is even the claim here? That their idiosyncrasies end up having an outsized effect? That’s probably going to happen in every situation where a rich person is passionate (and hands-on involved) about a cause – that doesn’t mean the movement around that cause therefore becomes morally problematic. Alternatively, if your claim is that rich people in EA engage in practices that are bad, that could be a fair thing to point out, but I’d want to learn the specifics of the claim and why you think it’s the case.
I’m also not a fan of most EA reading lists, but EA longtermism addresses topics that until recently hadn’t gotten much coverage, so the direct critiques are usually written by people who know very little about longtermism. And “indirect critiques” don’t exist as a crisp category. If you wanted to write a reading list section to balance out the epistemic insularity effects in EA, you’d have to do a lot of pretty difficult work: unearthing what those biases are, and then seeking out the exact alternative points of view that usefully counterbalance them. It’s not as easy as adding a bunch of texts by other political movements – that would be too random. Texts written by proponents of other intellectual movements contain important insights, but they’re usually not directly applicable to EA. Someone first has to do the difficult work of figuring out where exactly EA longtermism benefits from insights from other fields. That isn’t an impossible task, but it isn’t easy either: any field’s intellectual maturation takes time (it’s an iterative process), and reading lists don’t start out perfectly balanced.

To summarize, it seems relevant to mention (again) that there are inherent challenges to writing balanced reading lists for young fields. The downvoted comment skips over that and dishes out a blanket criticism that one could probably level against any reading list of a young field.
If you’re able to work with rich people, wouldn’t it be wasteful not to do it? … [T]heir idiosyncrasies end up having an outsized effect? That’s probably going to happen in every situation where a rich person is passionate (and hands-on involved) about a cause
If that will happen whenever a rich person is passionate about a cause, then opting to work with rich people can cause more harm than good. Opting out certainly doesn’t have to be “wasteful”.
My initial thinking was that “idiosyncrasies” can sometimes be neutral or even incidentally good.
But I think you’re right that this isn’t the norm, and things can quickly get worse when someone has a lot of influence only because they have money, rather than because their peers value them for being unusually thoughtful.
(FWIW, I think the richest individuals within EA often defer to the judgment of EA researchers, as opposed to setting priorities directly themselves?)
FWIW, I think the richest individuals within EA often defer to the judgment of EA researchers, as opposed to setting priorities directly themselves
I’m not saying I know anything to the contrary, but I’d like to point out that we have no way of knowing. This is a major disadvantage of philanthropy: whereas governments are required to be transparent about their fund allocations, individual donors are given privacy and undisclosed control over who receives their donations and what the recipient organisations are allowed to use them for.
My apologies: specific evidence was not presented with respect to...
...the quasi-censorship/emotional blackmail point because I think it’s up to the people involved to provide as much detail as they are personally comfortable with. All I can morally do is signal to those out of the loop that there are serious problems and hope that somebody with the right to name names does so. I can see why this may seem conspiratorial without further context. All I can suggest is that you keep an ear to the ground. I’m anonymous for a reason.
...the funding issue because it either fits the first category of “areas where I don’t have a right to name names” (cf. ”...any critique of central figures in EA would result in an inability to secure funding from EA sources...” above) or involves information that would probably be enough to identify me and thus destroy my career.
...the reading list issue because I thought the point was self-evident. If you would like some examples, see a very brief selection below, but this criticism applies to all relevant reading lists I have seen, and it is an area where I’m afraid we have prior form (see https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/#Systematically_problematic_syllabi_reading_lists_citations_writings_etc). I am not accusing those involved of being “indoctrinators” or of having bad intentions; I am merely observing that they ignore much of academic existential risk work in favour of a restricted range of texts by a few EA “thought leaders” and EA Forum posts, which, to newcomers, presents an idiosyncratic and ideological view of the field as the only view.
https://forum.effectivealtruism.org/posts/u58HNBMBdKPbvpKqH/ea-reading-list-longtermism-and-existential-risks
http://www.global-catastrophic-risks.com/reading.html
https://forum.effectivealtruism.org/posts/wmAQavcKjWc393NXP/example-syllabus-existential-risks
Again, I’m really not sure where these downvotes are coming from. I’m engaging with criticism and presenting what information I can as clearly as possible.
<Comment deleted>
I disagree with much of the original comment, but I’m baffled that you think this is appropriate content for the EA Forum. I strong-downvoted and reported this comment.
While this comment was deleted, the moderators discussed it in its original form (which included multiple serious insults to another user) and decided to issue a two-week ban to Charles, starting today. We don’t tolerate personal insults on the Forum.
Hi Charles. Please consider revising or retracting this comment; unlike your other comments in this thread, it’s unkind and not adding to the conversation.
Per your personal request, I have deleted my comment.
...um