I also feel like the comment doesn’t engage much with the perspective it criticizes (in terms of trying to see things from that point of view). (I didn’t downvote the OP myself.)
When you criticize a group/movement for giving money to those who seem aligned with their mission, it seems relevant to acknowledge that it wouldn’t make sense to not focus on this sort of alignment at all. There’s an inevitable, tricky tradeoff between movement/aim dilution and too much insularity. It would be fair if you wanted to claim that EA longtermism is too far on one end of that spectrum, but it seems unfair to play up the bad connotations of taking actions that contribute to insularity, implying that there’s something sinister about having selection criteria at all, without acknowledging that taking at least some such actions is part of the only sensible strategy.
I feel similarly about the remark about “techbros.” If you’re able to work with rich people, wouldn’t it be wasteful not to do it? It would be fair if you wanted to claim that the rich people in EA use their influence in ways that… what is even the claim here? That their idiosyncrasies end up having an outsized effect? That’s probably going to happen in every situation where a rich person is passionate (and hands-on involved) about a cause – that doesn’t mean that the movement around that cause therefore becomes morally problematic. Alternatively, if your claim is that rich people in EA engage in practices that are bad, that could be a fair thing to point out, but I’d want to learn about the specifics of the claim and why you think it’s the case.
I’m also not a fan of most EA reading lists, but I’d say that EA longtermism addresses topics that until recently haven’t gotten a lot of coverage, so the direct critiques are usually by people who know very little about longtermism. And “indirect critiques” don’t exist as a crisp category. If you wanted to write a reading list section to balance out the epistemic insularity effects in EA, you’d have to do a lot of pretty difficult work of unearthing what those biases are and then seeking out the exact alternative points of view that usefully counterbalance them. It’s not as easy as adding a bunch of texts by other political movements – that would be too random. Texts written by proponents of other intellectual movements contain important insights, but they’re usually not directly applicable to EA. Someone has to do the difficult work first of figuring out where exactly EA longtermism benefits from insights from other fields. This isn’t an impossible task, but it’s not easy, as any field’s intellectual maturation takes time (it’s an iterative process). Reading lists don’t start out as perfectly balanced. To summarize, it seems relevant to mention (again) that there are inherent challenges to writing balanced reading lists for young fields. The downvoted comment skips over that and dishes out a blanket criticism that one could probably level against any reading list of a young field.
If you’re able to work with rich people, wouldn’t it be wasteful not to do it? … [T]heir idiosyncrasies end up having an outsized effect? That’s probably going to happen in every situation where a rich person is passionate (and hands-on involved) about a cause
If that will happen whenever a rich person is passionate about a cause, then opting to work with rich people can cause more harm than good. Opting out certainly doesn’t have to be “wasteful”.
My initial thinking was that “idiosyncrasies” can sometimes be neutral or even incidentally good.
But I think you’re right that this isn’t the norm, and things can quickly get worse when someone has a lot of influence only because they have money, rather than because their peers value them for being unusually thoughtful.
(FWIW, I think the richest individuals within EA often defer to the judgment of EA researchers, as opposed to setting priorities directly themselves?)
FWIW, I think the richest individuals within EA often defer to the judgment of EA researchers, as opposed to setting priorities directly themselves
I’m not saying I know anything to the contrary—but I’d like to point out that we have no way of knowing. This is a major disadvantage of philanthropy—whereas governments are required to be transparent about their fund allocations, individual donors are given privacy and undisclosed control over who receives their donations and what the receiving organisations are allowed to use them for.