It is actually ‘prima facie obvious’ to some people that philanthropic do-gooders, those who ‘aim at making the world better’ through individualised charity, are not actually having a positive impact.
And those people are wrong and lacking in good reasons for their point of view. (They’re also rare.)
But this has no implications for charity vs. politics or anything else; it seems to be no more than the truism that it’s good to care about goodness.
You think that just because something is a truism, it has no implications? It contradicts your point of view, and you think it’s a truism with no implications? This truism tells us that we don’t need to play your game of overconfident, subjective interpretations of the world in order to justify our actions.
I did indeed cherry-pick some provocative issues to put in the article, but this was to illustrate the complexity of the issues rather than just to score cheap points.
But you gave a very narrow take, where the “complexity of the issues” amounts to reducing everything to the single goal of implementing socialism. As I said already, you are picking one or two dimensions of the issue and ignoring the others. You only talk about the kind of complexity that can further your point of view. That’s not illustrating complexity; it’s pretending that it doesn’t exist.
EAs do indeed have this prior assumption that charity is good and a charitable movement is a good movement.
You are misquoting me. I did not present this as a prior assumption. I don’t grow the EA movement because of some prior assumption; I grow it because everywhere I look it is epistemically and morally superior to its alternatives, and because each project it pursues is high-leverage and valuable. The prior assumption is only that, when something is aimed at EA goals, it probably helps achieve EA goals.
If so, EA and its critics are in the same position…
From your point of view, literally everyone is in the “same position”, because you think that everyone’s point of view follows from subjective and controversial assumptions about the world. So sure, critics might be in the Same Position as EA, but only in the same banal and irrelevant sense that anti-vaxxers are in the Same Position as mainstream scientists, that Holocaust deniers are in the Same Position as mainstream historiography, and so on for any dispute between people who are right and people who are wrong. But of course we can make judgments about these people: we can say that they are not rigorous, and that they are wrong, and that they are biased, and that they must stop doing harm to the world. So clearly something is missing from your framework. And wherever you identify that missing piece, it’s going to be the place where we stuff our criticisms (again, assuming that we are using your framework).
…and so it’s not reasonable for EAs to chide their critics for somehow not caring about doing good, or for adopting anti-charity positions because it makes them sound cool to their radical friends.
There’s something annoying about writing a whole paper that is essentially jockeying for status rather than arguing for any actual idea on the object level. Interesting that this is the pattern for leftists nowadays.