This feels misplaced to me. Making an argument for some cause to be prioritised highly is in some sense one of the core activities of effective altruism. Of course, many people who'd like to centre their pet cause make poor arguments for its prioritisation, but in that case I think the quality of argument is the entire problem, not anything about the fact they're trying to promote a cause. "I want effective altruists to highly prioritise something that they currently don't" is in some sense how all our existing priorities got to where they are. I don't think we should treat this kind of thing as suspicious by nature (perhaps even the opposite).
Hi Ben,
It seems to me that one should draw a distinction between "I see this cause as offering good value for money, and here is my reasoning why" and "I have this cause that I like and I hope I can get EA to fund it". Sometimes the latter is masquerading as the former, using questionable reasoning.
Some examples that seem like they might be in the latter category to me:
https://forum.effectivealtruism.org/posts/Dytsn9dDuwadFZXwq/fundraising-for-a-school-in-liberia
https://forum.effectivealtruism.org/posts/R5r2FPYTZGDzWdJEY/how-to-get-wealthier-folks-involved-in-mutual-aid
https://forum.effectivealtruism.org/posts/zsLcixRzqr64CacfK/zzappmalaria-twice-as-cost-effective-as-bed-nets-in-urban
In any case though, I'm not sure it makes a difference in terms of the right way to respond. If the reasoning is suspect, or the claims of evidence are missing, we can assume good faith and respond with questions like "why did you choose this program?", "why did you conduct the analysis in this way?", or "have you thought about these potentially offsetting considerations?". In the examples above, the original posters generally haven't engaged with these kinds of questions.
If we end up with people coming to EA looking for resources for ineffective causes, and then sealioning over the reasoning, I guess that could be a problem, but I haven't seen that here much, and I doubt that sort of behavior would ultimately be rewarded in any way.
Ian
The third one seems at least generally fine to me: clearly the poster believes in their theory of change and isn't unbiased, but that's generally true of posts by organizations seeking funding. I don't know if the poster has made a (metaphorically) better bednet or not, but I thought the Forum was enhanced by having the post here.
The other two are posts from new users who appear to have no clear demonstrated connection to EA at all. The occasional donation pitch or advice request from a charity that doesn't line up with EA very well is a small price to pay for an open Forum. The karma system handled any risk of the Forum being diverted from its purposes. A few kind people offered some advice. I don't see any reason for concern there.
I agree, and to be clear I'm not trying to say that any forum policy change is needed at this time.
Those posts all go out of their way to say they're new to EA. I feel pretty differently about someone with an existing cause discovering EA and trying to fundraise vs someone who integrated EA principles[1] and found a new cause they think is important.
I don't love the phrase "EA principles"; EA gets some stuff critically wrong, and other subcultures get some stuff right. But it will do for these purposes.
I think that to a certain extent that is right, but this context was less along the lines of "here is a cause that is going to be highly impactful" and more along the lines of "here is a cause that I care about". Less "mental health coaching via an app can be cost effective" and more like "let's protect elephants".
But I do think that in a broad sense you are correct: proposing new interventions, new cause areas, etc., is how the overall community progresses.