Thanks for bringing up Will’s post! I have now updated the question’s description to link to that.
I actually like Will’s definition more. The reason is two-fold:
Will’s definition adds a bit more mystery, which makes me curious to actually work out what all the words mean. In fact, I would add this to the list of “principal desiderata for the definition” the post mentions: the definition should encourage people to think about EA a bit more deeply. It should be a good starting point for research.
Will’s definition is not radically different from what is already there—the post says “little more rigorous”—which lowers the cost of switching to it. (One of the costs of changing something as fundamental as the definition could be giving the community the perception that there has been a significant change in the foundations of EA when there hasn’t been any—we are just trying to better reflect what is actually done in EA.)
One critique I have of Will’s alternative is that the proposed definition doesn’t quite distinguish the two schools of thought. To explain my thinking, here is a more visual representation. Let () represent a bucket:
Will’s definition and the existing definition make it feel like one bucket: (Evidence, Careful reasoning)
But it should really feel like two separate buckets: (Evidence), (Careful reasoning)
Apologies if that is too nitpicky, but I don’t think it is. I think the distinctness of Evidence and Careful reasoning needs to come out. I guess rephrasing it this way would be better: Effective altruism attempts to improve the world through the use of experimental evidence and/or theoretical reasoning to work out how to maximize the good with a given unit of resources, tentatively understanding ‘the good’ in impartial welfarist terms. This rephrasing is inspired by the fact that many of the natural sciences split in two—theory and experiment (like Theoretical Physics and Experimental Physics). We are saying EA is also that way, which I think it is. I think this also adds to the Science-aligned point that Will mentions. (I have edited this to say that I don’t think this definition is a good one. See my next comment below.)
I actually disagree with your definition. Will’s definition allows for debate about what counts as evidence and careful reasoning, and about whether hits-based giving or focusing on RCTs is the better path. That ambiguity seems critical for capturing what EA is: a project still somewhat in flux, one that allows for refinement rather than claiming there are two specific, different things.
A concrete example* of why we should be OK with leaving things ambiguous is considering ideas like the mathematical universe hypothesis (MUH). Someone can ask: “Should the MUH be considered as a potential path towards non-causal trade with other universes?” Is that question part of EA? I think there’s a case to be made that the answer is yes (correctly, in my view), because it is relevant to revisiting the “tentatively understanding” part of Will’s definition.
*In the strangest sense of “concrete” I think I’ve ever used.
I both agree and disagree with you.
Agreements:
I agree that the ambiguity over whether hits-based or evidence-based giving is better is an important aspect of current EA understanding. In fact, I think this could be a potential fourth point (I mentioned a third one earlier) to add to the definition desiderata: the definition should hint at the uncertainty in current EA understanding.
I also agree that my definition doesn’t bring out this ambiguity. I am afraid it might even be doing the opposite! The general consensus is that the experimental and theoretical parts of the natural sciences are equally important and must both be done. But EAs, I gather, are actually unsure whether evidence-based giving and careful-reasoning-based (hits-based) giving should both be done, or whether we would do more good by focusing on just one. I should probably read up more on this. (I would appreciate it if any of you could DM me resources you have found on this.) I just assumed EAs believed both must be done. My bad!
Disagreement:
I don’t see how Will’s definition allows for debating said ambiguity, though. As I mentioned in my earlier comment, I don’t think the definition distinguishes between the two schools of thought enough. As a consequence, I also don’t think it shows the ambiguity between them. A conflict (i.e., an ambiguity) requires at least two things, but in my opinion the definition doesn’t convincingly show there are two things in the first place.