tldr: I think this argument is in danger of begging the question, and rejecting criticisms that implicitly just say “EA isn’t that important” by asserting “EA is important!”
I think the fireman analogy is really fun, but I do have a problem with it. The analogy is built around the mapping “fire = EA cause areas”, and gets almost all of its mileage out of the implicit assumption that fires are important and need to be put out.
This is why the first class of critics in the analogy look reasonable, and the second class look ridiculous: the first go along with the assumption that fires are important, while the second reject it (but all of this is implicit!).
I think criticism of the form “fires aren’t actually as important as you think they are” is a valid, if extremely foundational, criticism. It’s not vacuous. If someone has found the True Theory of Ethics and it says the most important thing is “live right” or “have diverse friends” or “don’t be cringe”, then I would want to know that!
I do wish they’d express it as the dramatic ethical claim it really is, though, and not in this vague way that states an unimportant criticism, implies through tone that it’s important, and so only indirectly hints at the real value claim behind it.
So I definitely think I’m making a rhetorical argument there to make a point.
But I don’t think the problem is quite as bad as you imply here: I’m mostly using fire to mean “existential risks in our lifetimes,” and almost no EA critics (save a few) think that would be fine. Maybe I should’ve stated it more clearly, but this is something most ethical systems (and common-sense ethics) seem to agree on.
So my point is that critics do agree that dying of an existential risk would be bad, but are unfortunately falling into a bit of parochial discourse rather than thinking about how to build bridges to solve this.
To the people who are actually fine with dying from x-risk (“fires aren’t as important as you think they are”), I agree my argument has no force, but I just hope that they are clear about their beliefs, as you say.
yup sounds like we’re on the same page—I think I steelmanned a little too hard. I agree that the people making these criticisms probably do in fact think that being shot by robots or something would be bad.