We are disputing a general heuristic that privileges the AI cause area and writes off all the others.
I think the most important argument for this conclusion is “AI is a big deal, so we should prioritize work that makes it go better”. But it seems you have placed this argument out of scope:
[The claim we are interested in is] that the coming AI revolution undercuts the justification for doing work in other cause areas, rendering work in those areas useless, or nearly so (for now, and perhaps forever).
[...]
AI causes might be more cost-effective than projects in other areas, even if AI doesn’t undercut those projects’ efficacy. Assessing the overall effectiveness of these broad cause areas is too big a project to take on here.
I agree that lots of other work looks about as valuable as it did before, and isn’t significantly undercut by AI. This seems basically irrelevant to the general heuristic you are disputing, whose main argument is “AI is a big deal so is way more important”.
We wanted to focus on a specific and somewhat manageable question related to AI vs. non-AI cause prioritization. You’re right that it’s not the only important question to ask. If you think the following claim is true - ‘non-AI projects are never undercut but always outweighed’ - then it doesn’t seem like an important question at all. I doubt that claim holds generally, for reasons presented in the piece. When deciding what to prioritize, there are also broader strategic questions that matter—how money and effort are being allocated by other parties, what your comparative advantage is, etc.—that we don’t touch at all here.
If you think the following claim is true - ‘non-AI projects are never undercut but always outweighed’
Of course I don’t think this. AI definitely undercuts some non-AI projects. But “non-AI projects are almost always outweighed in importance” seems very plausible to me, and I don’t see why anything in the piece is a strong reason to disbelieve that claim, since this piece is only responding to the undercutting argument. And if that claim is true, then the undercutting point doesn’t matter.
When you say you doubt that claim holds generally, is that because you think that the weight of AI isn’t actually that high, or because you think that AI may make the other thing substantially more important too?
I’m generally pretty sceptical about the latter—something which looked like a great idea before accounting for AI will generally not look substantially better after accounting for AI. By default I would assume any such claim is false unless given strong arguments to the contrary.
I think there are probably cases of each. For the former, there might be some large interventions in things like factory farming or climate change (i) that could have huge impacts and (ii) for which we don’t think AI will be particularly efficacious or impactful.
For the latter, here are some cases off the top of my head. Suppose we think that if AI is used to make factory farming more efficient and pernicious, it will be via X (idk, some kind of precision farming technology). Efforts to make X illegal look a lot better after accounting for AI. Or, right now, making it harder for people to buy ingredients for biological weapons might be a good bet but not a great bet. It reduces the chances of bioweapons somewhat, but knowledge about how to create the weapons is the main bottleneck. If AI removes that bottleneck, then those projects look a lot better.