The big difference, however, is that Anthropic is essentially using the contractual vehicle to impose what feel less like technical constraints and more like policy constraints on the military. Think of the difference between “this fighter jet is not certified for flight above such-and-such an altitude, and if you fly above that altitude, you’ve breached your warranty” and “you may not fly this jet above such-and-such an altitude.” The military probably should not agree to terms like this, and private firms should not try to set them.
The contract was not illegal, just perhaps unwise, and even that probably only in retrospect. Note that this is true even if you agree with the underlying substance of the limitations. You can support restrictions on mass domestic surveillance and lethal autonomous weapons, but disagree that a defense contract is the optimal vehicle to achieve that policy outcome. The way you achieve new policy outcomes, under the usual rules of our republic, is to pass a law...
I agree that there’s something iffy/non-democratic in theory about putting that kind of constraint around the Pentagon, and that it would have been prudent for them to decline it in the first place. An analogy I read on Substack: if an epidural manufacturer told a government hospital “you’re welcome to use our drug so long as you don’t use it in any abortions,” it would probably be prudent to decline that contract (too much overhead).
Anyway this reframing put one sentence in particular by Dario into a new light: “To the extent that such surveillance is currently legal, this is only because the law has not yet caught up with the rapidly growing capabilities of AI.” In other words, because we know what the law should be and what it’s probably going to be, we should implement that policy today. I think many of us can think of examples where we’d be uncomfortable with a billionaire tech CEO saying that.
Surely you could phrase things the other way round?
“We’re pretty sure this will be made illegal in 10 years time, as the law catches up to our technology advances. However, it’s not illegal now, so feel free to buy it from us and use it!”
I’d be really uncomfortable with a billionaire tech CEO openly saying that.
I’m not sure democracy arguments work that well for military stuff. The people against whom military actions will be deployed are extremely obvious stakeholders, but they get no input into any feasible “democratic” process that determines what the US military does, and procedural democracy is compatible with the US doing literally anything to non-citizens to advance US interests. Given that, attempting to restrain the US military in ways that are legal and non-deceptive doesn’t seem that procedurally dubious to me.
Dean Ball’s commentary on this reframed the issue for me: https://www.hyperdimensional.co/p/clawed
This is a great intuition pump.