I really liked Larks’ comment, but I’d like to add that this also incentivizes research teams to operate in secret. Many AI projects (and some biotech) are currently privately funded rather than government funded, and so they could profit by not publicizing their efforts.
This is true, although I think the number of researchers willing to work on something illegally would be much lower than the number willing to work on it legally.
A similar effect I’m more worried about is pushing the research over to less safety-conscious regimes. But I’m not certain about the size of this effect; good regulation in one country is often copied, and this is an area where international agreements might be possible (and international law might provide some support, although it is untested: see pages 113-122 of this report in a geoengineering context).