possibly you can convince them to do so later, when the tech is improved enough to be dangerous
Sort of: because once you publish the weights for a model there's no going back, I'm hoping even the next round of models will not be published, or at least not published without a thorough set of evals. The problem is with missing that a model is able to meaningfully lower the bar to causing harm (ex: telling people how to make pandemics): with a private model you can restrict access or modify it, but if you learn that a public model can do that you're out of luck.