I definitely agree that it’s difficult to get organizations to improve governance. External pressure seems critical.
As stated in the post, I think that it’s possible that external pressure could come to AI capabilities organizations, in the form of regulation. Hard, but possible.
I’d (gently) push back against this part:
> I think it’s very hard to get even the most basic forms of good epistemic practices
I think there are clearly some practices that seem good but don’t get used. But there are many that do get used, especially at well-run companies. In fact, I’d go so far as to say that, at least on the question of “performance and capability” (rather than alignment/oversight), I’d trust the best-run organizations today a lot more than EA ideas of good techniques.
These organizations are often highly meritocratic and very intelligent, and their leaders are good at cutting out the BS and homing in on key problems (at least when doing so is useful to them).
I expect that our techniques, like explicit probabilities and forecastable statements, just aren’t that useful at these top levels. If much better practices emerge, perhaps with the help of AI, I’d feel good about them being adopted.
Or, at least for the case of “AIs helping organizations make tons of money by suggesting strategies and changes,” I’d expect businesses to be fairly efficient adopters.