I’d guess that humanity as a whole has a fairly low probability of success, with wide error bars.
Just out of curiosity, how would your estimate update if you had enough resources to do anything you deemed necessary, but not enough to affect the current trajectory of the field?
I’m not sure I understand the hypothetical—most of the actions that I deem necessary are aimed at affecting the trajectory of the AI field in one way or another.
OK, that's informative. So the dominant factor is not the ability to get to the finish line faster (which kind of makes sense).