Yes, that’s how I understood it as well. If you spend the same amount on inference as you did on training, then you get a hell of a lot of inference.
I would expect he’d also argue that, since companies are willing to spend tons of money on training, we should expect them to be willing to spend a lot on inference too. A rough sketch of why an equal budget goes so far is below.
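Here is a back-of-envelope sketch of that point, using the standard transformer rules of thumb (training ≈ 6·N·D FLOPs, inference ≈ 2·N FLOPs per token) and illustrative GPT-3-scale numbers of my own, not figures from the discussion above:

```python
# Back-of-envelope sketch: how much inference a training-sized compute budget
# buys. Rules of thumb: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs/token.
# N and D below are illustrative assumptions, roughly GPT-3 scale.

N = 175e9  # model parameters (assumed)
D = 300e9  # training tokens (assumed)

training_flops = 6 * N * D   # total training compute
flops_per_token = 2 * N      # inference compute per generated token

# Spending the training budget again on inference buys this many tokens:
tokens_from_training_budget = training_flops / flops_per_token  # = 3 * D

print(f"Training compute:             {training_flops:.2e} FLOPs")
print(f"Inference cost per token:     {flops_per_token:.2e} FLOPs")
print(f"Tokens from an equal budget:  {tokens_from_training_budget:.2e} "
      f"(= {tokens_from_training_budget / D:.0f}x the training set)")
```

Whatever the exact constants, the ratio works out to roughly 3·D tokens, i.e. a few times more text than the model was trained on, which is "a hell of a lot of inference" by any reasonable standard.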
Do we know the expected cost for training an AGI? Is that within a single company’s budget?
Nearly impossible to answer. This report by OpenPhil gives it a hell of an effort, but it could still be off by orders of magnitude. Most fundamentally, the amount of compute necessary for AGI might not track the amount of compute used by the human brain, because we don’t know how our algorithmic efficiency compares to the brain’s.
https://www.cold-takes.com/forecasting-transformative-ai-the-biological-anchors-method-in-a-nutshell/
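To illustrate how much that unknown efficiency factor matters, here is a rough sketch with made-up numbers (mine, not the report's): a biological-anchors style estimate multiplies a brain-compute anchor by a guess at how efficiently our algorithms use compute relative to the brain, so sweeping that guess over a few orders of magnitude moves the bottom line by the same amount.

```python
# Illustrative only: how the unknown algorithmic-efficiency factor dominates a
# biological-anchors style estimate. The anchor value is a rough assumption.

brain_flop_per_s = 1e15  # assumed brain-compute anchor, FLOP/s (illustrative)

for inefficiency in [1e-2, 1e0, 1e2, 1e4]:
    # inefficiency = FLOPs our algorithms need per "brain FLOP" of useful work
    agi_flop_per_s = brain_flop_per_s * inefficiency
    print(f"{inefficiency:.0e} FLOP per brain-FLOP-equivalent "
          f"-> ~{agi_flop_per_s:.0e} FLOP/s to run an AGI")
```

Six orders of magnitude of uncertainty in that one factor translates directly into six orders of magnitude in the compute (and therefore cost) estimate, which is why "within a single company's budget" is so hard to answer.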