Lots of awesome stuff requires AGI or superintelligence. People think LLMs (or stuff LLMs invent) will lead to AGI or superintelligence.
So wouldn’t slowing down LLM progress slow down the awesome stuff?
Yeah, that awesome stuff.
My impression is that most people who buy “LLMs --> superintelligence” favor caution, even though caution slows down the awesome stuff.
But this thread seems unproductive.