Sure.
Often when people talk about awesome stuff they’re not referring to LLMs. In this case, there’s no need to slow down the awesome stuff they’re talking about.
Lots of awesome stuff requires AGI or superintelligence. People think LLMs (or stuff LLMs invent) will lead to AGI or superintelligence.
So wouldn’t slowing down LLM progress slow down the awesome stuff?
Yeah, that awesome stuff.
My impression is that most people who buy “LLMs --> superintelligence” favor caution even though it slows the awesome stuff.
But this thread seems unproductive.