Google’s challenge is that language models threaten to eat up the profit margins of search. They currently make only a couple of cents per search, and that is roughly what it would cost to run a ChatGPT-style model on every query.
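The arithmetic behind "a couple of cents" can be sketched roughly. Here is a back-of-the-envelope estimate using round, assumed public figures for Google's search ad revenue, query volume, and GPT-3-era API pricing; none of these numbers come from the comment itself, and the two posts linked at the bottom do this estimate far more carefully:

```python
# Back-of-the-envelope per-search economics.
# All figures are rough, assumed round numbers, not sourced from the comment above.

annual_search_ad_revenue = 160e9   # ~$160B/year of search ad revenue (approximate)
annual_searches = 3e12             # ~8-9B searches/day, so ~3T/year (approximate)

revenue_per_search = annual_search_ad_revenue / annual_searches
# Roughly $0.05 of revenue per search; after costs, only a few cents of profit remain.
print(f"revenue per search: ~${revenue_per_search:.3f}")

tokens_per_query = 500             # prompt + completion, rough guess
price_per_1k_tokens = 0.02         # roughly GPT-3 (davinci) API pricing circa 2023
llm_cost_per_search = tokens_per_query / 1000 * price_per_1k_tokens
# ~$0.01 per query, i.e. the same order of magnitude as the per-search profit.
print(f"LLM cost per search: ~${llm_cost_per_search:.3f}")
```

On these assumptions the inference cost lands in the same ballpark as the per-search profit, which is exactly the squeeze described above.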
Microsoft seems happy to use Bing as a loss leader to break Google’s monopoly on search. Over time, the cost of running language models will fall dramatically, making the business model viable again.
Google isn’t far behind the cutting edge of language models: their PaLM is about 3x bigger than GPT-3 and beats it on many academic benchmarks. But they don’t want to play the scaling game and bankrupt themselves, so they save money by deploying a smaller model for Bard, which produces lower-quality answers as a result.
For what it’s worth, this is not a prediction; Sundar Pichai said as much in an NYT interview: https://www.nytimes.com/2023/03/31/technology/google-pichai-ai.html
My best guess is that it will be announced once the switch happens, to get some good press for Google Bard.
Two detailed estimates of what LLM-powered search would cost:
https://sunyan.substack.com/p/the-economics-of-large-language-models#§how-much-would-llm-powered-search-cost
https://www.semianalysis.com/p/the-inference-cost-of-search-disruption