AI-based search would be considerably more expensive per query for Google
For Bing or Google, a ChatGPT-style search engine is simply more expensive
By Alfonso Maruccia
Why it matters: ChatGPT and AI, the technology world's new overused buzzwords that make investors and shareholders drool, carry a huge cost that scales with usage. For a giant like Google, an AI-based search engine would take a significant bite out of revenue and net income.
Right now, everyone is talking about ChatGPT and how machine learning-based language models will change the future of everything. The growing popularity of chatbots is wreaking havoc in the web search industry in particular, so much so that Google felt compelled to rush the debut of its own chat-based search model somewhat ahead of schedule.
Aside from the theoretical advancements and disruptive potential of ChatGPT, one thing is already certain: running a search service built heavily on chatbot technology would be incredibly expensive. In an interview with Reuters, Alphabet Chairman John Hennessy said that a single user exchange with an AI-powered search service "likely costs ten times more than a standard keyword search."
The reason for this huge jump in costs lies in how web search currently works behind the scenes. Google builds and maintains a massive index of the websites and other documents available on the internet, sending crawlers out to catalog new or updated content and keep the index fresh. When a user submits a query, the search engine looks up matching documents in this index, ranks them, and returns a search engine results page (SERP).
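The key point is that the expensive work (crawling and indexing) happens once, up front, so each individual query is cheap. A toy inverted index makes the idea concrete; this is a minimal sketch of the concept, not Google's actual architecture:

```python
# Toy inverted index: a tiny illustration of the keyword-search idea
# described above -- not Google's actual implementation.
from collections import defaultdict

documents = {
    1: "machine learning models need lots of compute",
    2: "search engines index the web with crawlers",
    3: "crawlers keep the search index fresh",
}

# Built once, at "crawl" time: word -> set of document IDs.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query):
    """Return IDs of documents containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    result = index.get(words[0], set()).copy()
    for word in words[1:]:
        result &= index.get(word, set())  # intersect precomputed sets
    return result

print(sorted(search("search crawlers")))  # -> [2, 3]
```

Serving a query is just a few set intersections over precomputed data, which is why a traditional keyword search costs a fraction of a cent.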
An ML-based chatbot is built on a completely different technological paradigm: for every single query, a massive neural network must run inference to generate text relevant to the user's prompt. The likely need to also query the aforementioned web index to check facts, plus the longer interactions chatbots tend to invite compared with a traditional search, would add still more computational cost to an already complex and expensive service.
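To see why inference is so much heavier than an index lookup, consider the common rule of thumb that generating one token costs roughly 2 FLOPs per model parameter. The model size and answer length below are illustrative assumptions, not figures from the article:

```python
# Rough compute for generating one chatbot answer, using the common
# ~2 FLOPs-per-parameter-per-token rule of thumb for inference.
PARAMS = 175e9         # parameters in a GPT-3-class model (assumed)
ANSWER_TOKENS = 70     # a ~50-word answer is roughly 70 tokens (assumed)

flops_per_token = 2 * PARAMS
total_flops = flops_per_token * ANSWER_TOKENS
print(f"{total_flops:.2e} FLOPs per answer")  # -> 2.45e+13 FLOPs per answer
```

Tens of trillions of floating-point operations per answer, repeated on every query, is what separates chatbot-style search from a set intersection over a precomputed index.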
According to experts consulted by Reuters, this overhead would take a significant cut out of Google's roughly $60 billion annual net income. Morgan Stanley estimates that handling half of Google's search queries with 50-word AI answers would cost the company an additional $6 billion per year, while consulting firm SemiAnalysis puts the extra cost at around $3 billion.
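A back-of-envelope calculation shows how estimates of this magnitude arise. Only the 10x multiplier comes from Hennessy's remark; the per-query cost and annual query volume below are rough illustrative assumptions, not reported figures:

```python
# Back-of-envelope arithmetic behind analyst-style estimates. The per-query
# cost and query volume are illustrative assumptions; only the 10x factor
# comes from Hennessy's remark quoted above.
KEYWORD_COST = 0.0002      # assumed cost of one traditional search, in USD
AI_MULTIPLIER = 10         # "ten times more", per Hennessy
QUERIES_PER_YEAR = 3.3e12  # rough public estimate of Google's annual queries
AI_SHARE = 0.5             # scenario: half of all queries get AI answers

extra_cost_per_query = KEYWORD_COST * (AI_MULTIPLIER - 1)
annual_extra_cost = extra_cost_per_query * QUERIES_PER_YEAR * AI_SHARE
print(f"Extra annual cost: ${annual_extra_cost / 1e9:.1f} billion")
```

With these inputs the sketch lands at about $3 billion a year, the same order of magnitude as the analyst figures; nudging the per-query cost or AI share moves it toward the higher estimate.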
Google's traditional web search is extremely fast, usually taking well under a second to return meaningful results. A chatbot-based search, by contrast, would be slower and far more computationally intensive. Hennessy said this latency problem could be solved within a few years, but the expense of getting there could be too much even for Google.
For now, Mountain View is exploring ways to deploy a "lightweight model version" of its newly introduced Bard chatbot, a solution requiring "significantly less computing power" that would let the company scale to more users and gather better feedback.