The cost of chatbots is 10 times that of traditional search

ChatGPT has taken the world by storm, and major technology companies are doubling down on AI. But the emerging chatbot technology not only suffers from accuracy problems; it also saddles the companies developing it with high operating costs.

On February 22, local time, John Hennessy, chairman of Alphabet, Google's parent company, told Reuters that an exchange with an AI such as a large language model likely costs more than 10 times as much as a traditional keyword search. He believes Google must bring down the operating costs of this kind of AI, but the process will not be easy: "The worst case is that it will take several years."

The cost of chatbots is high

Compared with traditional search engines, chatbots are markedly more expensive to operate.

Morgan Stanley's analysis found that Google handled 3.3 trillion searches last year at a cost of roughly 0.2 cents each. Analysts had earlier estimated that a single ChatGPT reply costs about 2 cents.

Morgan Stanley estimates that if Bard, Google's chatbot, were built into the search engine and handled half of Google's queries with 50-word answers, the company's costs could rise by $6 billion in 2024.
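Taken together, those figures allow a quick back-of-envelope check. Here is a minimal sketch in Python using only the numbers reported above; note that the "implied extra cost per AI query" is derived from Morgan Stanley's $6 billion scenario, not a figure the bank published.

```python
# Back-of-envelope check on the reported cost figures. Per-query costs
# are from the article; the implied increment is derived, not reported.

SEARCHES_PER_YEAR = 3.3e12   # Google searches last year (article figure)
SEARCH_COST = 0.002          # ~0.2 cents per traditional search, in dollars
CHATBOT_REPLY_COST = 0.02    # ~2 cents per ChatGPT reply, in dollars

# The headline 10x ratio follows directly from the two per-query costs.
print(f"Chatbot / search cost ratio: {CHATBOT_REPLY_COST / SEARCH_COST:.0f}x")

# Morgan Stanley's scenario: Bard answers half of all queries, adding
# ~$6 billion of annual cost. The extra cost per AI-handled query
# implied by those two numbers can be backed out directly.
AI_HANDLED_QUERIES = SEARCHES_PER_YEAR / 2
EXTRA_ANNUAL_COST = 6e9
implied_increment = EXTRA_ANNUAL_COST / AI_HANDLED_QUERIES
print(f"Implied extra cost per AI query: {implied_increment * 100:.2f} cents")
```

Run as written, this prints a 10x cost ratio and an implied increment of about 0.36 cents per AI-handled query.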

Similarly, SemiAnalysis, a research firm focused on chip technology, said that adding a chatbot to the search engine could cost the company an additional $3 billion, an amount held down by Google's in-house Tensor Processing Unit (TPU) chips.

Richard Socher, CEO of the search engine You.com, believes that adding chatbots and chart, video, and other generative applications will bring 30% to 50% in additional expenses. However, he believes that "over time, such technologies will become cheaper and cheaper."

The main reason chatbots cost more than traditional search is that they require more computing power. Analysts note that this kind of AI depends on billions of dollars' worth of chips, a capital expense that must be spread over the hardware's several-year useful life and therefore raises the cost of every single query. Electricity consumption adds further cost and puts pressure on the company's carbon-emission targets.
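To see how amortization feeds into per-query cost, consider a hypothetical illustration; every number in the sketch below is an assumption chosen for illustration, not a reported figure.

```python
# Hypothetical illustration of amortizing AI-hardware capex into a
# per-query cost. All numbers below are assumptions, not reported data.

CHIP_CAPEX = 4e9              # assumed fleet cost of inference chips, dollars
LIFETIME_YEARS = 4            # assumed useful life over which capex is spread
QUERIES_PER_YEAR = 1.65e12    # assumed AI-handled query volume
POWER_COST_PER_QUERY = 0.001  # assumed electricity cost per query, dollars

amortized_capex = CHIP_CAPEX / LIFETIME_YEARS / QUERIES_PER_YEAR
per_query = amortized_capex + POWER_COST_PER_QUERY
print(f"Amortized chip cost per query: {amortized_capex * 100:.3f} cents")
print(f"Hardware + power cost per query: {per_query * 100:.3f} cents")
```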

A chatbot's AI also works on a different principle from a search engine. The process by which the AI handles a user's question is called "inference": a "neural network" loosely modeled on the human brain infers an answer from its prior training data. A traditional search engine works the other way around: Google's web crawlers scan the Internet ahead of time and compile it into an index, and when a user enters a query, Google serves the most relevant answers stored in that index.
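The contrast can be sketched in a few lines of toy Python. Everything below is illustrative (the index contents and function names are invented, not a model of Google's or OpenAI's real systems): a search engine answers from a precomputed index, while a chatbot pays the full cost of inference on every query.

```python
# Toy contrast between the two query paths; all names and data here are
# illustrative, not a model of Google's or OpenAI's actual systems.

# Search-engine path: the heavy work (crawling, indexing) happens once,
# offline. Answering a query is then a cheap lookup into the index.
index = {
    "chatbot cost": ["page-1", "page-2"],   # compiled ahead of time by crawlers
    "gpt-3 parameters": ["page-3"],
}

def search(query: str) -> list[str]:
    return index.get(query, [])             # near-constant cost per query

# Chatbot path: nothing is precomputed per question. Each query runs
# "inference" -- a forward pass through the whole network -- so the
# serving cost is paid anew, token by token, on every single question.
def chatbot_reply(query: str) -> str:
    # a real system would execute billions of multiply-adds here
    return "an answer inferred from training data, not looked up"

print(search("chatbot cost"))
print(chatbot_reply("chatbot cost"))
```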

For chatbots, the larger the underlying language model, the more chips are needed to support inference, and the higher the corresponding cost.
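A common rule of thumb for dense transformer models (an approximation, not a figure from the article) is that generating one token takes roughly 2 FLOPs per model parameter, so per-reply compute grows linearly with model size. In the sketch below, the cost-per-FLOP figure is an assumption calibrated only so the largest model lands near the ~2-cent ballpark quoted earlier, and a 50-word answer is treated as 50 tokens.

```python
# How inference compute and cost might scale with model size. The
# 2-FLOPs-per-parameter-per-token rule is a standard approximation for
# dense transformers; COST_PER_FLOP is an assumed effective serving cost.

COST_PER_FLOP = 1e-15   # dollars per FLOP (assumption, for illustration)
TOKENS_PER_REPLY = 50   # ~50-word answers, as in Morgan Stanley's scenario

for n_params in (1e9, 10e9, 175e9):
    flops = 2 * n_params * TOKENS_PER_REPLY        # forward-pass compute
    cost = flops * COST_PER_FLOP
    print(f"{n_params / 1e9:>5.0f}B params: {flops:.1e} FLOPs/reply, "
          f"~{cost * 100:.2f} cents")
```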

Take ChatGPT as an example. It is built on GPT-3, OpenAI's natural language processing (NLP) model, which has 175 billion parameters and cost about $12 million to train, making it one of the largest and most expensive models of its kind.
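That $12 million figure can be sanity-checked against a standard estimate of training compute. The sketch below uses the rule of thumb of about 6 FLOPs per parameter per training token (an approximation for dense transformers) and the roughly 300 billion training tokens reported in the GPT-3 paper; neither of those two numbers comes from this article.

```python
# Back-of-envelope training-compute estimate for GPT-3. The 6-FLOPs-per-
# parameter-per-token rule is a standard approximation; the ~300B
# training tokens come from the GPT-3 paper, not from this article.

N_PARAMS = 175e9        # GPT-3 parameter count (from the article)
TRAIN_TOKENS = 300e9    # training tokens (GPT-3 paper figure)
REPORTED_COST = 12e6    # ~$12M training cost (from the article)

train_flops = 6 * N_PARAMS * TRAIN_TOKENS          # ~3.15e23 FLOPs
print(f"Estimated training compute: {train_flops:.2e} FLOPs")
print(f"Implied cost per FLOP: ${REPORTED_COST / train_flops:.1e}")
```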

ChatGPT's running costs are therefore high as well. Sam Altman, CEO of OpenAI, said on Twitter that the computing cost of ChatGPT's conversations with users is "eye-watering," at a few cents or more per conversation.