Sustainability-in-Tech : Search Engine Sustainability Shock

With the integration of large language models (LLMs) into search engines, some are predicting that the massive increase in computing power required could mean a huge rise in carbon emissions.

What Are Large Language Models? 

Large Language Models (LLMs) are a type of artificial intelligence (AI) model trained on vast amounts of text data to understand natural language. These models are typically based on deep learning architectures such as neural networks, and are capable of generating human-like language and carrying out a variety of natural language processing tasks. OpenAI’s ChatGPT and Google’s Bard chatbots are examples of LLMs. 

Integrating LLMs Into Search Engines 

Following the massive success of OpenAI’s ChatGPT (OpenAI has close working links with Microsoft), Google, Microsoft, and now Chinese search company Baidu have all announced plans to upgrade their search engines by integrating generative AI tools that use LLMs, enabling their search engines to understand and respond to complex questions. This is intended to give users a better search experience and to help the search engines compete with each other in this new area.  

For example: 

– Microsoft has announced that it is to introduce a “new, AI-powered Bing search engine and Edge browser” (in preview at Bing.com), using OpenAI’s LLM, to “deliver better search, more complete answers, a new chat experience and the ability to generate content.”  

– Google has announced that it is testing, and will soon introduce, its own conversational AI chatbot (Bard), powered by LaMDA, Google’s own language model, and that it will be integrated into the Google search engine. 

Environmental Implications 

In addition to worries about inaccuracies in chatbot answers (e.g., Bard’s recent, costly wrong answer given in an advert for the chatbot), one major concern that many have overlooked is how much the wider use of LLMs could increase carbon emissions. 

How And Why? 

As University of Surrey Professor Alan Woodward noted (quoted in Wired), “There are already huge resources involved in indexing and searching internet content, but the incorporation of AI requires a different kind of firepower.” Professor Woodward’s view is that the wider use of LLMs could be a step change in online processing, one that could massively increase the power and cooling resources needed by large processing centres and could, of course, have a much bigger environmental impact, i.e., more carbon generation. There may also be increased challenges in how data centres deal with the extra heat produced. 

How Much? 

An idea of how big an environmental problem this could be comes from a third-party study published on Cornell University’s arXiv archive, which states that “larger models translate to greater computing demands and, by extension, greater energy demands.” The research paper highlights how training GPT-3, the autoregressive language model that ChatGPT is partly based on, consumed 1,287 MWh and led to emissions of more than 550 tons of carbon dioxide equivalent. To put that figure in perspective, it is the same amount of CO2 that would be produced by a single person taking 550 roundtrips between New York and San Francisco.

Add to this the fact that more LLMs are being introduced, and that chatbots are being integrated into search engines such as Bing and Google, which have tens of millions of users per day, and some tech commentators, such as Martin Bouchard of Canadian data centre company QScale, estimate that this will mean “at least four or five times more computing per search.” Processing that demand will require more hardware and more data centres, an unwelcome prospect considering that data centres already account for around one per cent of the world’s greenhouse gas emissions (IEA). It may also make it very challenging for big tech companies to meet their green targets, e.g., Microsoft’s aim to be carbon negative by 2030 and to remove its historical emissions by 2050.  
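To get a rough feel for the scale involved, here is a back-of-the-envelope sketch (in Python) built on the figures quoted above. The 1,287 MWh and ~550 tonnes of CO2e come from the study cited here, and the “four or five times more computing per search” is Bouchard’s estimate; the per-search baseline energy and daily search volume used below are purely illustrative assumptions, not figures from this article.

```python
# Back-of-the-envelope sketch using the figures quoted above.
# The 0.3 Wh baseline per conventional search and the daily search volume
# are illustrative assumptions, not figures from the article.

TRAINING_ENERGY_MWH = 1_287        # GPT-3 training energy (study cited above)
TRAINING_EMISSIONS_T = 550         # tonnes CO2-equivalent (study cited above)

# Carbon intensity implied by the training figures (kg CO2e per kWh)
intensity_kg_per_kwh = (TRAINING_EMISSIONS_T * 1_000) / (TRAINING_ENERGY_MWH * 1_000)
print(f"Implied grid intensity: {intensity_kg_per_kwh:.2f} kg CO2e per kWh")

# Rough effect of "4-5x more computing per search" (Bouchard's estimate)
baseline_wh_per_search = 0.3        # assumed energy of a conventional search
searches_per_day = 100_000_000      # illustrative volume, not a quoted figure
for multiplier in (4, 5):
    extra_wh = baseline_wh_per_search * (multiplier - 1) * searches_per_day
    extra_mwh_per_day = extra_wh / 1_000_000
    extra_tonnes_per_day = extra_mwh_per_day * intensity_kg_per_kwh
    print(f"{multiplier}x compute: ~{extra_mwh_per_day:,.0f} extra MWh/day, "
          f"~{extra_tonnes_per_day:,.0f} t CO2e/day at the same intensity")
```

Even with these deliberately conservative assumptions, the sketch shows why commentators see the per-search multiplier, rather than one-off training runs, as the bigger long-term concern.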

AI Can Also Help Reduce The Impact Of Itself 

That said, there are several ways that AI could be used to help offset the extra energy and carbon impacts that the increased use of LLMs produces. For example: 

– Helping to develop more energy-efficient training methods. AI researchers can use machine learning algorithms to optimise the training process and reduce the number of computations required to train a model, which can significantly reduce energy consumption. 

– Cloud providers can use AI to optimise their data centres and reduce their energy consumption. For example, machine learning algorithms can be used to predict the demand for cloud resources and allocate them more efficiently, reducing the number of idle servers and minimising energy waste (a minimal sketch of this idea follows this list). 

– Researchers are also exploring the use of green computing technologies to reduce the energy consumption of LLMs. AI algorithms can be used to optimise the scheduling of computing tasks and reduce the number of idle processors, which can significantly cut energy use. 

– Sustainable computing practices can be adopted to ensure that LLMs are developed and used in an environmentally responsible way. This includes using renewable energy sources, reducing waste, and recycling materials whenever possible. 
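As a minimal sketch of the demand-prediction idea mentioned in the list above: forecast near-term load and keep only enough servers active to cover it (plus headroom), so idle machines can be powered down. The moving-average forecast, server capacity, headroom factor and load figures here are illustrative assumptions, not anything from the article or a specific provider’s system.

```python
# Minimal sketch: forecast near-term demand and keep only enough servers
# active to cover it (plus headroom), so idle machines can be powered down.
# The forecast method, capacity and headroom figures are assumptions.

from statistics import mean

SERVER_CAPACITY = 1_000      # requests per interval one server can handle (assumed)
HEADROOM = 1.2               # keep 20% spare capacity (assumed)

def forecast_demand(recent_load: list[int], window: int = 6) -> float:
    """Very simple forecast: moving average of the most recent intervals."""
    return mean(recent_load[-window:])

def servers_needed(recent_load: list[int]) -> int:
    """Number of servers to keep active for the forecast demand."""
    expected = forecast_demand(recent_load) * HEADROOM
    return max(1, -(-int(expected) // SERVER_CAPACITY))  # ceiling division

# Example: requests per interval over the last few intervals
load_history = [4_200, 4_800, 5_100, 4_900, 5_300, 5_600]
print(f"Keep {servers_needed(load_history)} servers active; power down the rest")
```

In practice, providers use far more sophisticated forecasting and placement models, but the principle is the same: matching active capacity to predicted demand reduces the energy wasted on idle hardware.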

What Does This Mean For Your Organisation? 

So much has been reported about the amazing capabilities of LLMs and the new generation of chatbots led by the arrival of ChatGPT, and about how search engines could be seriously upgraded by incorporating them, that the possible environmental impacts appear to have been overlooked and under-reported until now. Data centres are already struggling to cope with demand and with the need to reduce energy consumption and carbon emissions, and incorporating chatbots (which already have large energy requirements) into search engines that process hundreds of millions of searches per day looks likely to have a significant negative environmental impact, i.e., higher energy requirements, greater carbon emissions, and the need for even more data centres. Now may be the time for the tech and computing giants to get together and focus on finding new and innovative ways to minimise the environmental impact of these new technologies, e.g., by using more environmentally friendly AI-based solutions. Sourcing more green and sustainable energy and being transparent and ethical in the use of data could also help but, in the short term, it looks as though the rise of these new super-powerful chatbots is likely to create more environmental challenges than solutions.