Artificial Intelligence (AI) has emerged as a powerful technology with the potential to transform trillion-dollar industries, from retail to medicine. However, every new chatbot and image generator requires a significant amount of electricity, raising concerns about the carbon emissions that drive global warming.
Large companies such as Microsoft Corp., Alphabet Inc. (Google's parent company), and OpenAI, the maker of ChatGPT, rely on cloud computing, drawing on thousands of chips in servers housed in large data centers around the world, to train AI algorithms known as models. The success of ChatGPT has prompted other companies to launch competing AI systems and chatbots, or to build products on top of large AI models that offer new features to users, from Instacart shoppers to Snap users and CFOs. Yet AI consumes more energy than other forms of computing: training a single model can use more electricity than 100 US households consume in an entire year. The total electricity use and carbon emissions attributable to AI remain unknown, owing to the sector's rapid growth and lack of transparency.
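The "100 US households" comparison can be sanity-checked with a rough back-of-envelope calculation. Both figures below are illustrative assumptions, not company disclosures: the training-energy estimate is in line with independent academic estimates for a GPT-3-scale model, and the household figure approximates the US annual average.

```python
# Back-of-envelope check of the "100 US households" comparison.
# Both inputs are illustrative assumptions, not official disclosures.
TRAINING_ENERGY_MWH = 1_300       # assumed energy to train one GPT-3-scale model
HOUSEHOLD_KWH_PER_YEAR = 10_600   # rough average annual US household consumption

# Convert MWh to kWh, then express training energy in household-years.
household_years = (TRAINING_ENERGY_MWH * 1_000) / HOUSEHOLD_KWH_PER_YEAR
print(f"One training run is roughly {household_years:.0f} household-years of electricity")
```

Under these assumed inputs, a single training run lands above the 100-household mark, consistent with the claim above.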
Transparency about the energy use and emissions of AI models is essential. With that information in hand, governments and companies can weigh the electricity consumption and emissions involved when deciding whether to apply large AI models to, say, medical research or the preservation of indigenous languages. A lack of transparency can invite harsher scrutiny, as happened in the cryptocurrency industry, where Bitcoin's energy consumption drew criticism and regulatory action.
AI models keep growing: OpenAI's GPT-3 uses 175 billion parameters, compared with its predecessor's 1.5 billion. Training these models consumes a significant amount of energy, but serving them after training can consume even more, given the millions or billions of requests they handle. AI companies are, however, working to improve the energy efficiency of their models and have pledged to reach carbon-neutral or carbon-negative operations. Microsoft, Google, and Amazon, for example, have carbon-neutrality goals and are investing in research to measure AI's energy use and carbon impact, as well as in measures to make their systems more efficient.
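The claim that serving a model can outweigh training it follows from simple arithmetic once a per-request energy cost and a request volume are assumed. Every number below is a hypothetical placeholder chosen for illustration, since no company has disclosed real figures:

```python
# Hypothetical comparison: one-off training energy vs. a year of inference.
# All inputs are assumptions for illustration; none are disclosed figures.
TRAINING_ENERGY_KWH = 1_300_000   # assumed one-off cost to train the model
WH_PER_REQUEST = 3                # assumed energy per chatbot request
REQUESTS_PER_DAY = 10_000_000     # assumed daily request volume

# Daily Wh -> annual kWh, then compare against the training cost.
annual_inference_kwh = WH_PER_REQUEST * REQUESTS_PER_DAY * 365 / 1_000
ratio = annual_inference_kwh / TRAINING_ENERGY_KWH
print(f"A year of inference is about {ratio:.1f}x the training cost under these assumptions")
```

Even with modest per-request costs, a large enough request volume makes cumulative inference energy exceed the one-off training cost within a year, which is why usage, not just training, matters for AI's footprint.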
Challenges remain, however. One of the main unknowns is the total carbon footprint of the graphics processing units (GPUs) widely used to train AI models. Nvidia, a leading GPU manufacturer, claims its GPUs are more energy-efficient than CPUs for AI workloads, but full disclosure of the direct and indirect emissions associated with GPUs has yet to be made.
In summary, electricity plays a crucial role in the advancement of Artificial Intelligence. The growing size of AI models and the demands of training and serving them are driving the need for transparency about the energy consumption and carbon emissions associated with AI. Companies are working to improve the energy efficiency of their systems and to meet sustainability goals, but challenges remain in balancing technological advancement with environmental protection.

