ChatGPT, OpenAI’s artificial intelligence tool, is being used for a lot of things. We asked it about its carbon footprint.
“As an artificial intelligence language model, I don’t have a direct carbon footprint because I don’t consume energy or produce emissions in the same way that humans or machines do,” ChatGPT told Ethos. But that’s only part of the story.
“The servers and data centers that host my computing infrastructure consume a significant amount of energy and produce carbon emissions,” the AI continued.
“The exact amount of emissions associated with my operation will depend on a variety of factors, including the specific hardware and data center infrastructure used to host me, as well as the usage patterns of users who interact with me,” it said.
AI isn’t new; it has been used for years across a number of industries. One of the earliest and most famous applications of AI was in the game of chess. In 1997, IBM’s Deep Blue computer famously defeated chess champion Garry Kasparov in a six-game match, marking a significant milestone in AI development.
The tech has also been used in the field of robotics and automation, allowing machines to perform tasks that were previously only possible for humans. For example, AI-powered robots are used in manufacturing to assemble products, and autonomous vehicles use AI to navigate and make decisions on the road.
It has been embraced in the arts, too. In 2020, Icelandic singer Björk collaborated with Microsoft to create an AI-powered, constantly evolving soundscape that adapted to sunrises, sunsets, and changes in barometric pressure, in partnership with the Sister City hotel in New York. Björk also generated music in real time for her live show, Cornucopia, through a system that was fed parameters such as mood, tempo, and key and allowed to generate new music from those inputs.
Artificial intelligence, real environmental impact
But for all of its potential, AI does come with a sizeable carbon footprint.
A report by the Artificial Intelligence and Sustainable Development Commission found that AI is responsible for approximately two percent of global greenhouse gas emissions, roughly the same share as the aviation industry, and that figure is expected to grow as AI becomes more widely adopted.
Data centers and communication networks accounted for approximately one percent of global electricity consumption in 2019 and were responsible for 0.3 percent of global carbon emissions, according to a report by the International Energy Agency.
The energy consumption of AI is primarily driven by the training of deep neural networks, which requires vast amounts of computational power. A study by researchers at the University of Massachusetts Amherst estimated that training a single large AI model can produce up to 284 tonnes of CO2 emissions, which is roughly equivalent to the lifetime emissions of five cars.
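The five-car comparison can be sanity-checked with quick arithmetic. A minimal sketch, assuming a lifetime-emissions figure of roughly 57 tonnes of CO2 per average car including fuel (an assumption drawn from the study's own comparison, not a number stated in this article):

```python
# Back-of-envelope check of the UMass Amherst comparison.
# The per-car lifetime figure (~57 tonnes CO2, fuel included) is an
# assumption for illustration, not a figure from the article itself.
training_emissions_t = 284   # tonnes of CO2 to train one large model
car_lifetime_t = 57          # assumed lifetime emissions per car, incl. fuel

cars = training_emissions_t / car_lifetime_t
print(f"≈ {cars:.1f} average cars over their lifetimes")  # → ≈ 5.0
```

The point of the exercise is scale, not precision: a single training run sits in the same emissions bracket as several vehicles driven for their entire service lives.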
The researchers noted that the emissions associated with training AI models could be reduced by using more energy-efficient hardware and algorithms, as well as by improving the utilization of data center resources.
The training process involves running large numbers of calculations on vast amounts of data, which can take days or even weeks to complete. To speed up the process, AI systems use graphics processing units (GPUs), which are highly energy-intensive.
According to Bloomberg, it took 1.287 gigawatt hours to train GPT-3, a single general-purpose AI program. That’s roughly as much electricity as it takes to power 120 U.S. homes for a year. And that’s just one program, with more already in the world. “OpenAI’s GPT-3 uses 175 billion parameters, or variables, that the AI system has learned through its training and retraining. Its predecessor used just 1.5 billion,” Bloomberg reported.
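Bloomberg's homes-per-year comparison checks out with simple unit conversion. A rough sketch, assuming an average U.S. household uses about 10,600 kWh of electricity per year (an assumed figure in line with EIA averages, not one stated in the article):

```python
# Convert the GPT-3 training energy figure into household-years.
# The average-home consumption (~10,600 kWh/year) is an assumption
# for illustration, roughly matching U.S. EIA residential averages.
gwh = 1.287                    # reported training energy, gigawatt hours
kwh = gwh * 1_000_000          # 1 GWh = 1,000,000 kWh
avg_home_kwh_per_year = 10_600

homes = kwh / avg_home_kwh_per_year
print(f"≈ {homes:.0f} U.S. homes powered for a year")  # → ≈ 121
```

Small changes in the assumed household figure move the result by a few homes either way, which is why the article rounds to "roughly 120."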
Dr. Kai-Fu Lee, chairman and CEO of Sinovation Ventures and a leading AI expert, says the development of more energy-efficient chips and algorithms is critical to reducing the carbon footprint of AI. “Companies should invest in developing specialized chips that are optimized for AI workloads, and researchers should focus on developing more efficient algorithms for training neural networks,” he said.
Dr. Lee also highlights the importance of optimizing the deployment of AI systems. “Companies should consider using cloud-based services that allow AI workloads to be distributed across multiple data centers, reducing the energy consumption of any single facility,” he said. “They should also consider using renewable energy sources to power their data centers and servers.”
Dr. Moustapha Cisse, the director of Google’s AI Center in Accra, Ghana, believes that AI can play a role in reducing carbon emissions in other industries. He says that AI can be used to optimize the energy consumption of buildings, transportation systems, and manufacturing processes, which are significant sources of greenhouse gas emissions. “By using AI to optimize these processes, we can reduce their carbon footprint and contribute to a more sustainable future,” Cisse said.
Indeed, AI is expected to become more pervasive and widely used in the coming decade, with significant implications for energy consumption and carbon emissions.
According to a report by IDC, worldwide spending on AI is expected to grow from $50.1 billion in 2020 to more than $110 billion in 2024. A report by PwC estimates that AI could contribute up to $15.7 trillion to the global economy by 2030, with a majority of this growth coming from increased productivity and efficiency gains in industries such as healthcare, manufacturing, and retail.
Dr. Lee predicts that AI will become increasingly integrated into many aspects of our lives. He says in the next decade, AI will be everywhere, “and it will become a fundamental part of our society.”
A report by the World Economic Forum predicts that AI will also play an increasingly important role in addressing global challenges such as climate change. “AI could help reduce greenhouse gas emissions by up to four percent by 2030, while also improving energy efficiency and reducing waste,” it said.
But this growth also comes with the challenge of managing the environmental impact of increased energy consumption and carbon emissions associated with AI.
Can AI’s carbon footprint be reduced?
As the industry works to improve AI’s functions, many are also looking at reducing its energy consumption and emissions. One approach is to develop more energy-efficient hardware. This could involve designing specialized chips that are optimized for AI workloads or using more energy-efficient cooling systems for data centers. Google, which already has significant sustainability commitments in place, has developed its own custom-designed AI chips, called Tensor Processing Units (TPUs), which are purpose-built for AI workloads and more energy-efficient than traditional GPUs.
Another approach is to develop more energy-efficient AI algorithms. This could involve designing algorithms that require fewer calculations to achieve the same level of accuracy or using more efficient techniques for training neural networks. For example, researchers at MIT have developed an algorithm that can train neural networks with up to 90 percent less energy than traditional methods.
Others are working to optimize the deployment of AI systems. This could involve using cloud-based services that allow AI workloads to be distributed across multiple data centers, reducing the energy consumption of any single facility. It could also involve using renewable energy sources to power data centers and servers, such as solar or wind power.
Policymakers will also play an increasingly important role in reducing the carbon footprint of AI. Governments can incentivize the development of energy-efficient hardware and algorithms by providing funding or tax incentives to companies that invest in these technologies.
Governments can also regulate the energy consumption of data centers and servers, setting targets for energy efficiency and requiring companies to report on their energy usage. In Europe, for example, data centers are subject to the Energy Performance of Buildings Directive, which sets energy efficiency standards for buildings, including data centers.
Policymakers can also work to encourage the use of renewable energy sources by providing incentives for companies to invest in renewable energy, such as tax credits or feed-in tariffs. In the U.S., several states have implemented renewable portfolio standards, which require utilities to generate a certain percentage of their electricity from renewable sources.