
As technology giants race to build large-scale data centers around the world, the carbon time bomb that comes with them has become a pressing concern.
With ever more power-hungry artificial intelligence (AI) coming online, a technique pioneered by Google is attracting growing attention: using software to find clean electricity in parts of the world with surplus solar and wind power, and then shifting data center computing load there. Doing so can cut both carbon emissions and costs.
Chris Noble, co-founder and CEO of cloud computing company Cirrus Nexus, stated that there is an urgent need to figure out how to operate data centers in a way that maximizes the use of renewable energy.
The company manages cloud computing across data centers operated by Google, Microsoft, and Amazon with the aim of reducing carbon emissions.
AI energy consumption raises concerns
The climate risks posed by AI-driven computing demand are far-reaching, and they will worsen unless fossil-fuel electricity gives way to clean energy. Nvidia CEO Jensen Huang has said that artificial intelligence has reached a "critical point".
He also stated that the cost of data centers will double within five years.
According to the International Energy Agency, data centers and transmission networks each account for about 1.5% of global energy consumption, and together they emit roughly as much carbon dioxide each year as the whole of Brazil.
Hyperscale companies such as Google, Microsoft, and Amazon, the world's largest data center owners, have set climate goals and face internal and external pressure to meet them. These ambitious goals include decarbonizing their operations.
But the rise of artificial intelligence has made these goals much harder to reach. The graphics processing units behind large language models consume far more electricity than the central processing units used for other forms of computing. According to the International Energy Agency, training a single AI model uses more electricity than 100 households consume in a year.
Noble said, "The growth rate of artificial intelligence far exceeds humanity's ability to produce clean energy for it."
In addition, AI's energy consumption is uneven, looking more like a sawtooth curve than the smooth lines most data center operators are used to. That makes decarbonization harder, to say nothing of keeping the power grid stable.
Dave Sterlace, Global Data Center Customer Director at Hitachi Energy, said that the growth of artificial intelligence is being driven by North American companies, which has concentrated computing power and energy use there, a trend he did not anticipate two years ago.
Response strategies
To reduce carbon dioxide emissions from data centers, the hyperscale operators have funded large numbers of solar and wind farms and used carbon credits to offset emissions.
But this alone is not enough, especially as the use of artificial intelligence grows. That is why operators are turning to the "load shifting" strategy adopted by Google, a subsidiary of Alphabet Inc. The idea is to cut emissions by upending how data centers operate.
Today, most data centers aim to run in a "steady state" so that their energy consumption stays relatively constant. That leaves them at the mercy of the local power grid's mix of natural gas, nuclear, and renewable generation, given the lack of transmission lines between regions. To break free of that dependence, the technology giants are looking for ways to shift data center operations around the world by the day or even by the hour, so as to absorb surplus renewable energy.
Google is making the first attempt to match certain data centers with zero-carbon electricity on an hourly basis, so that its machines run on clean energy around the clock. No one has fully achieved this yet. And to be sure, the strategy of shifting loads globally may be complicated by data sovereignty policies in some countries that restrict and protect cross-border data flows.
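To make the hourly-matching idea concrete, the minimal Python sketch below shows one way such a score could be computed: for each hour, count only the consumption that overlaps with carbon-free supply. The function name and the example figures are illustrative assumptions, not Google's actual methodology.

```python
# A minimal sketch of hourly carbon-free matching (illustrative only):
# for every hour, compare a data center's consumption with the carbon-free
# electricity available to it, and report the matched share of total use.

def hourly_cfe_score(consumption_mwh, carbon_free_mwh):
    """Share of consumption matched by carbon-free supply, hour by hour."""
    assert len(consumption_mwh) == len(carbon_free_mwh)
    matched = sum(min(use, cfe) for use, cfe in zip(consumption_mwh, carbon_free_mwh))
    total = sum(consumption_mwh)
    return matched / total if total else 0.0

# Hypothetical day: flat demand, carbon-free power only while the sun is up.
use = [10] * 24                      # 10 MWh every hour
cfe = [0] * 6 + [12] * 12 + [0] * 6  # carbon-free supply from 06:00 to 18:00
print(f"hourly CFE score: {hourly_cfe_score(use, cfe):.0%}")  # -> 50%
```

Under this kind of accounting, buying enough annual renewable energy is not sufficient; the clean supply has to line up with consumption hour by hour.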
But the innovations that Cirrus Nexus and Google are testing may still be a key part of the emissions challenge. Cirrus Nexus scans power grids around the world, measuring their carbon intensity in five-minute intervals, to find the lowest-emitting computing resources for itself and its customers. Last summer, the company put this kind of search into practice.
At the time, the Netherlands was having its sunniest June on record, which pushed down the cost of solar power on the grid, making servers cheaper to run and cutting carbon emissions. When the sun set in the Netherlands, Cirrus Nexus shifted its computing load to California to take advantage of solar power just coming online as the day began there.
The company's data show that by chasing the sun from Europe to the US West Coast and back, it was able to cut emissions from certain workloads, both its own and its customers', by 34%. Operating this way is more flexible, and it carries both benefits and risks.
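The five-minute search described above can be pictured with the rough Python sketch below: poll each candidate region's grid carbon intensity on a five-minute cadence and route flexible workloads to whichever region is currently cleanest. The region list, the intensity figures, and the get_carbon_intensity helper are hypothetical stand-ins, not the company's actual system.

```python
# A minimal sketch, under assumed data, of carbon-aware load shifting:
# every five minutes, pick the region whose grid is currently cleanest
# and move flexible workloads there.

import time

REGIONS = ["netherlands", "california", "virginia"]

def get_carbon_intensity(region):
    # Placeholder: a real system would query a grid-data provider here.
    sample = {"netherlands": 120.0, "california": 90.0, "virginia": 350.0}  # gCO2/kWh
    return sample[region]

def pick_greenest_region(regions):
    """Return the region whose grid currently has the lowest carbon intensity."""
    return min(regions, key=get_carbon_intensity)

def schedule_loop(move_workload, interval_s=300):
    """Every five minutes, shift flexible workloads to the greenest region."""
    while True:
        move_workload(pick_greenest_region(REGIONS))
        time.sleep(interval_s)

# Example use, with a stand-in for whatever actually migrates the jobs:
# schedule_loop(lambda region: print(f"routing flexible jobs to {region}"))
```

In practice only workloads that tolerate being moved or delayed can be scheduled this way, which is part of the flexibility trade-off the article notes.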
Michael Terrell, who leads Google's 24/7 carbon-free energy strategy, said that approximately 64% of Google's data center energy use is carbon-free, with 13 regional sites reaching a carbon-free energy rate of 85% and 7 data centers worldwide slightly above 90%.
"But if you don't replace fossil assets, you won't be able to fully achieve your climate goals," he said.