The ‘4 Ms’: Google’s Plan for AI Companies to Consume 1,000 Times Less Power

  • AI is consuming a monstrous amount of energy, both during training and when generating responses to users’ questions.

  • Google proposes a four-pronged strategy to attack the problem: model, machine, mechanization, and mapping.


There’s growing concern about the energy needed to fuel artificial intelligence, as well as its water consumption and overall carbon footprint. This isn’t exaggerated doom and gloom: it’s a reality that puts increasing pressure on the energy grid and has even prompted the International Energy Agency to convene a global summit. Google is proposing a four-pronged strategy to address the problem.

The 4 Ms. In a study published by IEEE, Google identifies four practices, which it calls “the 4 Ms,” that it says large AI companies can use to cut the carbon footprint of their machine learning algorithms by a factor of 100 to 1,000, because the individual gains compound (see the rough arithmetic after the list):

  • Model: Use more efficient machine learning architectures to reduce computational requirements by a factor of 3 to 10.
  • Machine: Leverage specialized artificial intelligence hardware to improve efficiency by 2 to 5 times.
  • Mechanization: Favor cloud computing over on-premises computing to reduce energy requirements by a factor of 1.4 to 2.
  • Mapping: Optimize data center locations based on available clean energy to reduce emissions by 5 to 10 times.
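
Assuming the four gains multiply independently (an illustrative assumption, not a calculation taken from the paper), the stated ranges compound into an overall reduction of roughly 40x at the low end to 1,000x at the high end:

```python
# Back-of-the-envelope arithmetic: compound the low and high ends of the
# four stated ranges (3-10, 2-5, 1.4-2, 5-10) to see the overall reduction.
low = 3 * 2 * 1.4 * 5     # most conservative end of each range -> 42.0
high = 10 * 5 * 2 * 10    # most optimistic end of each range   -> 1000
print(f"combined reduction: ~{low:.0f}x to ~{high:.0f}x")
```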

Google Research scientist David Patterson, the study’s lead author, says that following these four practices would reduce rather than increase the carbon footprint associated with AI training.

M stands for model. At the architectural level, newer AI models incorporate design improvements that boost efficiency. Google, Microsoft, OpenAI, and Meta use the “knowledge distillation” technique to train smaller models that mimic a large “teacher” model while consuming less energy.
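
As a rough illustration of how distillation works in practice (a minimal sketch, not Google’s or any other company’s actual training code; the model sizes, temperature, and loss weighting below are arbitrary), a small student network is trained to match the softened output distribution of a frozen teacher:

```python
# Minimal knowledge-distillation sketch (illustrative only).
# A small "student" model learns to match the softened output distribution
# of a larger, pre-trained "teacher" model, so it can be served more cheaply.
import torch
import torch.nn as nn
import torch.nn.functional as F

temperature = 2.0   # softens the teacher's probabilities
alpha = 0.5         # weight between distillation loss and ordinary label loss

teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10)).eval()
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def distillation_step(x, labels):
    with torch.no_grad():                      # teacher stays frozen
        teacher_logits = teacher(x)
    student_logits = student(x)
    # KL divergence between the softened teacher and student distributions
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard_loss = F.cross_entropy(student_logits, labels)  # ordinary supervised loss
    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random data
x = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))
print(distillation_step(x, labels))
```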

Companies continue to train larger and larger models, many of which aren’t available to users. In Google’s case, training these models accounts for 40% of the energy demand, while inference (the part that runs the models users interact with and generates their answers) accounts for the remaining 60%.

Although it may sound counterintuitive, the latest multimodal models released to the public, including Gemini 1.5 Pro and GPT-4o, are also more efficient than their predecessors thanks to their ability to leverage different input modalities, such as images and code: They learn from less data and fewer examples than text-only models.

M stands for machine. Most companies developing AI buy hardware from Nvidia, which makes specialized chips. However, a growing number of them, including Microsoft, OpenAI, and Huawei, are following the “Google model” and designing their own hardware.

Google has used its own TPUs (tensor processing units, chips specialized for AI) for years. The latest generation, announced by the company in May, is called Trillium and is 67% more energy efficient than its predecessor, performing more computations with less power for training, fine-tuning, and inference in Google’s data centers.

M stands for mechanization. Another counterintuitive idea: Cloud computing uses less power than computing in an on-premises data center. Cloud data centers, especially those dedicated to AI, contain tens of thousands more servers than enterprise data centers, and they’re designed with better power distribution and cooling systems because operators can amortize those investments at scale.

Despite the disadvantage of entrusting data to big cloud companies like Amazon, Microsoft, and Google, cloud data centers have another distinct benefit: They’re more modern, which means they have more specialized machines for AI training and inference.

M stands for mapping. Google is also calling for more cloud computing and less on-premises computing because of Big Tech companies’ commitment to renewable energy. Some of these large data centers already run on 90% carbon-free energy.

Big Tech companies are building their new data centers in places where renewable resources are abundant, including the water used to cool the servers. As a result, companies like Google, Microsoft, and Apple are sourcing 100% of the electricity in their operations from renewable sources and are aiming for net-zero emissions by the end of this decade.

On the other hand, companies like Microsoft and OpenAI are still unsure whether the renewable energy supply can keep up with growing demand, and they’re already betting on expanding nuclear capacity, either with small modular reactors or by investing in fusion research.

Image | Google Cloud

Related | The New Energy Era Is Here: Seven Chinese Solar Companies Now Produce More Capacity Than Oil Companies
