Google DeepMind researchers have unveiled a new AI training method called JEST (joint example selection), a development that could reshape industry standards. Compared with prior approaches, it promises markedly faster AI training and up to tenfold lower energy use. Andrei Zhuk, a specialist in applying machine learning to telecommunications and high-tech equipment, explained to a Canadian and US audience how the method works and what impact it is likely to have on the industry.
“In simple terms, JEST trains AI not on single examples, as CLIP or BLIP do, but on entire batches of data. The process first builds a smaller AI model that learns to evaluate data quality from the best sources. That compact model then helps train a larger one by selecting the most relevant batches for a given topic. It’s akin to a teacher curating top resources into a complete curriculum that covers all the essential aspects of a subject. JEST therefore assesses data jointly rather than in isolated iterations, accelerating learning and cutting energy use by up to ten times,” he explained.
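The selection step he describes can be sketched in a few lines. This is a minimal illustration, not DeepMind's implementation: the toy "learnability" score (learner loss minus reference loss) and the random toy losses are stand-ins for real model evaluations, and all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-example losses standing in for real model evaluations:
# the large "learner" model being trained, and the small pretrained
# "reference" model that judges data quality.
n_examples = 1024
learner_loss = rng.uniform(0.5, 3.0, size=n_examples)
reference_loss = rng.uniform(0.5, 3.0, size=n_examples)

def select_batch(learner_loss, reference_loss, n_chunks=16, n_keep=4):
    """Keep the candidate chunks with the highest joint 'learnability'.

    Learnability = learner loss - reference loss: data the learner still
    finds hard but the reference model handles well is most informative.
    Scoring whole chunks at once, not single examples, is the 'joint' idea.
    """
    # Split the pool into random candidate chunks (sub-batches).
    ids = rng.permutation(len(learner_loss)).reshape(n_chunks, -1)
    # Score each chunk jointly by its mean learnability.
    scores = (learner_loss[ids] - reference_loss[ids]).mean(axis=1)
    best = np.argsort(scores)[-n_keep:]   # indices of the top-scoring chunks
    return ids[best].ravel()              # flattened training batch

batch = select_batch(learner_loss, reference_loss)
```

The selected batch ends up with a higher average learnability than the pool as a whole, which is the whole point: the large model spends its training steps only on the data the small model flagged as worthwhile.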
Experts see this as a pivotal milestone in AI development, particularly given the environmental footprint of large data centers.
“For instance, a single ChatGPT query can consume much more energy than a typical Google search, because multiple candidate responses may be generated before the best one is selected. An average 20- to 30-question ChatGPT session can require substantial cooling water for data centers. Goldman Sachs projects that data centers could account for 3–4% of global electricity use by 2030, with higher shares in some regions,” the expert noted.
Zhuk added that AI workloads already account for a significant share of total data center energy use. Even setting aside chip manufacturing and supply chains, analysts forecast AI computing demand reaching 13.5–20 gigawatts of continuous power by 2028, roughly the total power demand of a small nation.
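Converting that forecast from continuous power (gigawatts) to annual energy (terawatt-hours) makes the comparison with national consumption figures concrete. A sketch, assuming round-the-clock utilization; real load factors are lower:

```python
# GW is a rate of consumption; national statistics are quoted in TWh/year.
# Running a constant draw for a full year: GW * hours = GWh; /1000 -> TWh.
hours_per_year = 24 * 365                     # 8760 hours
low_gw, high_gw = 13.5, 20.0                  # forecast AI demand by 2028

low_twh = low_gw * hours_per_year / 1000      # ~118 TWh/year
high_twh = high_gw * hours_per_year / 1000    # ~175 TWh/year
```

At the upper bound, that is on the order of the annual electricity consumption of a mid-sized European country, which is what makes the "small nation" framing reasonable.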
“And in 2023, major tech companies like Google and Microsoft together consumed around 24 TWh of electricity, a level exceeding the energy use of more than 100 countries,” the expert emphasized.
Leading firms are responding by prioritizing renewable energy for data centers. Google has set a goal of sourcing 100% carbon-free energy across its operations by 2030. Innovative cooling and resilience strategies are also gaining traction, including the use of underwater data centers where equipment sits in coastal waters to slash energy used for cooling.
“This approach can greatly reduce cooling energy and installation costs, though offshore deployment and maintenance can be more challenging and expensive. Microsoft has experimented with underwater data centers, while Amazon, Google, and Meta are reportedly exploring water-based data center concepts,” he added.
As large tech players strive to minimize environmental impact, the JEST training method stands as another step toward lower energy use in AI development across the globe.
“In the next six months, JEST could see active adoption by Silicon Valley and leading Chinese tech firms to speed up the creation of large AI models. Yet the industry still faces a major ecological challenge: generating AI answers without producing dozens of candidate results in brute-force fashion for every query. This phase accounts for a sizable share of data center energy consumption,” he concluded.
Recent trends show broad student and professional adoption of AI services in higher education, underscoring the growing centrality of efficient AI tools in everyday work and study.