A ChatGPT request consumes 10 times more energy than a Google search request 07:30

Google DeepMind researchers have introduced a new AI training method called JEST that could significantly change the industry: compared with previous approaches, it accelerates AI training and cuts energy consumption by a factor of 10. Andrei Zhuk, an expert in the use of machine learning and AI in telecommunications and high-tech equipment, explained to socialbites.ca how the technology works and how exactly it will affect the industry.

“Simply put, the new method trains AI not on individual examples, as the earlier CLIP and BLIP technologies did, but on entire “batches” of data. JEST first builds a small AI model that evaluates data and sorts it by source quality. This small model then guides the training of a larger model by assembling the most relevant “batches” of data for a given topic. It is like a school teacher choosing the best resources and building a comprehensive curriculum that covers all the important aspects of a subject. In other words, JEST looks at the loaded data as a whole rather than piece by piece, whereas previously many iterations were needed to assimilate a single thesis. This is exactly what accelerates AI training and reduces energy consumption by a factor of 10,” he explained.
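The selection principle the expert describes can be sketched in a few lines of code. This is a simplified, illustrative sketch of the "learnability" idea behind JEST, not DeepMind's actual implementation: a small reference model trained on curated data scores candidate examples, and the batch the learner keeps is the one it can learn the most from (examples that are still hard for the learner but easy for the reference model). All names and numbers below are invented for illustration.

```python
# Illustrative sketch of JEST-style batch selection (not DeepMind's code).
# Idea: learnability = learner_loss - reference_loss. Examples with high
# learner loss but low reference-model loss are high-quality and still
# unlearned, so they make the most useful training batch.

def select_learnable_batch(learner_losses, reference_losses, batch_size):
    """Return the indices of the `batch_size` examples with the highest
    learnability score (learner loss minus reference-model loss)."""
    assert len(learner_losses) == len(reference_losses)
    scores = [l - r for l, r in zip(learner_losses, reference_losses)]
    # Rank candidate examples by learnability, best first.
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:batch_size])

# Toy example: 6 candidate examples, keep a batch of 3.
learner = [2.0, 0.5, 3.0, 1.0, 2.5, 0.2]    # current (large) model's losses
reference = [0.4, 0.4, 2.9, 0.3, 0.5, 0.1]  # small curated-data model's losses
print(select_learnable_batch(learner, reference, 3))  # -> [0, 3, 4]
```

Note that example 2 is rejected despite having the highest learner loss: the reference model also finds it hard, which in this scheme flags it as low-quality or noisy data rather than useful data. The real JEST method scores and selects whole sub-batches jointly rather than ranking examples independently, but the filtering intuition is the same.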

According to the expert, this is an important milestone in the development of artificial intelligence, because the environmental damage caused by the huge amount of equipment in data centers is already enormous.

“For example, a single ChatGPT query consumes 10 times more energy than a Google search query, because to produce one correct answer the model first generates about 10 candidates and then picks the best one. In addition, an average “conversation” of 20-30 questions with ChatGPT “costs” half a liter of water, which data centers use to cool their equipment. According to a Goldman Sachs report, by 2030 data centers will consume 3-4% of all electricity on the planet, and in some countries, such as the US, the figure will reach 8%,” he said.

Zhuk noted that AI workloads account for 20% of all the energy consumed by data centers. Moreover, even leaving aside the environmental damage from chip manufacturing and supply chains, analysts at various firms estimate that AI computing will draw 13.5-20 gigawatts by 2028, more than all of Iceland consumes.

“And in 2023, the largest technology companies, Google and Microsoft, each consumed 24 TWh of electricity, which exceeds the energy consumption of more than 100 individual countries,” the expert emphasized.

These figures worry the tech giants, which are focusing on powering their data centers with renewable energy. Google has gone even further, setting itself the ambitious goal of running its entire business on 100% carbon-free energy by 2030. “Underwater data centers,” in which equipment is submerged in the coastal waters of various countries, are also gaining popularity.

“This saves a significant amount of energy on cooling the equipment and reduces installation costs; counterintuitively, doing the same on land is harder and more expensive. Microsoft is already experimenting with underwater data centers, and Amazon, Google and Facebook (owned by Meta, which is designated as extremist and banned in Russia) are reportedly conducting their own research into operating data centers underwater,” he emphasized.

As major tech companies look to minimize environmental damage, the new JEST AI training method is another major step toward reducing energy consumption associated with the rapid development of artificial intelligence worldwide.

“Over the next six months, JEST is likely to be actively adopted by Silicon Valley and Chinese tech companies to accelerate the development of large AI models. However, although the technology delivers significant energy savings during the training phase, the industry has yet to solve its biggest “ecological” problem: answer generation, in which the model brute-forces dozens of candidate answers for each query and then selects the best one. This phase accounts for 85% of AI energy consumption in data centers,” he concluded.

Source: Gazeta
