
RE: LeoThread 2024-10-19 03:31

in LeoFinance

AI

New algorithm could reduce artificial intelligence energy consumption by 95%

Artificial intelligence marks a new era of computing, but surging demand for the still-young technology drives enormous energy consumption in the cloud data centers that run these services. Experts believe that, under current conditions, AI could consume more energy by 2030 than the entire population of India.

#newsonleo #ai #energy #technology


Faced with this challenge, a group of engineers from the company BitEnergy AI announced the development of a method capable of reducing the energy demand of AI-based applications by up to 95%: the technique replaces floating-point multiplication with integer addition.
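
As a rough illustration of why integer addition can stand in for floating-point multiplication: in the IEEE-754 format, a float's bit pattern behaves roughly like a scaled logarithm of its value, so adding two bit patterns (and subtracting the exponent bias) approximates multiplying the values. The sketch below shows that classic bit trick in Python; it is only a motivating example, not BitEnergy AI's algorithm.

```python
import struct

BIAS = 127 << 23  # float32 exponent bias, shifted into the exponent bit field

def f2i(x: float) -> int:
    """Reinterpret a float32's bits as an unsigned integer."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def i2f(i: int) -> float:
    """Reinterpret an unsigned integer's bits as a float32."""
    return struct.unpack("<f", struct.pack("<I", i & 0xFFFFFFFF))[0]

def approx_mul(a: float, b: float) -> float:
    # Works for positive normal floats: adding the bit patterns adds the
    # exponents exactly and the mantissas approximately, so one integer
    # addition stands in for a full floating-point multiply.
    return i2f(f2i(a) + f2i(b) - BIAS)

print(approx_mul(3.0, 5.0))  # prints 14.0; the exact product is 15.0
```

The gap comes from dropping the cross term when the two mantissas are added; designs in this family add small correction terms to shrink that error.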

Artificial intelligence services rely on floating-point multiplication to handle numbers across an enormous range, allowing applications to perform calculations with the precision that is critical to ensuring their reliability. That same operation is one of the main reasons AI consumes massive amounts of energy.
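
To see why the format matters, here is a small illustrative snippet: the same 32-bit floating-point layout represents values spanning roughly 76 orders of magnitude, a range no 32-bit integer can cover.

```python
import struct

# Normal float32 values range from about 1.2e-38 up to about 3.4e38.
for value in (1.5e-38, 1.0, 3.0e38):
    bits = struct.unpack("<I", struct.pack("<f", value))[0]
    print(f"{value:12.4g} -> 0x{bits:08X}")
```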

To put that in perspective, ChatGPT is estimated to consume hundreds of megawatt-hours daily. A single query to the segment's most popular chatbot uses around 2.9 watt-hours, almost ten times as much as a Google search.
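
Those figures are easy to sanity-check. Assuming roughly 200 million queries per day, a commonly cited volume that is not from the article, the per-query estimate does land in the hundreds of megawatt-hours:

```python
WH_PER_QUERY = 2.9             # per-query estimate cited above
WH_PER_GOOGLE_SEARCH = 0.3     # commonly cited estimate (assumption)
QUERIES_PER_DAY = 200_000_000  # assumed daily query volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000   # Wh -> MWh
print(f"{daily_mwh:.0f} MWh per day")                    # 580 MWh per day
print(f"{WH_PER_QUERY / WH_PER_GOOGLE_SEARCH:.1f}x a Google search")  # 9.7x
```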

The new method, called “Linear Complexity Multiplication”, or “L-Mul” for short, delivers results close to those of floating-point multiplication while using a simpler algorithm that requires fewer resources, yet it maintains the high precision and reliability that artificial intelligence services demand.
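
A high-level reading of the idea: write each operand as (1 + m) * 2^e, then replace the mantissa product (1 + m_a)(1 + m_b) with the sum 1 + m_a + m_b plus a small constant offset standing in for the dropped m_a * m_b cross term, so only additions remain. The Python below is a hypothetical sketch of that scheme, not the published algorithm; the offset value and the error harness are assumptions for illustration.

```python
import math
import random

def l_mul_style(a: float, b: float, offset: float = 2.0 ** -4) -> float:
    """Approximate a*b using only additions on mantissas and exponents (sketch)."""
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    ma, ea = math.frexp(abs(a))   # abs(a) == ma * 2**ea with 0.5 <= ma < 1
    mb, eb = math.frexp(abs(b))
    ma, ea = 2 * ma - 1, ea - 1   # rescale so each operand is (1 + m) * 2**e
    mb, eb = 2 * mb - 1, eb - 1
    # One mantissa addition plus one exponent addition; the 2**(ea + eb)
    # factor is an exponent-field add (a shift) in hardware, not a multiply.
    return sign * (1 + ma + mb + offset) * 2.0 ** (ea + eb)

random.seed(0)
errors = []
for _ in range(10_000):
    a, b = random.uniform(-100, 100), random.uniform(-100, 100)
    errors.append(abs(l_mul_style(a, b) - a * b) / abs(a * b))
print(f"mean relative error: {sum(errors) / len(errors):.3%}")
```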

According to the study the engineers published, L-Mul could reduce the energy consumption of artificial intelligence processing by up to 95%. Internal evaluations on popular benchmarks show that applying the algorithm directly is practically lossless.

The only challenge the team highlights is that the technique requires hardware different from what the overwhelming majority of artificial intelligence providers currently use. However, the engineers say the new hardware the algorithm needs has already been designed, built, and tested.

A factor that will be crucial to mass adoption of the new technology is the interest of the hardware giants that dominate the sector, NVIDIA in particular. How those companies respond to the development should directly shape the technology's future.