Researchers Claim New Technique Slashes AI Energy Use By 95%


Researchers at BitEnergy AI, Inc. have developed Linear-Complexity Multiplication (L-Mul), a technique that reduces AI model power consumption by up to 95% by replacing energy-intensive floating-point multiplications with simpler integer additions. The method promises significant energy savings without compromising accuracy, but it requires specialized hardware to fully realize its benefits.

Tests across natural language processing, vision tasks, and symbolic reasoning showed an average performance drop of just 0.07% -- a negligible tradeoff for the potential energy savings. "To unlock the full potential of our proposed method, we will implement the L-Mul and L-Matmul kernel algorithms on hardware level and develop programming APIs for high-level model design," the researchers say.
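
The core trick, standing in for a floating-point multiply with an integer addition on the numbers' binary representations, can be sketched without any special hardware. The snippet below is a minimal Python illustration of that general principle, not BitEnergy AI's L-Mul kernel: it approximates the product of two positive floats by adding their raw IEEE-754 bit patterns, and the names approx_mul, float_to_bits, and bits_to_float are placeholders invented for this example. The researchers' method works on the mantissa and exponent fields directly and is engineered to keep the error far smaller than this toy version does.

```python
import struct

# Exponent bias of IEEE-754 single precision, pre-shifted to the
# exponent field's bit position (bits 23-30).
BIAS = 127 << 23


def float_to_bits(x: float) -> int:
    """Reinterpret a Python float as its 32-bit IEEE-754 bit pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]


def bits_to_float(b: int) -> float:
    """Reinterpret a 32-bit pattern as a single-precision float."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]


def approx_mul(a: float, b: float) -> float:
    """Approximate a * b for positive, normal floats with one integer add.

    The bit pattern of a positive float grows roughly like log2 of its
    value, so adding two bit patterns (and subtracting the exponent bias
    once) approximates multiplication in the log domain. No mantissa
    multiplication is performed anywhere.
    """
    return bits_to_float(float_to_bits(a) + float_to_bits(b) - BIAS)


if __name__ == "__main__":
    for a, b in [(3.0, 5.0), (0.7, 1.9), (12.5, 0.04)]:
        approx = approx_mul(a, b)
        exact = a * b
        print(f"{a} * {b}: exact={exact:.4f}  approx={approx:.4f}  "
              f"rel. err={(approx - exact) / exact:+.2%}")
```

Running it shows the addition-only estimate landing within a few percent of the true product, always on the low side; a production-quality approximation such as L-Mul has to be considerably tighter than this to keep the average accuracy impact near the reported 0.07%.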

Related news:

Nearly 50% of researchers quit science within a decade, huge study reveals

In stunning Nobel win, AI researchers Hopfield and Hinton take 2024 Physics Prize

Geoffrey Hinton and John Hopfield win Nobel Prize in physics