New algorithm promises to slash AI power consumption by 95 percent
BitEnergy AI's L-Mul algorithm could cut AI power use by up to 95%, but it requires new hardware.
BitEnergy AI's L-Mul (linear-complexity multiplication) algorithm offers a potential breakthrough, reducing the energy cost of AI computation by up to 95% by replacing floating-point multiplications with integer additions, which would especially benefit large language models such as GPT, Llama, and Mistral. The approach addresses growing energy concerns in the AI sector, which could consume 85 to 134 TWh annually by 2027, with ChatGPT alone estimated to use 564 MWh daily.
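The core idea, replacing a costly floating-point multiply with cheap integer addition, can be illustrated with a classic bit-manipulation approximation. The sketch below is a simplified illustration of the general principle, not BitEnergy AI's published algorithm, which targets low-precision formats and adds a small correction term to the mantissa sum:

```python
import struct

def lmul_approx(a: float, b: float) -> float:
    """Approximate a * b for positive floats using only integer addition.

    Illustrative sketch only: adding the IEEE-754 bit patterns adds the
    exponents exactly and the mantissas approximately, so the result is
    close to the true product. Works only for positive normal numbers.
    """
    # Reinterpret the single-precision bit patterns as unsigned integers.
    ia = struct.unpack("<I", struct.pack("<f", a))[0]
    ib = struct.unpack("<I", struct.pack("<f", b))[0]
    # Adding the patterns double-counts the exponent bias, so subtract
    # one bias (127, shifted into the exponent field at bit 23).
    approx_bits = (ia + ib - (127 << 23)) & 0xFFFFFFFF
    return struct.unpack("<f", struct.pack("<I", approx_bits))[0]
```

For powers of two the approximation is exact (e.g. 3.0 × 4.0 gives 12.0); in the worst case the mantissa sum undershoots the true mantissa product by a few percent, which is the error the published method corrects with an extra offset term.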
The algorithm maintains accuracy in tensor and dot-product computations, with an average accuracy loss of just 0.07%. It can be integrated directly into the attention mechanism of transformer-based models, improving efficiency without compromising output quality in tasks such as natural language processing and machine vision.
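To see how this applies to tensor and dot-product computations, each elementwise multiply in a dot product can be swapped for the addition-based approximation, while the final accumulation remains ordinary addition. This is a hypothetical sketch using the same simplified bit-addition trick as above, not the exact published method:

```python
import struct

def add_as_mul(a: float, b: float) -> float:
    # Bit-addition approximation of a * b (positive normal floats only);
    # a simplified stand-in for the multiplication-free kernel.
    ia = struct.unpack("<I", struct.pack("<f", a))[0]
    ib = struct.unpack("<I", struct.pack("<f", b))[0]
    bits = (ia + ib - (127 << 23)) & 0xFFFFFFFF
    return struct.unpack("<f", struct.pack("<I", bits))[0]

def approx_dot(xs, ys):
    """Dot product in which every elementwise multiply is replaced by
    the integer-addition approximation; the accumulation is unchanged."""
    return sum(add_as_mul(x, y) for x, y in zip(xs, ys))

xs = [0.5, 1.25, 2.0, 3.0]
ys = [4.0, 0.75, 1.5, 2.5]
exact = sum(x * y for x, y in zip(xs, ys))
approx = approx_dot(xs, ys)
rel_err = abs(approx - exact) / exact  # small relative error
```

Because attention scores and activations are computed from exactly such dot products, keeping the per-element error small keeps the end-to-end accuracy loss small.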
The main challenge for L-Mul is its need for specialized hardware, since current AI accelerators are not optimized for its addition-based operations. Plans for compatible hardware and APIs exist, but incumbents such as Nvidia may resist adoption to protect their market share. Even so, such advances point toward a more energy-efficient future for AI.