Energy-efficient AI model could be a game changer: 50 times better efficiency with no performance hit


AI up to this point has largely been a race to be first, with little consideration for metrics like efficiency. Looking to change that, the researchers trimmed...

Cutting corners: Researchers from the University of California, Santa Cruz, have devised a way to run a billion-parameter-scale large language model on just 13 watts of power – about as much as a modern LED light bulb. The key change, inspired by a paper from Microsoft, was to strip matrix multiplication out of the model: with the weights restricted to a few simple values, all computation involves summing rather than multiplying – an approach that is far less hardware intensive. With this sort of efficiency gain in play, and given a full data center's worth of power, AI could soon take another huge leap forward.
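To see why dropping multiplication saves so much hardware effort, consider what happens when every weight in a layer is limited to -1, 0, or +1: each output of a matrix-vector product becomes a signed sum of inputs, with no multiplies at all. The sketch below is a minimal Python/NumPy illustration of that idea written for this summary – the ternary_matvec helper and the toy sizes are assumptions, not the researchers' code.

```python
import numpy as np

def ternary_matvec(weights, x):
    """Compute W @ x where W contains only -1, 0, +1, using adds and subtracts only."""
    out = np.zeros(weights.shape[0], dtype=x.dtype)
    for i, row in enumerate(weights):
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:        # weight +1: add the input
                acc += xi
            elif w == -1:     # weight -1: subtract the input
                acc -= xi
            # weight 0: skip the input entirely
        out[i] = acc
    return out

# Tiny demo: the add/subtract result matches an ordinary matrix multiply.
rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8)).astype(np.float32)  # ternary weights
x = rng.standard_normal(8).astype(np.float32)
assert np.allclose(ternary_matvec(W, x), W @ x)
```

On custom or FPGA-style hardware, each of those add/subtract/skip decisions is far cheaper than a full floating-point multiply, which is where the reported wattage savings come from.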

Read more on: Times, energy, Performance

Related news:

Wild Boar Has Five Times More PFAS Than Humans Allowed to Eat

US scientists turn dry air into drinking water with 5 times more efficiency | Even in desert-like conditions, the fins were saturated with water in about an hour.

Researchers run high-performing LLM on the energy needed to power a lightbulb