A popular technique to make AI more efficient has drawbacks


One of the most widely used techniques to make AI models more efficient, quantization, has limits — and the industry could be fast approaching them.

In the context of AI, quantization refers to lowering the number of bits — the smallest units a computer can process — needed to represent information. The technique's limits could spell bad news for AI companies that train extremely large models (a practice known to improve answer quality) and then quantize them to make them less expensive to serve. Evidence already suggests that scaling up eventually yields diminishing returns; Anthropic and Google reportedly trained enormous models recently that fell short of internal benchmark expectations.
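To make the bit-lowering idea concrete, here is a minimal sketch of symmetric 8-bit quantization, one of the simplest schemes: each floating-point weight is mapped to an integer in [-127, 127] using a single scale factor. The function names and the toy weight values are illustrative, not from the article; production systems use considerably more sophisticated calibration.

```python
def quantize(weights, num_bits=8):
    """Map floats to signed integers, returning (int values, scale factor)."""
    qmax = 2 ** (num_bits - 1) - 1               # 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax  # one scale for the whole tensor
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate floats from the stored integers."""
    return [q * scale for q in q_weights]

weights = [0.91, -0.42, 0.003, -1.27]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Each restored value is within one quantization step (the scale) of the
# original — that rounding error is the precision traded away for a model
# that takes a quarter of the memory of 32-bit floats.
```

The loss shows up in small values: 0.003 rounds to integer 0 and comes back as 0.0. That per-weight error is why aggressively quantizing a model trained at high precision can degrade its answers.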

Read the full article on TechCrunch