LLMs are cheap


A lot of people think LLMs are expensive to operate. That was true a couple of years ago, but many of them haven't updated their views to account for the roughly 1000x reduction in prices over the past two years.

Surely the typical LLM response is longer than that? I already picked the upper end of what the (very light) testing suggested as a reasonable range for the kind of question I'd use web search for. But the effect can't really be that large for a popular model: e.g., the allegedly leaked OpenAI financials claimed $4B/year spent on inference vs. $3B/year on training. Also, it seems quite plausible that some search providers would accept lower margins; at least Microsoft execs have testified under oath that they'd be willing to pay more for the iOS query stream than the revenue it generates, just to get more usage data.
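
To make the scale of the per-query cost argument concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (token counts, per-token prices) is a hypothetical assumption chosen for illustration, not a figure from the article; real prices vary widely by model and provider.

```python
# Back-of-envelope per-query inference cost.
# All constants below are illustrative assumptions, not article figures.

ASSUMED_INPUT_TOKENS = 500         # hypothetical prompt size for a search-style question
ASSUMED_OUTPUT_TOKENS = 1_000      # hypothetical response length (upper end)
ASSUMED_INPUT_PRICE = 0.15 / 1e6   # assumed $ per input token
ASSUMED_OUTPUT_PRICE = 0.60 / 1e6  # assumed $ per output token


def cost_per_query(in_tokens: int, out_tokens: int) -> float:
    """Return the inference cost in dollars for a single query."""
    return in_tokens * ASSUMED_INPUT_PRICE + out_tokens * ASSUMED_OUTPUT_PRICE


if __name__ == "__main__":
    c = cost_per_query(ASSUMED_INPUT_TOKENS, ASSUMED_OUTPUT_TOKENS)
    # At these assumed prices a single answer costs a small fraction of a cent,
    # which is the order of magnitude the cost comparison in the text hinges on.
    print(f"Approximate inference cost per query: ${c:.5f}")
```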
