LLMs have reached a point of diminishing returns


Science, sociology, and the likely financial collapse of the Generative AI bubble

A few days ago, the well-known venture capitalist Marc Andreessen started to spill the beans, saying on a podcast “we're increasing [graphics processing units] at the same rate, we're not getting the intelligent improvements at all out of it” – which is basically VC-ese for “deep learning is hitting a wall.” Just a few moments ago, Amir Efrati, editor of the industry trade journal The Information, further confirmed that we have reached a period of diminishing returns, writing on X that “OpenAI's [upcoming] Orion model shows how GPT improvements are slowing down.” The media (with notable exceptions, such as Ezra Klein, who gave me a clear platform for skepticism in January 2023) have rarely listened, instead often glorifying the hype of people like Altman and Musk.
