LLMs are not suitable for brainstorming


This may be obvious to many people already, but I recently thought about it in a few scenarios and figured it would be valuable to articulate clearly. One note before the main discussion is we …

An LLM's main training objective is to mimic the probabilistic distribution of its training text; as a by-product, it also picks up some of the logic and deduction patterns of human language and thought. The occurrence of ideas in its output (and, roughly speaking based on observation, its preference among them) generally tracks the frequency and attention those ideas receive in the media and other major information sources. What is less clear is how we could change the training process so that the model does not simply follow existing data patterns but actively seeks out knowledge and builds genuine thinking and deductive reasoning skills.
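The core point, that a model trained to match its data distribution will prefer frequent ideas over rare ones, can be illustrated with a minimal sketch. The corpus, token names, and 9:1 split below are invented for illustration; a maximum-likelihood next-token "model" here is just relative frequency counting, which is the same objective a real LLM optimizes at vastly larger scale.

```python
from collections import Counter

# Toy "training corpus": one idea appears far more often than another.
# (Hypothetical tokens; imagine completions for a brainstorming prompt.)
corpus = ["solar"] * 9 + ["tidal"] * 1

# Maximum-likelihood model: each token's probability is its relative frequency.
counts = Counter(corpus)
total = sum(counts.values())
model = {tok: c / total for tok, c in counts.items()}

print(model)  # {'solar': 0.9, 'tidal': 0.1}
```

Sampling from such a model reproduces the 9:1 imbalance of the data, so the common idea dominates the output. This is why, for brainstorming, the model gravitates toward well-covered ideas rather than novel ones.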
