The question that no LLM can answer and why it is important
Notes From the Desk: No. 32 - 2024.04.23
As Information is Beautiful points out, a very interesting distribution emerges when an LLM is asked to pick a number between 1 and 100: instead of spreading evenly, the answers cluster around a handful of favourite values. The line between hallucination and truth is simply a matter of probability, shaped by how prevalent the relevant training data is and by post-training processes such as fine-tuning. Mission-critical systems that require deterministic, provably correct behavior are therefore poor candidates for LLM automation or control.
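You can reproduce the experiment yourself with a few lines of Python. This is a minimal sketch, not the original methodology: it assumes the OpenAI Python client as one example backend, and the model name, temperature, and trial count are arbitrary choices; any chat-completion API would do.

```python
from collections import Counter

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = "Pick a number between 1 and 100. Reply with the number only."


def ask_for_number(model: str = "gpt-4o-mini") -> str:
    """Ask the model once and return its raw reply."""
    resp = client.chat.completions.create(
        model=model,  # assumed model name; swap in whichever you use
        messages=[{"role": "user", "content": PROMPT}],
        temperature=1.0,
    )
    return resp.choices[0].message.content.strip()


def number_distribution(trials: int = 200) -> Counter:
    """Tally the model's answers over many independent asks."""
    counts: Counter = Counter()
    for _ in range(trials):
        reply = ask_for_number()
        if reply.isdigit() and 1 <= int(reply) <= 100:
            counts[int(reply)] += 1
    return counts


if __name__ == "__main__":
    tally = number_distribution()
    # A uniform sampler would spread its mass evenly across 1..100;
    # an LLM's tally tends to pile up on a few preferred numbers.
    for value, count in tally.most_common(10):
        print(f"{value:3d}: {count}")
```

Plotting the tally makes the point vividly: the shape of the histogram reflects the model's training distribution, not a fair draw.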
Or read this on Hacker News