
Markov chains are the original language models


Back in my day, we used math for autocomplete.

I've come to the conclusion that there are four stages to the current AI hype cycle in an individual person's brain, at least as it pertains to large language models.

Instead of a table, we may represent these possible transitions as a matrix $T$, and Alice's current location as a vector $\vec{s}$ (a small worked example follows below).

Next, I go through the training text and match each word to its index in the dictionary, effectively transforming the String into a Vec<usize>.
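As a hedged illustration of that matrix-vector view (the two-room setup and the numbers are mine, not the post's, and I'm assuming the column-stochastic convention in which each column of $T$ sums to 1): if Alice starts in the first of two rooms, her distribution after one step is simply $T\vec{s}$.

$$
T = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix},
\qquad
\vec{s} = \begin{pmatrix} 1 \\ 0 \end{pmatrix},
\qquad
T\vec{s} = \begin{pmatrix} 0.9 \\ 0.1 \end{pmatrix}.
$$

Taking $k$ steps is just $T^k\vec{s}$, which is what makes the matrix form more convenient than a table.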
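A minimal Rust sketch of that word-to-index step, assuming a plain whitespace split; the names build_vocab and encode are illustrative placeholders rather than the post's actual functions:

```rust
use std::collections::HashMap;

/// Assign each distinct word an index, building the dictionary.
fn build_vocab(text: &str) -> (Vec<&str>, HashMap<&str, usize>) {
    let mut words = Vec::new();
    let mut index = HashMap::new();
    for word in text.split_whitespace() {
        if !index.contains_key(word) {
            index.insert(word, words.len());
            words.push(word);
        }
    }
    (words, index)
}

/// Replace every word in the training text with its dictionary index,
/// turning the String into a Vec<usize>.
fn encode(text: &str, index: &HashMap<&str, usize>) -> Vec<usize> {
    text.split_whitespace().map(|word| index[word]).collect()
}

fn main() {
    let text = "the cat sat on the mat";
    let (words, index) = build_vocab(text);
    let encoded = encode(text, &index);
    println!("{words:?}");   // ["the", "cat", "sat", "on", "mat"]
    println!("{encoded:?}"); // [0, 1, 2, 3, 0, 4]
}
```

Counting which index follows which in that Vec<usize>, and normalizing the counts, is then enough to fill in a transition matrix like $T$.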
