Markov chains are the original language models
Back in my day, we used math for autocomplete.
I've come to the conclusion that there are four stages to the current AI hype cycle in an individual person's brain, at least as it pertains to large language models.

Instead of a table, we can represent these possible transitions as a matrix $T$, and Alice's current location as a vector $\vec{s}$.

Next, I go through the training text and match each word to its index in the dictionary, effectively transforming the `String` into a `Vec<usize>`.
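That tokenization step might look something like the following sketch (the function and variable names are mine, not from the actual implementation): build the dictionary on the fly, assigning each previously unseen word the next free index.

```rust
use std::collections::HashMap;

/// Map each word in the text to its index in a growing dictionary,
/// turning the input String into a Vec<usize>.
/// (Illustrative sketch; whitespace splitting is an assumption —
/// a real tokenizer might also handle punctuation and case.)
fn tokenize(text: &str) -> (Vec<String>, Vec<usize>) {
    let mut dictionary: Vec<String> = Vec::new();
    let mut index_of: HashMap<String, usize> = HashMap::new();
    let mut tokens: Vec<usize> = Vec::new();

    for word in text.split_whitespace() {
        // Reuse the existing index, or append the word and assign a new one.
        let idx = *index_of.entry(word.to_string()).or_insert_with(|| {
            dictionary.push(word.to_string());
            dictionary.len() - 1
        });
        tokens.push(idx);
    }
    (dictionary, tokens)
}

fn main() {
    let (dict, tokens) = tokenize("the cat sat on the mat");
    println!("{:?}", dict);   // ["the", "cat", "sat", "on", "mat"]
    println!("{:?}", tokens); // [0, 1, 2, 3, 0, 4]
}
```

Working over indices rather than strings keeps the transition counts cheap: the matrix $T$ can then be indexed directly by these `usize` token ids.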