Understanding Transformers via N-gram Statistics


Transformer-based large language models (LLMs) display extreme proficiency with language, yet a precise understanding of how they work remains elusive. One way of demystifying transformer predictions is to describe how they depend on their context in terms of simple template functions. This paper takes a first step in this direction by considering families of functions (i.e., rules) formed out of simple N-gram-based statistics of the training data. By studying how well these rulesets approximate transformer predictions, we obtain a variety of novel discoveries: a simple method to detect overfitting during training without using a holdout set, a quantitative measure of how transformers progress from learning simple to more complex statistical rules over the course of training, a model-variance criterion governing when transformer predictions tend to be described by N-gram rules, and insights into how well transformers can be approximated by N-gram rulesets in the limit where these rulesets become increasingly complex. In this latter direction, we find that for 79% and 68% of LLM next-token distributions on TinyStories and Wikipedia, respectively, the top-1 predictions agree with those provided by our N-gram rulesets.
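As a rough illustration of the kind of rule involved (a minimal sketch, not the authors' exact procedure), the Python snippet below builds N-gram count tables from a toy corpus and backs off from the longest matching context to produce a rule-based top-1 next-token prediction, which can then be compared against a model's top-1 prediction. The toy corpus and the `model_top1` argument are hypothetical placeholders; a real study would use the training data (e.g. TinyStories or Wikipedia) and an actual LLM.

```python
from collections import Counter, defaultdict

def build_ngram_table(tokens, n):
    """Count next-token frequencies for every (n-1)-token context."""
    table = defaultdict(Counter)
    for i in range(len(tokens) - n + 1):
        context = tuple(tokens[i : i + n - 1])
        table[context][tokens[i + n - 1]] += 1
    return table

def ngram_top1(tables, context):
    """A simple backoff rule: use the longest context seen in training
    and predict its most frequent continuation."""
    for n in sorted(tables, reverse=True):
        counts = tables[n].get(tuple(context[-(n - 1):]))
        if counts:
            return counts.most_common(1)[0][0]
    return None  # no matching context at any order

def top1_agreement(tables, model_top1, contexts):
    """Fraction of contexts where the N-gram rule's argmax matches the
    model's argmax next token (model_top1 is a placeholder stub)."""
    hits = sum(ngram_top1(tables, c) == model_top1(c) for c in contexts)
    return hits / len(contexts)

# Hypothetical toy corpus standing in for real training data.
corpus = "the cat sat on the mat and the cat sat on the rug".split()
tables = {n: build_ngram_table(corpus, n) for n in (2, 3)}
print(ngram_top1(tables, ["the", "cat"]))  # -> 'sat'
```

The backoff-to-longest-context rule above is just one example; the paper considers whole families of such N-gram rules, and its 79%/68% figures measure, in this spirit, how often a transformer's top-1 prediction agrees with one provided by the ruleset.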


Read more on: Transformers, N-gram Statistics

Related news:

Beyond transformers: Nvidia’s MambaVision aims to unlock faster, cheaper enterprise computer vision

Transformers Without Normalization

Six minutes of Transformers: Reactivate gameplay footage leaks online