Theoretical Analysis of Positional Encodings in Transformer Models: Impact on Expressiveness and Generalization

by Yin Li

Positional encodings are a core part of transformer-based models, enabling processing of sequential data without recurrence. This paper presents a theoretical framework to analyze how various positional encoding methods, including sinusoidal, learned, relative, and bias-based approaches such as Attention with Linear Biases (ALiBi), impact a transformer's expressiveness, generalization ability, and extrapolation to longer sequences. Expressiveness is defined via function approximation, generalization bounds are established using Rademacher complexity, and new encoding methods based on orthogonal functions, such as wavelets and Legendre polynomials, are proposed. The extrapolation capacity of existing and proposed encodings is analyzed, extending ALiBi's biasing approach to a unified theoretical context. Experimental evaluation on synthetic sequence-to-sequence tasks shows that orthogonal transform-based encodings outperform traditional sinusoidal encodings in generalization and extrapolation. This work addresses a critical gap in transformer theory, providing insights for design choices in natural language processing, computer vision, and other transformer applications.
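
For readers who want a concrete picture of the encodings the abstract compares, the following is a minimal NumPy sketch of the standard sinusoidal encoding, an ALiBi-style distance-proportional attention bias, and one possible Legendre-polynomial encoding. The function names are my own, only the first two follow the well-known published formulations, and legendre_encoding is a hypothetical illustration of the orthogonal-function idea, not necessarily the construction proposed in the paper.

import numpy as np
from numpy.polynomial import legendre


def sinusoidal_encoding(seq_len, d_model):
    # Fixed encoding from "Attention Is All You Need":
    # PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d)).
    assert d_model % 2 == 0, "assumes an even model dimension"
    positions = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    freqs = np.power(10000.0, np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / freqs)
    pe[:, 1::2] = np.cos(positions / freqs)
    return pe


def alibi_bias(seq_len, num_heads):
    # ALiBi adds a bias proportional to query-key distance to the attention
    # logits, with a geometric slope per head; shown here in a symmetric
    # (non-causal) form for brevity.
    slopes = 2.0 ** (-8.0 * np.arange(1, num_heads + 1) / num_heads)
    dist = np.abs(np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :])
    return -slopes[:, None, None] * dist[None, :, :]             # (heads, len, len)


def legendre_encoding(seq_len, d_model):
    # Hypothetical orthogonal-function encoding: evaluate the first d_model
    # Legendre polynomials at positions rescaled to [-1, 1]. Illustrative only.
    x = np.linspace(-1.0, 1.0, seq_len)
    cols = [legendre.Legendre.basis(k)(x) for k in range(d_model)]
    return np.stack(cols, axis=1)                                # (seq_len, d_model)

Under these illustrative constructions, sinusoidal_encoding(128, 64) and legendre_encoding(128, 64) each return a (128, 64) matrix added to the token embeddings, while alibi_bias(128, 8) returns an (8, 128, 128) per-head matrix added to the attention logits, which is how ALiBi sidesteps modifying the embeddings at all.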

Read more on:

transformer models

theoretical analysis

positional encodings

Related news:

Music recommendation system using transformer models

Falcon Mamba 7B’s powerful new AI architecture offers alternative to transformer models

Etched looks to challenge Nvidia with an ASIC purpose-built for transformer models