
Harnessing the Universal Geometry of Embeddings


We introduce the first method for translating text embeddings from one vector space to another without any paired data, encoders, or predefined sets of matches. Our unsupervised approach translates any embedding to and from a universal latent representation (i.e., a universal semantic structure conjectured by the Platonic Representation Hypothesis). Our translations achieve high cosine similarity across model pairs with different architectures, parameter counts, and training datasets. The ability to translate unknown embeddings into a different space while preserving their geometry has serious implications for the security of vector databases. An adversary with access only to embedding vectors can extract sensitive information about the underlying documents, sufficient for classification and attribute inference.
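The abstract does not describe the architecture or training objective, but the translation pipeline it implies can be sketched. Below is a minimal, hypothetical illustration in PyTorch: each embedding space gets a small adapter that maps into and out of a shared latent space, and translated embeddings are scored against the target model's true embeddings with cosine similarity, as the paper's evaluation describes. The class and function names (`Adapter`, `translate`, `mean_cosine`) and the dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Adapter(nn.Module):
    """Hypothetical adapter mapping one model's embeddings to and from
    a shared latent space. The real method would train such mappings
    without paired data (the abstract does not specify the losses)."""
    def __init__(self, embed_dim: int, latent_dim: int):
        super().__init__()
        self.to_latent = nn.Sequential(
            nn.Linear(embed_dim, latent_dim), nn.SiLU(),
            nn.Linear(latent_dim, latent_dim),
        )
        self.from_latent = nn.Sequential(
            nn.Linear(latent_dim, latent_dim), nn.SiLU(),
            nn.Linear(latent_dim, embed_dim),
        )

def translate(x_a: torch.Tensor, adapter_a: Adapter, adapter_b: Adapter) -> torch.Tensor:
    """Translate embeddings from space A to space B via the shared latent."""
    latent = adapter_a.to_latent(x_a)
    return adapter_b.from_latent(latent)

def mean_cosine(translated: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Evaluation as in the abstract: cosine similarity between translated
    embeddings and the target model's embeddings of the same texts."""
    return F.cosine_similarity(translated, target, dim=-1).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    a = Adapter(embed_dim=768, latent_dim=256)   # e.g. a BERT-sized source model
    b = Adapter(embed_dim=1024, latent_dim=256)  # e.g. a larger target model
    x_a = torch.randn(32, 768)                   # embeddings from model A only
    x_b_hat = translate(x_a, a, b)               # (32, 1024) vectors in B's space
```

The security implication follows from this picture: if an adversary can map unknown vectors from a compromised vector database into the space of a model they control, they can, for example, compare the translated vectors against embeddings of candidate attribute strings to perform the zero-shot classification and attribute inference the abstract warns about.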

