Starting from scratch: Training a 30M Topological Transformer
uformer is a topological transformer (see the paper) that replaces dot-product attention with a scalar derived from a graph Laplacian ("taumode"), computed per token and per head, and then attends using distances in that scalar space. Below is a post-style overview of the idea and the first training signals from a 30M-parameter run.
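To make the attention step concrete, here is a minimal NumPy sketch of distance-based attention over a per-token scalar. The Laplacian derivation of taumode is not specified here, so `w_tau` is a hypothetical linear projection standing in for it; only the attend-by-distance step reflects the description above.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def taumode_attention(x, w_tau, w_v, scale=1.0):
    """Attention via distances in a per-token scalar space.

    x:     (seq, d_model) token embeddings
    w_tau: (d_model,) hypothetical projection yielding one scalar per token
           (a stand-in for the Laplacian-derived taumode)
    w_v:   (d_model, d_model) value projection
    """
    tau = x @ w_tau                              # (seq,) one scalar per token
    dist = np.abs(tau[:, None] - tau[None, :])   # pairwise |tau_i - tau_j|
    attn = softmax(-dist / scale, axis=-1)       # nearer in tau-space -> more weight
    return attn @ (x @ w_v)

x = np.random.default_rng(0).normal(size=(5, 8))
w_tau = np.random.default_rng(1).normal(size=8)
out = taumode_attention(x, w_tau, np.eye(8))
```

Per head, this replaces the `(seq, seq)` query-key dot products with a single scalar per token plus a pairwise distance, which is the core substitution the post describes.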