
Alice's Adventures in a Differentiable Wonderland


Neural networks surround us, in the form of large language models, speech transcription systems, molecular discovery algorithms, robotics, and much more. Stripped of anything else, neural networks are compositions of differentiable primitives, and studying them means learning how to program and how to interact with these models, a particular example of what is called differentiable programming. This primer is an introduction to this fascinating field imagined for someone, like Alice, who has just ventured into this strange differentiable wonderland. I overview the basics of optimizing a function via automatic differentiation, and a selection of the most common designs for handling sequences, graphs, text, and audio. The focus is on an intuitive, self-contained introduction to the most important design techniques, including convolutional, attentional, and recurrent blocks, hoping to bridge the gap between theory and code (PyTorch and JAX) and leaving the reader capable of understanding some of the most advanced models out there, such as large language models (LLMs) and multimodal architectures.
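As a flavor of what "optimizing a function via automatic differentiation" means in practice, here is a minimal sketch in JAX (my own illustration, not taken from the book): gradient descent on a simple quadratic, where the gradient is obtained automatically with `jax.grad` rather than derived by hand.

```python
import jax

def f(x):
    # A toy differentiable objective: minimized at x = 3.
    return (x - 3.0) ** 2

# Automatic differentiation: grad_f(x) computes df/dx = 2 * (x - 3).
grad_f = jax.grad(f)

x = 0.0
for _ in range(100):
    # Plain gradient descent step with a fixed learning rate.
    x = x - 0.1 * grad_f(x)

# x has converged close to the minimizer 3.0.
```

The same pattern, a differentiable function plus a gradient-based update loop, underlies training of the much larger compositions of primitives discussed in the text.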

Alice's Adventures in a Differentiable Wonderland -- Volume I, A Tour of the Land, by Simone Scardapane.
