Micrograd.jl


A series on automatic differentiation in Julia. Part 1 provides an overview and defines explicit chain rules.
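
As a taste of what an explicit chain rule looks like, here is a hand-written rule for `sin` in the pullback style common in Julia autodiff. This is a minimal sketch; `sin_pullback` is an illustrative name, not an API from the series:

```julia
# An explicit chain rule for sin, written as a pullback: the forward pass
# returns the value together with a closure that maps the output
# cotangent ȳ to the input cotangent x̄ = cos(x) * ȳ.
function sin_pullback(x)
    y = sin(x)
    pullback(ȳ) = cos(x) * ȳ
    return y, pullback
end

y, back = sin_pullback(0.5)
back(1.0)   # ≈ cos(0.5), the derivative of sin at x = 0.5
```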

In the backward pass, the result is compared to a ground-truth sample and the error is backpropagated through the model, from the final layers back to the start.

[Figure: probability boundaries of a multi-layer perceptron trained on the moons dataset with MicroGrad.jl.]

Andrej Karpathy made an excellent video in which he built a minimal automatic differentiation module called Micrograd in Python. This is a useful example because (1) we start with a target curve and so have ground-truth values to compare against, and (2) the problem can be solved analytically, without gradients.
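To make the backward pass concrete, here is a minimal sketch of Micrograd-style reverse-mode autodiff on scalars in Julia. It is illustrative only: the `Value` type and `backward!` function are hypothetical names for this sketch, not MicroGrad.jl's actual API.

```julia
# Minimal sketch of reverse-mode autodiff on scalars (Micrograd-style).
# `Value` and `backward!` are illustrative names, not MicroGrad.jl's API.

mutable struct Value
    data::Float64
    grad::Float64
    backward::Function        # applies this node's local chain rule
    parents::Vector{Value}
end

Value(x::Real) = Value(float(x), 0.0, () -> nothing, Value[])

function Base.:+(a::Value, b::Value)
    out = Value(a.data + b.data, 0.0, () -> nothing, [a, b])
    out.backward = () -> begin
        a.grad += out.grad            # ∂(a+b)/∂a = 1
        b.grad += out.grad            # ∂(a+b)/∂b = 1
    end
    return out
end

function Base.:*(a::Value, b::Value)
    out = Value(a.data * b.data, 0.0, () -> nothing, [a, b])
    out.backward = () -> begin
        a.grad += b.data * out.grad   # ∂(a*b)/∂a = b
        b.grad += a.data * out.grad   # ∂(a*b)/∂b = a
    end
    return out
end

# Seed the output gradient, then call each node's backward rule in
# reverse topological order, so the error flows from the final node
# back to the inputs.
function backward!(root::Value)
    order, seen = Value[], Set{Value}()
    function visit(v::Value)
        v in seen && return
        push!(seen, v)
        foreach(visit, v.parents)
        push!(order, v)
    end
    visit(root)
    root.grad = 1.0
    foreach(v -> v.backward(), reverse(order))
end
```

For example, for f(x, y) = x*y + x we expect ∂f/∂x = y + 1 and ∂f/∂y = x:

```julia
x, y = Value(3.0), Value(2.0)
f = x * y + x
backward!(f)
(x.grad, y.grad)   # (3.0, 3.0)
```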
