tiny-tpu: A minimal tensor processing unit (TPU), inspired by Google's TPU V2 and V1 (tiny-tpu-v2/tiny-tpu)
Performs element-wise operations after the systolic array.

- Control: module selection depends on the computation stage.
- Modules (pipelined; see the numerical sketch below):
  - Bias addition
  - Leaky ReLU activation function
  - MSE loss
  - Leaky ReLU derivative

Dual-port memory for storing intermediate values.

- Stored data:
  - Input matrices
  - Weight matrices
  - Bias vectors
  - Post-activation values for backpropagation
  - Activation leak factors
  - Inverse batch size constant for MSE backpropagation

We want this resource to be the ultimate guide to breaking into building chip accelerators for all levels of technical expertise, even if you just learned high school math and only know y = mx + b.
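In that spirit, here is a minimal NumPy sketch of the arithmetic these element-wise stages perform. It is an illustrative software model, not the project's hardware implementation: the function names, the leak factor value, and the batch-mean normalization of the MSE are assumptions, not details taken from the repo.

```python
import numpy as np

LEAK = 0.01  # assumed value for the activation leak factor

def bias_add(x, b):
    """Bias addition: add a bias vector to each row of the matmul output."""
    return x + b

def leaky_relu(x, leak=LEAK):
    """Leaky ReLU: pass positive values, scale negative values by the leak factor."""
    return np.where(x > 0, x, leak * x)

def leaky_relu_derivative(x, leak=LEAK):
    """Leaky ReLU derivative, used during backpropagation."""
    return np.where(x > 0, 1.0, leak)

def mse_loss(y_pred, y_true):
    """MSE loss; normalizing by batch size is an assumed convention."""
    inv_batch = 1.0 / y_pred.shape[0]  # the 'inverse batch size constant'
    return inv_batch * np.sum((y_pred - y_true) ** 2)

def mse_grad(y_pred, y_true):
    """Gradient of the MSE above w.r.t. predictions; reuses the same constant."""
    inv_batch = 1.0 / y_pred.shape[0]
    return 2.0 * inv_batch * (y_pred - y_true)

# Example: one forward pass over the systolic array's matmul output.
x = np.array([[0.5, -1.0], [2.0, -0.25]])  # stand-in for systolic array output
b = np.array([0.1, 0.2])                    # bias vector
a = leaky_relu(bias_add(x, b))
loss = mse_loss(a, np.ones_like(a))
grad = mse_grad(a, np.ones_like(a)) * leaky_relu_derivative(bias_add(x, b))
```

Precomputing the inverse batch size as a constant mirrors a common hardware trade-off: multiplication by a stored reciprocal avoids implementing division in the datapath.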