Tiny-tpu: A minimal tensor processing unit (TPU), inspired by Google's TPU


A minimal tensor processing unit (TPU), inspired by Google's TPU V2 and V1 - tiny-tpu-v2/tiny-tpu

Vector processing unit (VPU): performs element-wise operations after the systolic array. Control: module selection depends on the computation stage. Pipelined modules: bias addition, Leaky ReLU activation, MSE loss, and the Leaky ReLU derivative (a behavioral sketch of these operations follows below).

Unified buffer: dual-port memory for storing intermediate values. Stored data: input matrices, weight matrices, bias vectors, post-activation values for backpropagation, activation leak factors, and the inverse batch size constant for MSE backpropagation.

We want this resource to be the ultimate guide to breaking into building chip accelerators for all levels of technical expertise, even if you just learned high school math and only know y = mx + b.
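To make the element-wise stage concrete, here is a minimal NumPy sketch of the four pipelined operations and the backpropagation constants mentioned above. The function names, the 0.01 leak factor, the matrix shapes, and the way the inverse batch size is applied are illustrative assumptions, not the repo's actual RTL or interfaces.

```python
import numpy as np

# Behavioral sketch of the element-wise stage (assumed names and values;
# the real design is pipelined hardware, not NumPy).

LEAK = 0.01  # assumed activation leak factor

def bias_add(x, b):
    # Add a bias vector to each row of the systolic-array output.
    return x + b

def leaky_relu(x, leak=LEAK):
    return np.where(x > 0, x, leak * x)

def leaky_relu_derivative(x, leak=LEAK):
    # Backward-pass module; consumes values kept around for backpropagation.
    return np.where(x > 0, 1.0, leak)

def mse_loss_grad(pred, target, inv_batch_size):
    # Gradient of mean-squared-error loss. Multiplying by a stored
    # inverse-batch-size constant avoids a hardware divide.
    return 2.0 * (pred - target) * inv_batch_size

# Forward pass for one layer: matmul (systolic array) -> bias -> activation.
x = np.array([[1.0, -2.0], [0.5, 3.0]])   # input matrix (illustrative)
w = np.array([[0.2, -0.4], [0.7, 0.1]])   # weight matrix
b = np.array([0.1, -0.1])                 # bias vector

pre_act = bias_add(x @ w, b)              # x @ w is the systolic array's job
a = leaky_relu(pre_act)

# Backward pass, chaining the MSE gradient through the activation.
target = np.zeros_like(a)
grad = mse_loss_grad(a, target, inv_batch_size=1.0 / x.shape[0])
grad_pre_act = grad * leaky_relu_derivative(pre_act)
```

In the hardware these four operations are separate pipelined modules, and the control logic picks which one runs at each point in training, which is what "module selection depends on the computation stage" refers to; storing an inverse batch size constant likewise reflects that a multiply is much cheaper in hardware than a divide.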
