
gpu.cpp: A lightweight library for portable low-level GPU computation


A lightweight library for portable low-level GPU computation using WebGPU (GitHub: AnswerDotAI/gpu.cpp).

CUDA has been dominant for large-scale training and inference, but at the other end of the spectrum - GPU compute on personal devices - there is far more heterogeneity in the hardware and software stack. gpu.cpp lets us implement and drop in any algorithm with fine-grained control of data movement and GPU code, and explore beyond the boundaries of what existing production-oriented inference runtimes support. At the same time, we can write code that is portable and immediately usable across a wide variety of GPU vendors and compute form factors - workstations, laptops, mobile, or even emerging hardware platforms such as AR/VR and robotics.
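To make the "fine-grained control of data movement and GPU code" concrete, here is a minimal sketch of what a gpu.cpp program looks like: a WGSL compute shader embedded as a string, dispatched from C++ over WebGPU. The API names used here (gpu.h, createContext, createTensor, createKernel, dispatchKernel, toCPU, Shape, kf32, Bindings, cdiv) follow the project's README as I recall it and should be treated as assumptions; exact signatures may differ between versions.

```cpp
// Sketch: double every element of an array on the GPU via a WGSL kernel.
#include <array>
#include <cstdio>
#include <future>

#include "gpu.h"  // gpu.cpp entry point (assumed header name)

using namespace gpu;

// WGSL is the portable shader language WebGPU lowers to each backend
// (Vulkan, Metal, DX12, or a browser), which is what makes this portable.
static const char *kScale = R"(
@group(0) @binding(0) var<storage, read_write> inp: array<f32>;
@group(0) @binding(1) var<storage, read_write> out: array<f32>;
@compute @workgroup_size(256)
fn main(@builtin(global_invocation_id) gid: vec3<u32>) {
  let i: u32 = gid.x;
  if (i < arrayLength(&inp)) {
    out[i] = 2.0 * inp[i];
  }
}
)";

int main() {
  constexpr size_t N = 4096;
  std::array<float, N> in{}, out{};
  for (size_t i = 0; i < N; ++i) in[i] = static_cast<float>(i);

  Context ctx = createContext();                      // WebGPU adapter/device setup
  Tensor x = createTensor(ctx, Shape{N}, kf32, in.data());
  Tensor y = createTensor(ctx, Shape{N}, kf32);

  Kernel op = createKernel(ctx, {kScale, 256, kf32},  // shader + workgroup size
                           Bindings{x, y},
                           {cdiv(N, 256), 1, 1});     // dispatch grid

  std::promise<void> done;
  std::future<void> finished = done.get_future();
  dispatchKernel(ctx, op, done);
  wait(ctx, finished);                                // block until the GPU is done
  toCPU(ctx, y, out.data(), sizeof(out));             // copy the result back

  printf("out[3] = %.1f\n", out[3]);                  // expect 6.0
  return 0;
}
```

The appeal of this style is that both the data movement (createTensor, toCPU) and the GPU code (the WGSL string) are explicit in one small C++ file, rather than hidden behind a production inference runtime.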
