The AI boom could soon send GPU prices soaring, so now's a good time to buy one
Customer buys RTX 5080 from Best Buy but gets rocks instead — $1,200 GPU arrived in tampered box with broken seal
Client-side GPU load balancing with Redis and Lua
Show HN: RunMat – runtime with auto CPU/GPU routing for dense math
AMD GPU Managed Memory Support Merged For The GCC 16 Compiler
Chinese startup founded by Google engineer claims to have developed its own TPU chip for AI — custom ASIC reportedly 1.5 times faster than Nvidia's A100 GPU from 2020, 42% more efficient
Nvidia reportedly no longer supplying VRAM to its GPU board partners in response to memory crunch — rumor claims vendors will only get the die, forced to source memory on their own
GPU prices are coming to earth just as RAM costs shoot into the stratosphere
ScaleOps' new AI Infra Product slashes GPU costs for self-hosted enterprise LLMs by 50% for early adopters
GPU depreciation could be the next big crisis coming for AI hyperscalers — after spending billions on buildouts, next-gen upgrades may amplify cashflow quirks
Racing karts on a Rust GPU kernel driver
OrthoRoute – GPU-accelerated autorouting for KiCad
Luminal raises $5.3 million to build a better GPU code framework
GPU goliaths are devouring supercomputing – and legacy storage can't feed the beast
The question everyone in AI is asking: How long before a GPU depreciates?
Simulating a Planet on the GPU: Part 1 (2022)
Text rendering and effects using GPU-computed distances
Raycore: GPU accelerated and modular ray intersections
Show HN: OSS implementation of Test Time Diffusion that runs on a 24GB GPU