How Neural Super Sampling Works: Architecture, Training, and Inference
AI-powered upscaling for mobile gaming.
Training is done in PyTorch using well-established practices, including the Adam optimizer, a cosine annealing learning-rate schedule, and standard data augmentation strategies. Quality is evaluated with standard image metrics: PSNR (Peak Signal-to-Noise Ratio), SSIM (Structural Similarity Index), and FLIP, a rendering-focused perceptual error metric. Unlike non-neural approaches such as Arm ASR or AMD's FSR 2, NSS also handles particle effects without needing a reactive mask.
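The Adam-plus-cosine-annealing setup mentioned above is a common PyTorch pattern; a minimal sketch looks like the following. The tiny convolutional model, the random stand-in tensors, and the L1 loss are illustrative assumptions only, not the actual NSS network or training loss.

```python
import torch

# Hypothetical stand-in for the upscaling network; the real NSS
# architecture is not described in this excerpt.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(32, 3, 3, padding=1),
)

# Adam optimizer with a cosine annealing learning-rate schedule,
# as named in the article (hyperparameters here are placeholders).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)

for epoch in range(10):
    x = torch.randn(4, 3, 64, 64)       # stand-in input batch
    target = torch.randn(4, 3, 64, 64)  # stand-in reference frames
    pred = model(x)
    loss = torch.nn.functional.l1_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # decays lr along a cosine curve toward zero
```

In a real pipeline the random tensors would be replaced by rendered low-resolution frames and high-resolution references, with the augmentations applied in the data loader.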