HyAB k-means for color quantization
July 9th, 2025

I’ve been obsessing over color quantization algorithms lately, and when I learned that an image conversion app called Composite did its so-called “pixel mapping” step in CIELAB space, I instantly thought of the HyAB color distance formula I’d seen in the FLIP error metric paper from 2020. By “pixel mapping” I mean choosing, for each pixel, the closest color in a fixed palette.
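To make the idea concrete, here is a minimal sketch of the HyAB distance and the pixel-mapping step, assuming pixels and palette are already converted to CIELAB and stored as NumPy arrays. HyAB, as defined in the paper, is a city-block distance in L* plus a Euclidean distance in a*b*. The function names here are my own, not from any library:

```python
import numpy as np

def hyab(p, q):
    # HyAB distance between two CIELAB colors:
    # |dL*| + sqrt(da*^2 + db*^2)
    dL = abs(p[0] - q[0])
    dab = np.hypot(p[1] - q[1], p[2] - q[2])
    return dL + dab

def map_pixels(pixels_lab, palette_lab):
    # For each Lab pixel (shape (N, 3)), return the index of the
    # nearest palette color (shape (K, 3)) under the HyAB distance.
    dL = np.abs(pixels_lab[:, None, 0] - palette_lab[None, :, 0])
    dab = np.linalg.norm(pixels_lab[:, None, 1:] - palette_lab[None, :, 1:], axis=-1)
    return np.argmin(dL + dab, axis=1)
```

For example, `hyab((50, 0, 0), (60, 3, 4))` is 10 + 5 = 15: a 3-4-5 triangle in a*b* on top of a 10-unit lightness step.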
The formula’s authors ran an experiment with 17 participants, and the results suggest HyAB is more faithful to large observed color differences than a Euclidean distance or the CIEDE2000 formula from 2001. Below is a quick comparison of 16-color paletted versions of an input image, as produced by Pillow’s quantize(), libimagequant, and my implementation (GitHub, local copy) of modified k-means in CIELAB space with the HyAB formula plugged in. I’ll also cover practical matters such as choosing the number of colors automatically, hiding banding, adding a small contrast boost, and doing correct linear-space downscaling.
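The core loop of a HyAB-flavored k-means can be sketched as follows. This is an assumption-laden simplification, not the implementation linked above: the assignment step uses HyAB, while the update step just takes the per-cluster mean, which is the standard k-means update rather than the exact minimizer of HyAB. All names are hypothetical:

```python
import numpy as np

def hyab_matrix(pixels, centers):
    # Pairwise HyAB distances between (N, 3) pixels and (K, 3) centers,
    # both in CIELAB: |dL*| + Euclidean distance in a*b*.
    dL = np.abs(pixels[:, None, 0] - centers[None, :, 0])
    dab = np.linalg.norm(pixels[:, None, 1:] - centers[None, :, 1:], axis=-1)
    return dL + dab

def kmeans_hyab(pixels_lab, k, iters=20, seed=0):
    # Plain Lloyd-style k-means with HyAB used in the assignment step.
    rng = np.random.default_rng(seed)
    centers = pixels_lab[rng.choice(len(pixels_lab), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(hyab_matrix(pixels_lab, centers), axis=1)
        for j in range(k):
            members = pixels_lab[labels == j]
            if len(members):  # keep empty clusters where they are
                centers[j] = members.mean(axis=0)
    # Recompute labels against the final centers.
    labels = np.argmin(hyab_matrix(pixels_lab, centers), axis=1)
    return centers, labels
```

The resulting `centers` array is the palette; `labels` is the pixel mapping. In a real quantizer you would convert sRGB to Lab first (e.g. with scikit-image’s rgb2lab) and back after.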