Optimizing a 6502 image decoder, from 70 minutes to 1 minute
A little walkthrough of the high-level optimization of an image decoder algorithm, to make it more digestible for a 6502 processor.
When I set out to write a program that would allow me to do basic digital photography on the Apple II, I decided I would do it with the QuickTake cameras. Getting to where I am now took, I think, five or six deep dives, each costing a week or two of late evenings and full weekends dedicated to making progress, wading through hundreds or thousands of debug printf()s, gdb sessions, and comparisons of variables and offsets. Even then, I figured I still had extra processing I didn’t need (and I dropped the #ifdef COLOR conditional to make things clearer).
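As a hypothetical sketch of that last simplification (the function names below are invented for illustration, not taken from the actual QuickTake decoder), dropping a compile-time conditional collapses two code paths into one:

```c
#include <stdint.h>

/* Stand-in for the real per-sample transform. */
static uint8_t decode_sample(uint8_t raw)
{
    return raw;
}

/* Before: every call site carried a compile-time branch,
 * even though only one path is ever used on this target. */
uint8_t decode_pixel_old(uint8_t raw)
{
#ifdef COLOR
    return decode_sample(raw);   /* color path, unused here */
#else
    return decode_sample(raw);   /* grayscale path */
#endif
}

/* After: with #ifdef COLOR dropped, one straight path remains,
 * which is easier to read and to optimize for the 6502. */
uint8_t decode_pixel(uint8_t raw)
{
    return decode_sample(raw);
}
```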