Optimizing a 6502 image decoder, from 70 minutes to 1 minute


A little walkthrough of the high-level optimization of an image decoder algorithm, done to make it more digestible for a 6502 processor.

When I set out to write a program that would allow me to do basic digital photography on the Apple II, I decided I would do it with the Quicktake cameras. Getting to where I am now took, I think, five or six deep dives, each one costing one or two weeks' worth of late evenings and full weekends dedicated to making progress: wading through hundreds or thousands of debug printf()s, gdb'ing, comparing variables and offsets, and so on. I figured I still had extra processing I didn't need (and I dropped the #ifdef COLOR conditional to make things clearer).
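For context, the #ifdef COLOR conditional mentioned above is the kind of compile-time switch sketched below. This is a hypothetical illustration rather than the author's actual decoder: the function name emit_row, the WIDTH constant, and the chroma blend are invented to show how the color path can be compiled out entirely, leaving a leaner grayscale build for the 6502.

```c
#include <stdint.h>

#define WIDTH 640   /* hypothetical row width */

/* Copy one decoded row to the output buffer.
 * With COLOR undefined, the compiler never sees the chroma work,
 * so the grayscale build carries no per-pixel color cost. */
void emit_row(const uint8_t *luma, uint8_t *out)
{
    uint16_t x;   /* 16-bit index: cheap on the 6502 */

    for (x = 0; x < WIDTH; x++) {
#ifdef COLOR
        /* Hypothetical chroma step: blend with the previous sample,
         * standing in for the real (more expensive) color reconstruction. */
        out[x] = (uint8_t)((luma[x] + luma[x ? x - 1 : 0]) >> 1);
#else
        /* Grayscale path: pass the luma sample straight through. */
        out[x] = luma[x];
#endif
    }
}
```

With a cc65-style toolchain, building with something like `cl65 -DCOLOR ...` would include the color branch, while the default build omits it altogether.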
