Optimizing Large-Scale OpenStreetMap Data with SQLite
The initial SQLite database was enormous, around 100 gigabytes for the United States alone, so I had to work out which data was actually essential and how to make searches fast. My first attempt used GZIP compression via Go's standard library, but it proved too slow: serving a random read meant decompressing large portions of the file. Further research led me to Facebook's Zstandard (ZSTD), which offers a seekable format well suited to random-access reads.
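To make the seekable approach concrete, here is a minimal sketch of the underlying idea: compress the file as a series of independent zstd frames of a fixed uncompressed size, keep a small index of frame offsets, and decompress only the frame that covers a requested byte range. It uses github.com/klauspost/compress/zstd for illustration rather than the official seekable-format implementation (which stores its seek table in a skippable frame), and the chunk size, index layout, and helper names are assumptions for the example, not the code from my project.

```go
package main

import (
	"os"

	"github.com/klauspost/compress/zstd"
)

const chunkSize = 1 << 20 // 1 MiB of uncompressed data per frame (illustrative)

// frameIndex records where each compressed frame lives in the output file.
type frameIndex struct {
	offsets []int64 // byte offset of frame i in the compressed file
	sizes   []int64 // compressed size of frame i
}

// compressSeekable writes src as a series of independent zstd frames and
// returns the index needed to locate them later.
func compressSeekable(src []byte, out *os.File) (*frameIndex, error) {
	enc, err := zstd.NewWriter(nil)
	if err != nil {
		return nil, err
	}
	defer enc.Close()

	idx := &frameIndex{}
	var written int64
	for start := 0; start < len(src); start += chunkSize {
		end := start + chunkSize
		if end > len(src) {
			end = len(src)
		}
		// EncodeAll produces one self-contained frame per chunk.
		frame := enc.EncodeAll(src[start:end], nil)
		if _, err := out.Write(frame); err != nil {
			return nil, err
		}
		idx.offsets = append(idx.offsets, written)
		idx.sizes = append(idx.sizes, int64(len(frame)))
		written += int64(len(frame))
	}
	return idx, nil
}

// readAt serves a random read by decompressing only the frame that holds the
// requested uncompressed offset. (A read spanning a frame boundary would need
// to loop over frames; omitted here for brevity.)
func readAt(f *os.File, idx *frameIndex, off int64, n int) ([]byte, error) {
	dec, err := zstd.NewReader(nil)
	if err != nil {
		return nil, err
	}
	defer dec.Close()

	frameNo := off / chunkSize
	compressed := make([]byte, idx.sizes[frameNo])
	if _, err := f.ReadAt(compressed, idx.offsets[frameNo]); err != nil {
		return nil, err
	}
	plain, err := dec.DecodeAll(compressed, nil)
	if err != nil {
		return nil, err
	}
	within := off % chunkSize
	end := within + int64(n)
	if end > int64(len(plain)) {
		end = int64(len(plain))
	}
	return plain[within:end], nil
}
```

The payoff is that a random page read costs one ReadAt plus the decompression of a single small frame, instead of decompressing everything up to the requested offset, which is exactly what made the GZIP approach so slow.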