Optimizing Large-Scale OpenStreetMap Data with SQLite


From the personal website of JT Archie, which includes a blog, work history, and projects they have worked on.

The initial SQLite database was enormous, around 100 gigabytes for the United States alone, which made it necessary to determine which data was essential and how to optimize searches. At first, I used GZIP compression via Go's built-in functionality, but it proved too slow because random reads required decompressing large portions of the file. Further research led me to Facebook's Zstandard (ZSTD) compression, which supports a seekable format suitable for random-access reads.
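
To illustrate the idea behind the seekable format, here is a minimal sketch in Go: the input is split into fixed-size chunks, each chunk is compressed as an independent zstd frame, and an index of frame offsets makes it possible to decompress only the frames covering a requested byte range. The use of the github.com/klauspost/compress/zstd package here is an assumption (the post does not name a specific library at this point), and the real seekable format additionally stores its index in a trailing skippable frame so any conforming reader can locate it; this sketch keeps the index in memory for brevity.

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/klauspost/compress/zstd"
)

// frame records where one independently compressed chunk lives in the
// compressed stream and where its bytes came from in the original data.
type frame struct {
	compOff   int // offset of the frame within the compressed stream
	compLen   int // compressed length of the frame
	uncompOff int // offset of the chunk within the original data
	uncompLen int // uncompressed length of the chunk
}

const chunkSize = 1 << 20 // 1 MiB of uncompressed data per frame (illustrative)

// compressSeekable splits data into fixed-size chunks and compresses each
// chunk as an independent zstd frame, returning the concatenated frames and
// an index of where each frame starts.
func compressSeekable(enc *zstd.Encoder, data []byte) ([]byte, []frame) {
	var out []byte
	var index []frame
	for off := 0; off < len(data); off += chunkSize {
		end := off + chunkSize
		if end > len(data) {
			end = len(data)
		}
		start := len(out)
		out = enc.EncodeAll(data[off:end], out) // append one independent frame
		index = append(index, frame{
			compOff:   start,
			compLen:   len(out) - start,
			uncompOff: off,
			uncompLen: end - off,
		})
	}
	return out, index
}

// readAt returns length bytes starting at off, decompressing only the frames
// that overlap the requested range instead of the whole file.
func readAt(dec *zstd.Decoder, compressed []byte, index []frame, off, length int) ([]byte, error) {
	var result []byte
	for _, f := range index {
		if f.uncompOff+f.uncompLen <= off || f.uncompOff >= off+length {
			continue // frame does not overlap the requested range
		}
		chunk, err := dec.DecodeAll(compressed[f.compOff:f.compOff+f.compLen], nil)
		if err != nil {
			return nil, err
		}
		// Trim the decompressed chunk to the overlapping portion of the request.
		lo := off - f.uncompOff
		if lo < 0 {
			lo = 0
		}
		hi := off + length - f.uncompOff
		if hi > f.uncompLen {
			hi = f.uncompLen
		}
		result = append(result, chunk[lo:hi]...)
	}
	return result, nil
}

func main() {
	enc, _ := zstd.NewWriter(nil) // nil writer/reader: only EncodeAll/DecodeAll are used
	dec, _ := zstd.NewReader(nil)

	// Hypothetical payload standing in for a large SQLite database file.
	data := bytes.Repeat([]byte("openstreetmap "), 200_000)

	compressed, index := compressSeekable(enc, data)
	got, err := readAt(dec, compressed, index, 1_500_000, 28)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%d frames, %d compressed bytes, read %q\n", len(index), len(compressed), got)
}
```

Decompressing a single 1 MiB frame per lookup is what keeps random reads cheap, compared with gunzipping a large prefix of the file just to reach one page.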
