Running local models on an M4 with 24GB memory


Experiments with getting usable outputs from local models on a standard MacBook.
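Getting usable output from local models on a 24GB machine usually means running the model through a local server and keeping the context window modest so the weights and KV cache fit in unified memory. As a minimal, hypothetical sketch, here is how a request to Ollama's local HTTP generate endpoint might be built; the model name and context size are illustrative, not prescriptions from the article.

```python
import json

# Ollama's default local endpoint (assumption: a server is running locally).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str, num_ctx: int = 4096) -> dict:
    """Build a non-streaming generate request.

    On a memory-constrained Mac, a smaller num_ctx keeps the KV cache
    from crowding out the model weights in unified memory.
    """
    return {
        "model": model,        # example model tag, e.g. an 8B quantized model
        "prompt": prompt,
        "stream": False,       # return one JSON response instead of chunks
        "options": {"num_ctx": num_ctx},
    }

# Serialize the payload as it would be POSTed to OLLAMA_URL.
payload = json.dumps(build_request("llama3.1:8b", "Summarize unified memory."))
```

Sending the payload (e.g. with `urllib.request` or `requests`) returns a JSON object whose `response` field holds the generated text.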

