Local models

Running local models on Macs gets faster with Ollama's MLX support | Apple Silicon Macs get a performance boost thanks to better unified memory usage.

Speechify’s Windows app uses local models for transcription and dictation