Ollama Now Runs Faster on Macs Thanks to Apple's MLX Framework
Ollama, the popular app for running AI models locally on a computer, has released an update that takes advantage of Apple's own machine learning framework, MLX. The result is a hefty speed boost on Macs with Apple silicon. According to Ollama, the new version processes prompts around 1.6 times faster (prefill speed) and nearly doubles the speed at which it generates responses (decode speed).
Or read this on MacRumors
