
This is the fastest local AI I've tried, and it's not even close - how to get it


I've been on the hunt for the quickest model I can find, and I believe gpt-oss:20b might just be it.

I've tried quite a few local LLMs, using Ollama on both Linux and macOS, and I recently ran into one that blew all the others away in terms of speed.

What you'll need: To make this work, you'll need either a running installation of Ollama (it doesn't matter which desktop OS you're using) or a fresh install. To update Ollama on either macOS or Windows, simply download the binary installer, launch it, and follow the steps in the wizard.
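The setup flow above can be sketched as a short shell snippet. This is a minimal example, assuming a desktop with the Ollama CLI on your PATH; the one-line install script shown in the comment is Ollama's official Linux installer (macOS and Windows users would use the binary installer instead, as described above). The model name `gpt-oss:20b` is the one this article covers.

```shell
# Check whether Ollama is already installed; if so, print its version.
# (On Linux, re-running the official install script also updates an
# existing installation: curl -fsSL https://ollama.com/install.sh | sh)
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "Ollama not found; install it from ollama.com first."
fi

# Once Ollama is installed and running, pull and chat with the model:
#   ollama pull gpt-oss:20b
#   ollama run gpt-oss:20b
```

The `pull` step downloads the model weights once; after that, `ollama run gpt-oss:20b` starts an interactive session against the local copy.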

Or read this on ZDNet
