I Run LLMs Locally


I document how I run Large Language Models locally.

Open WebUI is a frontend that offers a familiar chat interface for text and image input; it communicates with an Ollama back-end and streams the output back to the user. I haven't fine-tuned or quantized any models on my machine yet, as my Intel CPU may have a manufacturing defect, so I don't want to push it to high temperatures for long durations during training.

I strive to write low-frequency, high-quality content on Health, Product Development, Programming, Software Engineering, DIY, Security, Philosophy, and other interests.
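To illustrate the streaming path between a frontend like Open WebUI and Ollama: Ollama's `/api/generate` endpoint returns one JSON object per line, each carrying a `response` fragment and a `done` flag, which the frontend stitches together as it renders. Below is a minimal sketch of that accumulation step; the fake stream stands in for a live server, and a real request would use something like `requests.post(url, json=payload, stream=True)` against Ollama's default port 11434.

```python
import json

# Ollama's default local endpoint (assumption: stock install, no reverse proxy).
OLLAMA_URL = "http://localhost:11434/api/generate"


def accumulate_stream(lines):
    """Join streamed NDJSON chunks into the full model reply.

    Each line is a JSON object like {"response": "...", "done": false};
    the final chunk sets "done": true.
    """
    parts = []
    for raw in lines:
        chunk = json.loads(raw)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)


# Simulated stream, shaped like Ollama's line-delimited JSON output:
fake_stream = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": false}',
    '{"response": "!", "done": true}',
]

print(accumulate_stream(fake_stream))  # -> Hello, world!
```

The frontend's job is essentially this loop run incrementally: append each `response` fragment to the visible chat bubble as it arrives, rather than waiting for the full reply.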
