Show HN: BrowserAI – Run LLMs directly in browser using WebGPU (open source)


Run local LLMs inside your browser. BrowserAI is open source; the code lives in the sauravpanda/BrowserAI repository on GitHub.

🔒 Privacy First: All processing happens locally - your data never leaves the browser
💰 Cost Effective: No server costs or complex infrastructure needed
🌐 Offline Capable: Models work offline after initial download
🚀 Blazing Fast: WebGPU acceleration for near-native performance
🎯 Developer Friendly: Simple API, multiple engine support, ready-to-use models
⚡ WebGPU acceleration for blazing fast inference
🔄 Seamless switching between MLC and Transformers engines
📦 Pre-configured popular models ready to use
🛠️ Easy-to-use API for text generation and more
🎯 Simplified model initialization
📊 Basic monitoring and metrics
🔍 Simple RAG implementation
🛠️ Developer tools integration
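
The feature list above points to a small text-generation API layered over pre-configured, locally run models. The following is a minimal sketch of what that load-then-generate flow might look like; the package name (@browserai/browserai), the model id, and the loadModel/generateText method names are assumptions inferred from the description here, so check the repository README for the actual interface.

```typescript
// Sketch of the load-then-generate flow described above.
// The package name, model id, and method names are assumptions;
// consult the sauravpanda/BrowserAI README for the real API surface.
import { BrowserAI } from '@browserai/browserai';

async function demo(): Promise<void> {
  // WebGPU is what provides the near-native inference speed; bail out
  // early if the browser does not expose it.
  if (!('gpu' in navigator)) {
    throw new Error('WebGPU is not available in this browser');
  }

  const ai = new BrowserAI();

  // The first call downloads a pre-configured model; afterwards it is
  // cached and can run offline. All inference stays in the browser.
  await ai.loadModel('llama-3.2-1b-instruct'); // model id is illustrative

  // Generate text entirely client side; no prompt leaves the machine.
  const reply = await ai.generateText('Summarize WebGPU in one sentence.');
  console.log(reply);
}

demo().catch(console.error);
```

WebGPU acceleration requires a browser that enables it; feature-detecting `navigator.gpu` before loading a model, as above, lets you fall back gracefully on unsupported browsers.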

Read more on: LLMs, browser, open source

Related news:

Show HN: A submarine combat game in the browser

LLMs Demonstrate Behavioral Self-Awareness [pdf]

Microsoft previews Game Assist in-game browser in Edge Stable