Show HN: BrowserAI – Run LLMs directly in browser using WebGPU (open source)


Run local LLMs inside your browser. The project is developed in the open at sauravpanda/BrowserAI on GitHub.

🔒 Privacy First: All processing happens locally - your data never leaves the browser

💰 Cost Effective: No server costs or complex infrastructure needed

🌐 Offline Capable: Models work offline after the initial download

🚀 Blazing Fast: WebGPU acceleration for near-native performance

🎯 Developer Friendly: Simple API, multiple engine support, ready-to-use models

⚡ WebGPU acceleration for blazing fast inference

🔄 Seamless switching between MLC and Transformers engines

📦 Pre-configured popular models ready to use

🛠️ Easy-to-use API for text generation and more

🎯 Simplified model initialization

📊 Basic monitoring and metrics

🔍 Simple RAG implementation

🛠️ Developer tools integration
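As a rough illustration of the developer-facing API described above, here is a minimal sketch of checking for WebGPU, loading a pre-configured model, and generating text in the page. The package import, the method names (loadModel, generateText), and the model identifier are assumptions drawn from the project description, not a verified API reference; see the sauravpanda/BrowserAI README for the exact usage.

```typescript
// Hypothetical usage sketch - method and model names are assumptions, not verified against the repo.
import { BrowserAI } from '@browserai/browserai';

async function demo(): Promise<void> {
  // Feature-detect WebGPU; without it the accelerated inference path is unavailable.
  if (!('gpu' in navigator)) {
    console.warn('WebGPU is not available in this browser.');
    return;
  }

  const ai = new BrowserAI();

  // First call downloads the model; afterwards it is cached and works offline.
  await ai.loadModel('llama-3.2-1b-instruct'); // model id is illustrative

  // Inference runs entirely in the page - no data is sent to a server.
  const reply = await ai.generateText('Explain WebGPU in one sentence.');
  console.log(reply);
}

demo();
```

Everything after the initial model download runs client-side, which is what underpins the privacy, cost, and offline claims above.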

Read more on: LLMs, browser, open source

Related news:

Show HN: A submarine combat game in the browser

LLMs Demonstrate Behavioral Self-Awareness [pdf]

Microsoft previews Game Assist in-game browser in Edge Stable