1-bit LLMs

BitNet: Inference framework for 1-bit LLMs

Matrix-vector multiplication implemented in off-the-shelf DRAM for low-bit LLMs

“Imprecise” language models are smaller, speedier, and nearly as accurate
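The headlines above all concern low-bit weight quantization. As a rough illustration of why ternary ("1.58-bit") weights make inference cheaper, here is a minimal sketch, assuming numpy and an absmean-style scaling rule; the function names and the exact quantization recipe are illustrative, not taken from BitNet's implementation:

```python
import numpy as np

def quantize_ternary(W, eps=1e-6):
    # Sketch of absmean-style ternary quantization: scale by the mean
    # absolute weight, then round each entry to {-1, 0, +1}.
    scale = np.abs(W).mean() + eps
    Wq = np.clip(np.round(W / scale), -1, 1).astype(np.int8)
    return Wq, scale

def ternary_matvec(Wq, scale, x):
    # With ternary weights the matmul reduces to additions and
    # subtractions: y = scale * (sum of x where W=+1 - sum where W=-1).
    pos = (Wq == 1) @ x
    neg = (Wq == -1) @ x
    return scale * (pos - neg)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
Wq, s = quantize_ternary(W)
y = ternary_matvec(Wq, s, x)
```

Because every weight is -1, 0, or +1, the inner loop needs no multiplications at all, which is what makes DRAM- or CPU-based implementations of these "imprecise" models attractive.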